Why Are Americans So Bad at Math?

The New York Times has a noteworthy piece on why math education is so poor in the United States. Borrowing examples from how math is taught in Japan, the article outlines how different initiatives to reform math education in America have failed (and why they are likely to continue to fail). Worth the read.

It wasn’t the first time that Americans had dreamed up a better way to teach math and then failed to implement it. The same pattern played out in the 1960s, when schools gripped by a post-Sputnik inferiority complex unveiled an ambitious “new math,” only to find, a few years later, that nothing actually changed. In fact, efforts to introduce a better way of teaching math stretch back to the 1800s. The story is the same every time: a big, excited push, followed by mass confusion and then a return to conventional practices.

The new math of the ‘60s, the new new math of the ‘80s and today’s Common Core math all stem from the idea that the traditional way of teaching math simply does not work. As a nation, we suffer from an ailment that John Allen Paulos, a Temple University math professor and an author, calls innumeracy — the mathematical equivalent of not being able to read. On national tests, nearly two-thirds of fourth graders and eighth graders are not proficient in math. More than half of fourth graders taking the 2013 National Assessment of Educational Progress could not accurately read the temperature on a neatly drawn thermometer.

I hadn’t heard this story before, but it is quite the embarrassment:

One of the most vivid arithmetic failings displayed by Americans occurred in the early 1980s, when the A&W restaurant chain released a new hamburger to rival the McDonald’s Quarter Pounder. With a third-pound of beef, the A&W burger had more meat than the Quarter Pounder; in taste tests, customers preferred A&W’s burger. And it was less expensive. A lavish A&W television and radio marketing campaign cited these benefits. Yet instead of leaping at the great value, customers snubbed it.

Only when the company held customer focus groups did it become clear why. The Third Pounder presented the American public with a test in fractions. And we failed. Misunderstanding the value of one-third, customers believed they were being overcharged. Why, they asked the researchers, should they pay the same amount for a third of a pound of meat as they did for a quarter-pound of meat at McDonald’s. The “4” in “¼,” larger than the “3” in “⅓,” led them astray.
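
(For the record, the arithmetic: a third of a pound is roughly 0.33 lb. and a quarter pound is 0.25 lb., so the Third Pounder offered about a third more beef, not less.)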

Maybe we need to develop more system-wide efforts to showcase teaching styles to observers, like they do in Japan:

In Japan, teachers had always depended on jugyokenkyu, which translates literally as “lesson study,” a set of practices that Japanese teachers use to hone their craft. A teacher first plans lessons, then teaches in front of an audience of students and other teachers along with at least one university observer. Then the observers talk with the teacher about what has just taken place. Each public lesson poses a hypothesis, a new idea about how to help children learn. And each discussion offers a chance to determine whether it worked. Without jugyokenkyu, it was no wonder the American teachers’ work fell short of the model set by their best thinkers.

What else matters? That teachers embrace new teaching styles, and persevere:

Most policies aimed at improving teaching conceive of the job not as a craft that needs to be taught but as a natural-born talent that teachers either decide to muster or don’t possess. Instead of acknowledging that changes like the new math are something teachers must learn over time, we mandate them as “standards” that teachers are expected to simply “adopt.” We shouldn’t be surprised, then, that their students don’t improve.

Here, too, the Japanese experience is telling. The teachers I met in Tokyo had changed not just their ideas about math; they also changed their whole conception of what it means to be a teacher. “The term ‘teaching’ came to mean something totally different to me,” a teacher named Hideto Hirayama told me through a translator. It was more sophisticated, more challenging — and more rewarding. “The moment that a child changes, the moment that he understands something, is amazing, and this transition happens right before your eyes,” he said. “It seems like my heart stops every day.”

Worth reading in its entirety here.

On Business and Design Considerations of 1st Class Airplane Seating

David Owen, writing in The New Yorker, in a cleverly titled article “Game of Thrones,” describes the business and design considerations of seats in modern-day airplanes. While the economy seating is fairly routine (cramped), there is a lot of creativity involved in how 1st class and business seating is designed and built:

Airplane interiors are even more tightly regulated. Nearly every element undergoes a safety-enhancing process called “delethalization”: seats have to withstand an impact equal to sixteen times the force of gravity, and to remain in place when they do, so that they don’t block exit routes or crush anyone, and they can’t burst into flames or release toxic gases when they get hot. Doing something as simple as slightly increasing the thickness of the padding in a seat cushion can necessitate a new round of testing and certification, because a more resilient seat could make a passenger bounce farther after an impact, increasing the risk of injury caused by turbulence or a hard landing. Delethalizing some premium-class seats—in which a passenger’s head and torso have a lot of room to accelerate before being stopped by something solid—requires the addition of a feature that many passengers don’t even realize is there: an air bag concealed in the seat belt.

This bit about the cost of seat-back video screens is fascinating:

In economy, the tight spacing of the seats makes air bags mostly unnecessary. But seat-back video screens and the hard frames that surround them pose a safety challenge, partly because of the potential for injuries caused by head strikes, and partly because the computers and the electrical systems that serve them have to be both fireproof and fully isolated from the plane’s—so that crossed wires in somebody’s seat don’t allow a ten-year-old playing a video game to suddenly take control of the cockpit. Largely as a result, in-flight entertainment systems are almost unbelievably expensive. The rule of thumb, I was told, is “a thousand dollars an inch”—meaning that the small screen in the back of each economy seat can cost an airline ten thousand dollars, plus a few thousand for its handheld controller.

[Image: a 1st class airplane seat]

The article mentions but doesn’t link to TheDesignAir’s Top 10 International Business Classes of 2014 (it’s well worth a look).

The Male Deficit Model and Friendships

Do men suck at friendships? Or, at least, are they worse at being friends than their female counterparts? Research suggests the answer is yes. This Men’s Journal article provides an excellent overview:

The Male Deficit Model is based on 30 years of research into friendship and relationships — from Mayta Caldwell’s and Letitia Peplau’s 1982 UCLA study, which found that male friendships are far less intimate than female friendships, to a 2007 study at the College of the Holy Cross in Worcester, Massachusetts, which reported greater interpersonal competition and lower friendship satisfaction among men. A just-completed report from California State University Humboldt, meanwhile, holds that the closer men adhere to traditional male gender roles, like self-reliance and a reluctance to spill their guts, the worse their friendships fare. “Since most men don’t let themselves think or feel about friendship, this immense collective and personal disappointment is usually concealed, sloughed over, shrugged away,” writes the psychologist Stuart Miller in his opus, Men and Friendship. “The older we get, the more we accept our essential friendlessness.”

What’s the key to healthy aging? Good diet and exercise, right? Well, perhaps another factor outweighs them all:

That’s because nearly all research into healthy aging has found that the key to a long, happy life is not diet or exercise but strong social connections – that is, friendships. Loneliness accelerates age-related declines in cognition and motor function, while a single good friend has been shown to make as much as a 10-year difference in overall life expectancy. A huge meta-study performed in part at Brigham Young University, which reviewed 148 studies with a combined 308,849 subject participants, found that loneliness is just as harmful to health as not exercising, smoking 15 cigarettes a day, and alcoholism, and fully twice as bad as being obese. Still more startling is a 2010 study published in the Journal of Clinical Oncology that looked at 2,230 cancer patients in China. Social well-being, including friendship, turned out to be the number one predictor of survival.

Some of this stems from the fact that isolated people tend to exercise less, eat poorly, and drink too much. But some researchers believe that loneliness has a negative health impact all on its own. In numerous studies over the past 30 years, John T. Cacioppo, a professor of psychology at the University of Chicago and the pioneer of the biological study of loneliness, has found that lonely people have chronically elevated levels of the stress and fear hormones cortisol and epinephrine. In a 2007 paper published in Genome Biology, Cacioppo even demonstrated a correlation between loneliness and the activity of certain genes associated with systemic inflammation, elevating risk for viral invasion and cardiovascular disease.

And yet the capacity of men to combat loneliness – and improve their health – by building strong friendships seems to be steadily eroding. Cambridge, Massachusetts, professors Jacqueline Olds and Richard S. Schwartz, writing in The Lonely American: Drifting Apart in the Twenty-First Century, point to a current tendency among adults to build stronger, more intimate marriages at the expense of almost all other social connections. In a study of contemporary childcare arrangements, Olds and Schwartz found a deep sense of loneliness among many parents, especially men. “Almost every father we spoke with explained that he had lost contact with most of his male friends,” they write. And lest you believe family is company enough, the 2005 Australian Longitudinal Study of Aging showed that family relationships have almost no impact on longevity. Friendships, by contrast, boosted life span as much as 22 percent.

Read the rest here.

Valeria Lukyanova, Real-Life Barbie

This is a creepy piece in GQ Magazine on Valeria Lukyanova, who looks like a real-life Barbie. Here’s a photo of her:

[Image: Valeria Lukyanova]

The future Barbie was born nowhere near Malibu. Valeria hails from Tiraspol, a gloomy city in Europe’s poorest country, Moldova. Valeria remembers both her Siberian-born grandfather and her father as very strict and began to rebel at the usual age of 13. Stage one involved dyeing her hair, which is naturally a low-key shade of brown. Valeria went for the goth look first—about the farthest you could get from Barbie. She wore all-black clothes to accentuate her very white skin. Kids at school began to tease her. Look, a witch! At 15, traumatized by the name-calling, she doubled down: bracelets with sharp two-inch spikes, artificial fangs. She was dismissed from a school choir for standing bolt upright when the singers were instructed to sway; in different circumstances, this budding nonconformism could have brought her straight into Pussy Riot.

Instead, she began modeling, small-time stuff, and learned to apply makeup and hair dye in increasingly theatrical ways. Valeria was less interested in attracting men than in repelling them: “A dude would try to talk to me on the street and I’d be like—” she switches to a raspy basso—” ‘Oh, honey, aren’t I glad I had that operation.’ ” Another time, a guy tried grabbing her by the hand and she semi-accidentally cut him with her bracelet spike.

Proceed to read the whole thing with caution.

On the Brief History of the SAT Exam

From this excellent piece about a mother who decided to take the SATs (and naturally, decided to write a book about it: The Perfect Score Project: Uncovering the Secrets of the SAT), we learn about the history of the SAT exam:

The SATs were administered for the first time on June 23, 1926. Intelligence testing was a new but rapidly expanding enterprise; during the First World War, the United States Army had given I.Q. tests to nearly two million soldiers to determine who was officer material. (Walter Lippmann dismissed these tests as “quackery in a field where quacks breed like rabbits.”) The SAT’s inventor, a Princeton professor named Carl Campbell Brigham, had worked on the Army’s I.Q. test, and the civilian exam he came up with was a first cousin to the military’s. It contained some questions on math and some on identifying shapes. Mostly, though, it focussed on vocabulary. Brigham intended the test to be administered to students who had already been admitted to college, for the purposes of guidance and counselling. Later, he argued that it was foolish to believe, as he once had, that the test measured “native intelligence.” Rather, he wrote, scores were an index of a person’s “schooling, family background, familiarity with English, and everything else.”

By this point, though, the test had already been adopted for a new purpose. In 1933, James Bryant Conant, a chemist, became the president of Harvard. Conant, the product of a middle-class family, was dismayed by what he saw as the clubbiness of the school’s student body and set out to attract fresh talent. In particular, he wanted to recruit bright young men from public schools in the Midwest, few of whom traditionally applied to Harvard. Conant’s plan was to offer scholarships to ten such students each year. To select them, he decided to employ the SAT. As Nicholas Lemann observes in his book “The Big Test” (1999), this was one of those small decisions “from which great consequences later flow.” Not long after Harvard started using the SAT, Princeton, Columbia, and Yale followed suit. More and more colleges adopted the test until, by the mid-nineteen-fifties, half a million kids a year were taking it.

In the early decades of the test, scores were revealed only to schools, not to students. This made it difficult to assess the claim made by the College Board, the exam’s administrator, that studying for the SATs would serve no purpose. Still, a brash young high-school tutor named Stanley Kaplan concluded, based on the feedback he was getting from his pupils, that the claim was a crock. Kaplan began offering SAT prep classes out of his Brooklyn basement. Accusations that he was a fraud and a “snake oil salesman” failed to deter his clientele; the students just kept on coming. In the nineteen-seventies, Kaplan expanded his operations into cities like Philadelphia, Los Angeles, Chicago, and Miami; this is when the Federal Trade Commission decided to investigate his claims. The commission found that Kaplan was right: tutoring did boost scores, if not by as much as his testing service advertised. The College Board implicitly conceded the point in 1994, when it changed the meaning of the SAT’s central “A”; instead of “aptitude” it came to stand for “assessment.” Then the board took the even more radical step of erasing the meaning of the name altogether. Today, the letters “SAT” stand for nothing more (or less) than the SATs. As the Lord put it to Moses, “I am that I am.”

Read the rest here.

Before Laika: the Soviet Space Dogs

This is a very interesting post on Medium about the dogs the Soviets sent into space in the 1950s:

While the US test rocket programme used monkeys, about two thirds of whom died, dogs were chosen by the Soviets for their ability to withstand long periods of inactivity, and were trained extensively before they flew. Only stray female dogs were used because it was thought they’d be better able to cope with the extreme stress of spaceflight, and the bubble-helmeted spacesuits designed for the programme were equipped with a device to collect feces and urine that only worked with females.

Training included standing still for long periods, wearing the spacesuits, being confined in increasingly small boxes for 15-20 days at a time, riding in centrifuges to simulate the high acceleration of launch, and being placed in machines that simulated the vibrations and loud noises of a rocket.

The first pair of dogs to travel to space were Dezik and Tsygan (“Gypsy”), who made it to 110km on 22 July 1951 and were recovered, unharmed by their ordeal, the next day. Dezik returned to space in September 1951 with a dog named Lisa, but neither survived the journey. After Dezik’s death, Tsygan was adopted by Anatoli Blagronravov, a physician who later worked closely with the United States at the height of the Cold War to promote international cooperation on spaceflight.

They were followed by Smelaya (“Brave”), who defied her name by running away the day before her launch was scheduled. She was found the next morning, however, and made a successful flight with Malyshka (“Babe”). Another runaway was Bolik, who successfully escaped a few days before her flight in September 1951. Her replacement was ignominiously named ZIB — the Russian acronym for “Substitute for Missing Bolik”, and was a street dog found running around the barracks where the tests were being conducted. Despite being untrained for the mission, he made a successful flight and returned to Earth unharmed.

A good bit of trivia from the post: Laika wasn’t the original name of the most famous Russian space dog; she was called Kudryavka (Russian: Кудрявка, meaning Little Curly) before her name was changed.

The Human Element in Quantification

I enjoyed Felix Salmon’s piece in Wired titled “Why Quants Don’t Know Everything.” The premise of the piece is that while what quants do is important, the human element cannot be ignored.

The reason the quants win is that they’re almost always right—at least at first. They find numerical patterns or invent ingenious algorithms that increase profits or solve problems in ways that no amount of subjective experience can match. But what happens after the quants win is not always the data-driven paradise that they and their boosters expected. The more a field is run by a system, the more that system creates incentives for everyone (employees, customers, competitors) to change their behavior in perverse ways—providing more of whatever the system is designed to measure and produce, whether that actually creates any value or not. It’s a problem that can’t be solved until the quants learn a little bit from the old-fashioned ways of thinking they’ve displaced.

Felix discusses the four stages in the rise of the quants: 1) pre-disruption, 2) disruption, 3) overshoot, and 4) synthesis, described below:

It’s increasingly clear that for smart organizations, living by numbers alone simply won’t work. That’s why they arrive at stage four: synthesis—the practice of marrying quantitative insights with old-fashioned subjective experience. Nate Silver himself has written thoughtfully about examples of this in his book, The Signal and the Noise. He cites baseball, which in the post-Moneyball era adopted a “fusion approach” that leans on both statistics and scouting. Silver credits it with delivering the Boston Red Sox’s first World Series title in 86 years. Or consider weather forecasting: The National Weather Service employs meteorologists who, understanding the dynamics of weather systems, can improve forecasts by as much as 25 percent compared with computers alone. A similar synthesis holds in economic forecasting: Adding human judgment to statistical methods makes results roughly 15 percent more accurate. And it’s even true in chess: While the best computers can now easily beat the best humans, they can in turn be beaten by humans aided by computers.

Very interesting throughout, and highly recommended.

The Photography of Adam Magyar

In a long piece titled “Einstein’s Camera,” Joshua Hammer profiles the photography of Adam Magyar:

In a growing body of photographic and video art done over the past decade, Magyar bends conventional representations of time and space, stretching milliseconds into minutes, freezing moments with a resolution that the naked eye could never have perceived. His art evokes such variegated sources as Albert Einstein, Zen Buddhism, even the 1960s TV series The Twilight Zone. The images—sleek silver subway cars, solemn commuters lost in private worlds—are beautiful and elegant, but also produce feelings of disquiet. “These moments I capture are meaningless, there is no story in them, and if you can catch the core, the essence of being, you capture probably everything,” Magyar says in one of the many cryptic comments about his work that reflect both their hypnotic appeal and their elusiveness. There is a sense of stepping into a different dimension, of inhabiting a space between stillness and movement, a time-warp world where the rules of physics don’t apply.

Magyar’s work represents a fruitful cross-fertilization of technology and art, two disciplines—one objective and mathematical, the other entirely subjective—that weren’t always regarded as harmonious or compatible. Yet the two are intertwined, and breakthroughs in technology have often made new forms of art possible. Five thousand years ago, Egyptian technicians heated desert sand, limestone, potash, and copper carbonate in kilns to make a synthetic pigment known as “Egyptian blue,” which contributed to the highly realistic yet stylized portraiture of the Second and Third Dynasties.

[Image: Adam Magyar train photograph]

Fascinating profile. Definitely worth clicking over to see Adam’s photography, especially the Stainless series of photographs and videos.

The Best Longreads of 2013

This is my fourth year compiling the best longreads of the year (see the 2010 best longreads, 2011 best longreads, and 2012 best longreads). There was so much incredible writing this year that I am expanding my usual list of the best five longreads to the best ten longreads of the year. They are:

(1) “And Then Steve Said, ‘Let There Be an iPhone’” [New York Times Magazine] — more than six years after Steve Jobs unveiled the iPhone, there were still a number of things about it that the public did not know. Fred Vogelstein’s piece, published in October, was incredibly revealing:

It’s hard to overstate the gamble Jobs took when he decided to unveil the iPhone back in January 2007. Not only was he introducing a new kind of phone — something Apple had never made before — he was doing so with a prototype that barely worked. Even though the iPhone wouldn’t go on sale for another six months, he wanted the world to want one right then. In truth, the list of things that still needed to be done was enormous. A production line had yet to be set up. Only about a hundred iPhones even existed, all of them of varying quality. Some had noticeable gaps between the screen and the plastic edge; others had scuff marks on the screen. And the software that ran the phone was full of bugs.

The iPhone could play a section of a song or a video, but it couldn’t play an entire clip reliably without crashing. It worked fine if you sent an e-mail and then surfed the Web. If you did those things in reverse, however, it might not. Hours of trial and error had helped the iPhone team develop what engineers called “the golden path,” a specific set of tasks, performed in a specific way and order, that made the phone look as if it worked.

But even when Jobs stayed on the golden path, all manner of last-minute workarounds were required to make the iPhone functional. On announcement day, the software that ran Grignon’s radios still had bugs. So, too, did the software that managed the iPhone’s memory. And no one knew whether the extra electronics Jobs demanded the demo phones include would make these problems worse.

Jobs wanted the demo phones he would use onstage to have their screens mirrored on the big screen behind him. To show a gadget on a big screen, most companies just point a video camera at it, but that was unacceptable to Jobs. The audience would see his finger on the iPhone screen, which would mar the look of his presentation. So he had Apple engineers spend weeks fitting extra circuit boards and video cables onto the backs of the iPhones he would have onstage. The video cables were then connected to the projector, so that when Jobs touched the iPhone’s calendar app icon, for example, his finger wouldn’t appear, but the image on the big screen would respond to his finger’s commands. The effect was magical. People in the audience felt as if they were holding an iPhone in their own hands. But making the setup work flawlessly, given the iPhone’s other major problems, seemed hard to justify at the time.

This bit about the compromises Apple made to get the demo iPhone working is phenomenal:

The software in the iPhone’s Wi-Fi radio was so unstable that Grignon and his team had to extend the phones’ antennas by connecting them to wires running offstage so the wireless signal wouldn’t have to travel as far. And audience members had to be prevented from getting on the frequency being used. “Even if the base station’s ID was hidden” — that is, not showing up when laptops scanned for Wi-Fi signals — “you had 5,000 nerds in the audience,” Grignon says. “They would have figured out how to hack into the signal.” The solution, he says, was to tweak the AirPort software so that it seemed to be operating in Japan instead of the United States. Japanese Wi-Fi uses some frequencies that are not permitted in the U.S.

You do not have to be an Apple enthusiast like me to appreciate this piece. As I wrote back in October, “From concept to prototype to Steve Jobs’s unveiling of the revolutionary device, this piece has it all. It is so much better than the section devoted to the iPhone in Walter Isaacson’s biography of Steve Jobs.” And that ending to the piece? A tear jerker.

(2) “Thanksgiving in Mongolia” [The New Yorker] — reading Ariel Levy’s devastating account of a pregnancy gone wrong hit me like a brick. If you haven’t read it, it’s one of the best nonfiction pieces I’ve read all year. Just don’t read it without a tissue nearby.

When I woke up the next morning, the pain in my abdomen was insistent; I wondered if the baby was starting to kick, which everyone said would be happening soon. I called home to complain, and my spouse told me to find a Western clinic. I e-mailed Cox to get his doctor’s phone number, thinking that I’d call if the pain got any worse, and then I went out to interview people: the minister of the environment, the president of a mining concern, and, finally, a herdsman and conservationist named Tsetsegee Munkhbayar, who became a folk hero after he fired shots at mining operations that were diverting water from nomadic communities. I met him in the sleek lobby of the Blue Sky with Yondon Badral—a smart, sardonic man I’d hired to translate for me in U.B. and to accompany me a few days later to the Gobi, where we would drive a Land Rover across the cold sands to meet with miners and nomads. Badral wore jeans and a sweater; Munkhbayar was dressed in a long, traditional deel robe and a fur hat with a small metal falcon perched on top. It felt like having a latte with Genghis Khan…

I felt an unholy storm move through my body, and after that there is a brief lapse in my recollection; either I blacked out from the pain or I have blotted out the memory. And then there was another person on the floor in front of me, moving his arms and legs, alive. I heard myself say out loud, “This can’t be good.” But it looked good. My baby was as pretty as a seashell.

(3) “Photoshop is a City for Everyone: How Adobe Endlessly Rebuilds Its Classic App” [The Verge] — Paul Miller takes us on a delightful tour of everyone’s favorite photography tool, Photoshop. We learn how the company iterates on its products and what its vision for the future looks like:

For instance, Adobe obsessively documents color profiles and lens distortion data for hundreds of cameras and lenses, taking hundreds of pictures with each combo. It’s expensive, laborious, and seemingly quixotic. But Camera RAW used those specs to automatically correct aberrations — even for multiple body / lens combinations. Then some researchers used the data to design a feature for CS6 that allows a user to straighten warped objects in extreme angle shots.

The holy grail is to give Photoshop computer vision. The app should simply select “objects” the way users see, like a “beach ball” or a “tree” or a “head,” not as “blob of color one,” “blob of color two.” Then the user should be able to do what she pleases to the object, with the software filling in the details like what might’ve been behind that object — something that’s available in a nascent form in CS6. Content vision also means the software should know when you’re working on a family photo and when you’re working on a logo, adjusting color grading techniques accordingly. It means unifying many of Photoshop’s features — which, once again, its architecture is uniquely suited to do.


(4) “Bad Blood: The Mysterious Life and Brutal Death of a Russian Dissident” [Matter for Medium] — an incredibly detailed (9,000+ words), fascinating piece that looks back on the life of Alexander Litvinenko, the Russian dissident who fled to the U.K. and was poisoned with radioactive polonium-210 in a London bar in November 2006, and on the investigation that followed:

Because it is so highly soluble, polonium-210 is easily ingested. And when Litvinenko started vomiting on the evening of November 1st, the radiation had already begun to destroy the lining of his gut.

The cells lining the walls of the stomach are among the first to react to the toxin. They start sloughing and breaking away minutes after contact. The intestines, and the soft, unprotected skin inside the throat and mouth suffer the same fate.

Polonium is hugely radioactive, firing off a massive bombardment of alpha particles — and without any screening, the delicate mechanisms of the body’s internal organs get the full dose. As the atoms try to stabilize, alpha particles crash into nearby body tissue, knocking electrons from the molecules they encounter. Each time they do, the trail of wrecked cells expands; the poison turns them cancerous, or kills them off entirely…

At its height… the Soviet Union had the largest biological warfare program in the world. Sources have claimed there were 40,000 individuals, including 9,000 scientists, working at 47 different facilities. More than 1,000 of these experts specialized in the development and application of deadly compounds. They used lethal gasses, skin contact poisons that were smeared on door handles and nerve toxins said to be untraceable. The idea, at all times, was to make death seem natural — or, at the very least, to confuse doctors and investigators. “It’s never designed to demonstrate anything, only to kill the victim, quietly and unobtrusively,” Volodarsky writes in The KGB’s Poison Factory. “This was an unbreakable principle.”

Murderous poisons come in three varieties: chemical, biological, and radiological. It’s believed that the first Soviet attempt at a radiological assassination took place in 1957. The target was Nikolai Khokhlov, a defector who had left for the United States a few years earlier. He became drastically ill after drinking coffee at an anti-communist conference he was speaking at in West Germany. After his collapse, he was successfully treated at a US army hospital in Frankfurt for what was believed to be poisoning by radioactive thallium.

This was a beautifully illustrated piece and one of the best posts on Medium this year (the longform journalism startup Matter took down its paywall and began publishing on Medium, one of my favorite publishing platforms).

(5) “Did Goldman Sachs Overstep in Criminally Charging Its Ex-Programmer?” [Vanity Fair] — perhaps the best piece Michael Lewis published all year, this 11,000-word “second trial” was thoroughly fascinating:

A month after ace programmer Sergey Aleynikov left Goldman Sachs, he was arrested. Exactly what he’d done neither the F.B.I., which interrogated him, nor the jury, which convicted him a year later, seemed to understand. But Goldman had accused him of stealing computer code, and the 41-year-old father of three was sentenced to eight years in federal prison. Investigating Aleynikov’s case, Michael Lewis holds a second trial.

(6) “7 Things I Learned in 7 Years of Reading, Writing, and Living” [Brain Pickings] — one of my favorite bloggers, Maria Popova, wrote a personal post on the things she’s learned maintaining her wildly popular blog on arts, culture, writing, history, books, and everything in between (in Maria’s words: “combinatorial creativity”):
  1. Be generous. Be generous with your time and your resources and with giving credit and, especially, with your words. It’s so much easier to be a critic than a celebrator. Always remember there is a human being on the other end of every exchange and behind every cultural artifact being critiqued. To understand and be understood, those are among life’s greatest gifts, and every interaction is an opportunity to exchange them.

  2. Build pockets of stillness into your life. Meditate. Go for walks. Ride your bike going nowhere in particular. There is a creative purpose to daydreaming, even to boredom. The best ideas come to us when we stop actively trying to coax the muse into manifesting and let the fragments of experience float around our unconscious mind in order to click into new combinations. Without this essential stage of unconscious processing, the entire flow of the creative process is broken. Most importantly, sleep. Besides being the greatest creative aphrodisiac, sleep also affects our every waking moment, dictates our social rhythm, and even mediates our negative moods. Be as religious and disciplined about your sleep as you are about your work. We tend to wear our ability to get by on little sleep as some sort of badge of honor that validates our work ethic. But what it really is is a profound failure of self-respect and of priorities. What could possibly be more important than your health and your sanity, from which all else springs?

  3. When people tell you who they are, Maya Angelou famously advised, believe them. Just as importantly, however, when people try to tell you who you are, don’t believe them. You are the only custodian of your own integrity, and the assumptions made by those that misunderstand who you are and what you stand for reveal a great deal about them and absolutely nothing about you.

Invaluable wisdom therein.

I try to support Brain Pickings with a one-time donation every year around the holidays. I recommend you do the same.

(7) “Slow Ideas” [The New Yorker] — Why do some innovations spread so quickly and others so slowly? That is the central question Atul Gawande answers in this enthralling piece:

Here we are in the first part of the twenty-first century, and we’re still trying to figure out how to get ideas from the first part of the twentieth century to take root. In the hopes of spreading safer childbirth practices, several colleagues and I have teamed up with the Indian government, the World Health Organization, the Gates Foundation, and Population Services International to create something called the BetterBirth Project. We’re working in Uttar Pradesh, which is among India’s poorest states. One afternoon in January, our team travelled a couple of hours from the state’s capital, Lucknow, with its bleating cars and ramshackle shops, to a rural hospital surrounded by lush farmland and thatched-hut villages. Although the sun was high and the sky was clear, the temperature was near freezing. The hospital was a one-story concrete building painted goldenrod yellow. (Our research agreement required that I keep it unnamed.) The entrance is on a dirt road lined with rows of motorbikes, the primary means of long-distance transportation. If an ambulance or an auto-rickshaw can’t be found, women in labor sit sidesaddle on the back of a bike.

(8) “Lost on Everest” [Outside Magazine] — Using never-before-published transcripts from the 1963 American expedition, Grayson Schaffer takes a deep look at an ascent of the world’s highest peak that many people (myself included) had no idea about before this piece was published:

By 1963, the golden age of Himalayan mountaineering was winding down. All but one of the world’s 8,000-meter peaks had been summited. Most of them were claimed by massive expeditions run like military campaigns, with siege-style tactics, top-down chains of command, and an emphasis on the collective over the individual. From an outsider’s perspective, the American expedition was no different. The operation required an army of men, including more than 900 lowland porters who carried 27 tons of equipment into Base Camp. And it was organized like a military detachment, with Dyhrenfurth in charge and the other men given ministerial titles like deputy leader and climbing leader.

On the other hand, the American expedition had a lot in common with modern climbing projects. It was laden with science experiments that, like charity causes and awareness raising, have since become standard operating procedure for anybody who wants to get funding. Likewise, Dyhrenfurth’s desire for good footage of the trip for his film Americans on Everest was second only to his need to put somebody on the summit. (In 2012, you couldn’t find a climber on Everest who wasn’t making a documentary.) And as Dyhrenfurth admitted in his audio diary, the 1963 expedition was not run like those that came before it. “I am not a dictator,” he said. “We try to be as democratic as possible.”

This is a tour de force of an article, split into seven chapters, best read on your desktop (not on mobile).

(9) “I Am An Object Of Internet Ridicule, Ask Me Anything” [The Awl] — C.D. Hermelin’s personal story of how he took a vintage typewriter to the High Line and crafted stories for people on the spot made a deep impression on me:

When I set up at the High Line, I had lines of people asking for stories. At seven to 10 minutes per story, I had to tell people to leave and come back. It surprised me when they would do just that. I never had writer’s block, although sometimes I would stare off into space for the right word, and people watching would say, “Look! He’s thinking!” Writing is usually a lonely, solitary act. On the High Line with my typewriter, all the joy of creating narrative was infused with a performer’s high—people held their one-page flash fictions and read them and laughed and repeated lines and translated into their own languages, right in front of me. Perhaps other writers would have their nerves wracked by instant feedback on rough drafts, but all I could do was smile.

Each time I went, I’d walk home, my typewriter case full of singles, my fingers ink-stained. Lots of people were worried about copycats—what if I saw someone “stealing” my idea? I tried to soothe them. If every subway guitarist had fights about who came up with the idea to play an acoustic cover of John Lennon’s “Imagine,” the underground would be a violent place. More violent than it already is. Others, perhaps drawn by the sounds of the typewriter, would stop and just talk to me, watch me compose a story for someone else. Then they’d shake their head and tell me that the idea and the execution were “genius.”

But then someone took a photo of him, posted it on Reddit, and the hipster-hating commenters flocked to the forums like a pack of wolves:

Of course I sat back down. Of course I read every single comment. I did not ready myself mentally for a barrage of hipster-hating Internet commenters critiquing me for everything: my pale skin, my outfit, my hair, my typing style, my glasses. An entire sub-thread was devoted to whether or not I had shaved legs. It was not the first time I had been labeled a “hipster.” I often wear tight jeans, big plastic-frame glasses, shirts bought at thrift stores. I listen to Vampire Weekend, understand and laugh at the references in “Portlandia.” I own and listen to vintage vinyl. The label never bothered me on its own. But with each successive violent response to the picture of me, I realized that hipsters weren’t considered a comically benign undercurrent of society. Instead, it seemed like Redditors saw hipsters and their ilk as a disease, and I was up on display as an example of depraved behavior.

But it was how C.D. chose to deal with the adversity that is worth highlighting (and the reason I pick this story as one of the ten best I’ve read this year):

The day after the first, un-memeified picture was posted to Reddit, I went out with my typewriter, very nervous. I tweeted on my “@rovingtypist” Twitter account that Redditors should stop by, say hello, talk about the post if they wanted. Someone responded immediately, told me that I should watch out for bullies—the message itself was more creepy than he probably meant it to be. I was nervous for nothing; a few Redditors came out, took pictures with me, grabbed a story. I was mostly finished for the evening when Carla showed up—Carla was the Brazilian tourist who took the picture of me and put it up onto Reddit. She was sweet and apologetic for the outpouring of hate, as bewildered by it as I was. She took a story as well, although I can’t remember what it was about. I messaged her when I first saw the picture posted with the meme text, letting her know that her picture had been appropriated. “I’m not concerned about it,” she said.

Hers was the position to take, and one I should have adopted earlier.

(10) “The Finish Line” [GQ Magazine] — it would not be an exaggeration to say that the Boston Marathon bombings were among the most important events of 2013. In a thoroughly researched piece for GQ, Sean Flynn profiles the harrowing minutes of the “superhuman effort to help those injured” on that fateful day. The timeline format in which the piece is written only adds to the suspense:

10:00: Finish Line

Charles Krupa has photographed the Boston Marathon twenty-four times, every race since 1986 except for the three when the Associated Press posted him to the Philadelphia office. Krupa shoots a lot of things for the AP, but mostly he does sports. Boston’s a good town for a sports photographer: He’s shot the championships of all four major leagues, been there on the field or the court or the ice, been in the celebrations but not a part of them, the camera lens a small barrier that separates witness from participant.

The marathon coincides with a state holiday, Patriots’ Day, the third Monday in April, so traffic is always light on the drive south from New Hampshire, where Krupa lives. He was at the finish line in Copley Square by eight o’clock for his twenty-fifth marathon. It’s routine by now. Like riding a bike, he says. He’ll shoot from the media bridge spanning Boylston Street a few yards behind the line, like he always does, and his AP partner, Elise Amendola, will shoot from the pavement. He set up a remote camera on a riser to catch the line from the side if the finish is close. He knows exactly what pictures he needs: the wheelchair, men’s and women’s winners breaking the tape, an emotional reaction shot for each if he can get it, the top American finishers. Then he’ll edit those images on his laptop in the media center in the Fairmont Copley Plaza hotel and upload them to the AP’s servers. He might shoot a feature later, a runner crawling across the line or something like that guy last year who finished walking on his hands. Or he might call it a day after lunch.

2:49: The Blast
Inside the Fairmont Copley Plaza, Charles Krupa hears a tremendous metallic bang that reverberates and echoes. It sounds like a Dumpster dropped by a garbage truck in an alley before dawn. His gut tells him he’s just heard a bomb, but his head just as quickly tells him that can’t be true. He wonders if a forklift breaking down the staging might have dropped a scaffold.

Stephen Segatore hears a sound like a steel plate dropped onto cement from twenty feet. Then he feels the puff of a pressure wave that flutters the soft sides of the tent.

Michael Powers is talking to one of the physicians and another athletic trainer in the medical tent, remarking how good the weather’s been for the runners. He hears a bang, like a big firecracker, only an order of magnitude louder. He tells them, “That wasn’t thunder.”

Though the piece was published more than two months after the Boston Marathon bombings, I think it is the best all-around piece of journalism I’ve read on the topic.

###

Bonus (Published by Yours Truly)…

I experimented with writing this year more than in the last few years. To that end, I wrote something personal that can be tagged with #longreads as well. It’s something that I am proud of having compiled in one place, after more than a year of data aggregation, taking copious notes, and flushing ideas through my brain. It’s about my goal of taking control of one aspect of my life: health and fitness.

“How Fitness and Becoming Quantified Self Changed My Life” [Medium]:

I promised my sister that I would join a gym. But this promise was secondary: more importantly, I was making a promise to myself to make a difference in my life. One of my core life philosophies has been this: “If you keep saying you want to make something a priority in your life but aren’t doing something about it, then you have other priorities.” Becoming healthier became my number one priority. This wasn’t a resolution because resolutions never last. But habits do.

When I arrived at the Athletic Club at City Club of Buckhead that morning, I was committed. Having read much research on our mind’s tendency to sway us from sticking to our habits, I made a major financial commitment: I paid for six months of membership at the gym in advance. Plunking down about $350 was meant to serve as a reminder that if I quit, it was going to sting a little. You could call it an insurance policy, but I likened it to an investment in myself. I was going to kick some ass in the next six months.

You can read the entire piece here.

The year 2013 has been another spectacular one for @longreads/#longreads. I can’t wait to see what 2014 will bring.

Crazy Ants are Insane

I meant to highlight this fascinating piece in The New York Times earlier, but better late than never.

First, the name:

The ants are called crazy ants. That’s their actual name. Many people call them Rasberry crazy ants, and some people call them Tawny crazy ants and refuse to call them Rasberry crazy ants. 

The “Rasberry” comes not from a scientist or a professor, but from an exterminator who noticed these wild ants in Texas.

Rasberry crazy ants do not have a painful bite, but they effectively terrorize people by racing up their feet and around their bodies, coursing everywhere in their impossibly disordered orbits. (They’re called crazy ants because their behavior seems psychotic.) Some people in Texas have become so frustrated with crazy ants that they have considered selling their houses or been driven to the verge of divorce. “Usually, the husband doesn’t think it’s such a big deal, and the wife is going batty,” one exterminator explained. An attorney living on an infested farm south of Houston told me: “It reminds me of the scenes in Africa, where you see flies crawling all over people. Occasionally they’ll knock one off, but for the most part they’re so accustomed to it that they finally give up.”

Crazy ants decimate native insects. They overtake beehives and destroy the colonies. They may smother bird chicks struggling to hatch. In South America, where scientists now believe the ants originated, they have been known to obstruct the nasal cavities of chickens and asphyxiate the birds. They swarm into cows’ eyes.

So far, there is no way to contain them. In the fall, when the temperature drops, the worker ants are subject to magnificent die-offs, but the queens survive, and a new, often larger crop of crazy ants pours back in the following spring. Rasberry crazy ants were first discovered in Texas by an exterminator in 2002. Within five years, they appeared to be spreading through the state much faster than even the red imported fire ant has. The fire ant is generally considered one of the worst invasive species in the world. The cost of fire ants to Texas has been estimated at more than $1 billion a year.

Here is a three-year-old video that shows how fast these crazy ants scurry about:

 

Definitely worth reading the entire thing. Fascinating reporting. And scary how species can be so invasive!