On the Brief History of the SAT Exam

From this excellent piece about a mother who decided to take the SATs (and, naturally, to write a book about it: The Perfect Score Project: Uncovering the Secrets of the SAT), we learn about the history of the exam:

The SATs were administered for the first time on June 23, 1926. Intelligence testing was a new but rapidly expanding enterprise; during the First World War, the United States Army had given I.Q. tests to nearly two million soldiers to determine who was officer material. (Walter Lippmann dismissed these tests as “quackery in a field where quacks breed like rabbits.”) The SAT’s inventor, a Princeton professor named Carl Campbell Brigham, had worked on the Army’s I.Q. test, and the civilian exam he came up with was a first cousin to the military’s. It contained some questions on math and some on identifying shapes. Mostly, though, it focussed on vocabulary. Brigham intended the test to be administered to students who had already been admitted to college, for the purposes of guidance and counselling. Later, he argued that it was foolish to believe, as he once had, that the test measured “native intelligence.” Rather, he wrote, scores were an index of a person’s “schooling, family background, familiarity with English, and everything else.”

By this point, though, the test had already been adopted for a new purpose. In 1933, James Bryant Conant, a chemist, became the president of Harvard. Conant, the product of a middle-class family, was dismayed by what he saw as the clubbiness of the school’s student body and set out to attract fresh talent. In particular, he wanted to recruit bright young men from public schools in the Midwest, few of whom traditionally applied to Harvard. Conant’s plan was to offer scholarships to ten such students each year. To select them, he decided to employ the SAT. As Nicholas Lemann observes in his book “The Big Test” (1999), this was one of those small decisions “from which great consequences later flow.” Not long after Harvard started using the SAT, Princeton, Columbia, and Yale followed suit. More and more colleges adopted the test until, by the mid-nineteen-fifties, half a million kids a year were taking it.

In the early decades of the test, scores were revealed only to schools, not to students. This made it difficult to assess the claim made by the College Board, the exam’s administrator, that studying for the SATs would serve no purpose. Still, a brash young high-school tutor named Stanley Kaplan concluded, based on the feedback he was getting from his pupils, that the claim was a crock. Kaplan began offering SAT prep classes out of his Brooklyn basement. Accusations that he was a fraud and a “snake oil salesman” failed to deter his clientele; the students just kept on coming. In the nineteen-seventies, Kaplan expanded his operations into cities like Philadelphia, Los Angeles, Chicago, and Miami; this is when the Federal Trade Commission decided to investigate his claims. The commission found that Kaplan was right: tutoring did boost scores, if not by as much as his testing service advertised. The College Board implicitly conceded the point in 1994, when it changed the meaning of the SAT’s central “A”; instead of “aptitude” it came to stand for “assessment.” Then the board took the even more radical step of erasing the meaning of the name altogether. Today, the letters “SAT” stand for nothing more (or less) than the SATs. As the Lord put it to Moses, “I am that I am.”

Read the rest here.

The 2014 Annual Shareholder Letter from Warren Buffett

Fortune Magazine has a sneak peek at the annual shareholder letter that Warren Buffett will soon share with Berkshire Hathaway shareholders. He tells two personal stories from his life, explains how those investment decisions paid off over time, and distills his wisdom into the following points:

  • You don’t need to be an expert in order to achieve satisfactory investment returns. But if you aren’t, you must recognize your limitations and follow a course certain to work reasonably well. Keep things simple and don’t swing for the fences. When promised quick profits, respond with a quick “no.”

  • Focus on the future productivity of the asset you are considering. If you don’t feel comfortable making a rough estimate of the asset’s future earnings, just forget it and move on. No one has the ability to evaluate every investment possibility. But omniscience isn’t necessary; you only need to understand the actions you undertake.

  • If you instead focus on the prospective price change of a contemplated purchase, you are speculating. There is nothing improper about that. I know, however, that I am unable to speculate successfully, and I am skeptical of those who claim sustained success at doing so. Half of all coin-flippers will win their first toss; none of those winners has an expectation of profit if he continues to play the game. And the fact that a given asset has appreciated in the recent past is never a reason to buy it.

  • With my two small investments, I thought only of what the properties would produce and cared not at all about their daily valuations. Games are won by players who focus on the playing field — not by those whose eyes are glued to the scoreboard. If you can enjoy Saturdays and Sundays without looking at stock prices, give it a try on weekdays.

  • Forming macro opinions or listening to the macro or market predictions of others is a waste of time. Indeed, it is dangerous because it may blur your vision of the facts that are truly important. (When I hear TV commentators glibly opine on what the market will do next, I am reminded of Mickey Mantle’s scathing comment: “You don’t know how easy this game is until you get into that broadcasting booth.”)
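
Buffett's coin-flip observation in the third bullet above is simple arithmetic, and a quick simulation makes it concrete. Below is a toy Python sketch with invented numbers, not anything from the letter: half of all players win their first toss, yet the expected profit from continuing to flip is zero.

    import random

    random.seed(42)
    players = 100_000  # invented population of coin-flippers

    # Half of all players win their first toss, by construction of a fair coin.
    first_toss_wins = sum(random.random() < 0.5 for _ in range(players))
    print(f"won the first toss: {first_toss_wins / players:.1%}")  # ~50.0%

    # Expected profit per additional fair flip is (1/2)(+1) + (1/2)(-1) = 0,
    # no matter how many tosses a player has already won.
    profits = [sum(random.choice((1, -1)) for _ in range(20)) for _ in range(players)]
    print(f"mean profit after 20 more flips: {sum(profits) / players:+.3f}")  # ~0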

Read the rest here.

A Zircon Crystal on Earth Dated to 4.4 Billion Years Old

From a recently published paper in Nature Geoscience, we learn that the oldest dated piece of Earth's crust is currently dated to 4.4 billion years old. It is a zircon crystal just 400 micrometers in its longest dimension, a bit larger than a house dust mite and about the width of four human hairs:

The only physical evidence from the earliest phases of Earth’s evolution comes from zircons, ancient mineral grains that can be dated using the U–Th–Pb geochronometer [1]. Oxygen isotope ratios from such zircons have been used to infer when the hydrosphere and conditions habitable to life were established [2,3]. Chemical homogenization of Earth’s crust and the existence of a magma ocean have not been dated directly, but must have occurred earlier [4]. However, the accuracy of the U–Pb zircon ages can plausibly be biased by poorly understood processes of intracrystalline Pb mobility [5–7]. Here we use atom-probe tomography [8] to identify and map individual atoms in the oldest concordant grain from Earth, a 4.4-Gyr-old Hadean zircon with a high-temperature overgrowth that formed about 1 Gyr after the mineral’s core. Isolated nanoclusters, measuring about 10 nm and spaced 10–50 nm apart, are enriched in incompatible elements including radiogenic Pb with unusually high ²⁰⁷Pb/²⁰⁶Pb ratios. We demonstrate that the length scales of these clusters make U–Pb age biasing impossible, and that they formed during the later reheating event. Our tomography data thereby confirm that any mixing event of the silicate Earth must have occurred before 4.4 Gyr ago, consistent with magma ocean formation by an early moon-forming impact [4] about 4.5 Gyr ago.
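
For context, and not a formula from the paper: any U–Pb date rests on the standard decay-age relation, in which the measured ratio of radiogenic daughter to remaining parent fixes the crystallization age. With λ₂₃₈ the ²³⁸U decay constant and ²⁰⁶Pb* the radiogenic lead, in LaTeX:

    t = \frac{1}{\lambda_{238}} \ln\!\left(1 + \frac{^{206}\mathrm{Pb}^{*}}{^{238}\mathrm{U}}\right),
    \qquad \lambda_{238} \approx 1.55 \times 10^{-10}\ \mathrm{yr}^{-1}

A measured ²⁰⁶Pb*/²³⁸U ratio of roughly 0.97 gives t = ln(1.97)/λ₂₃₈ ≈ 4.4 billion years; the paper's nanoscale mapping shows that Pb mobility within the crystal could not have skewed such a ratio.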

###

(via CNN)

Humans Now Interpret the Smiley Emoticon :-) as a Real Smile

This is a fascinating new paper, published in Social Neuroscience, that explains how our brains now interpret the smiley emoticon as a real smile. You know, the :-) or perhaps simply the :) ?

Researchers at Australia’s Flinders University showed twenty participants smiley-face emoticons, along with real faces and strings of symbols that shouldn’t look like faces, all while recording the signals in the region of the brain that’s primarily activated when we see faces. This signal, called the N170 event-related potential, is highest when people see actual faces, but it was also high when people saw the standard emoticon :). “This indicates that when upright, emoticons are processed in occipitotemporal sites similarly to faces due to their familiar configuration,” the researchers write.
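
As background on the method, here is a minimal sketch of how an event-related potential like the N170 is computed, using synthetic data rather than the study's actual pipeline (the sampling rate, trial count, and amplitudes below are invented): averaging many EEG epochs time-locked to the stimulus cancels random noise and leaves the stimulus-locked response.

    import numpy as np

    rng = np.random.default_rng(0)
    fs = 500                       # sampling rate in Hz (assumed)
    n_trials, n_samples = 40, fs   # forty 1-second epochs, one per stimulus
    t = np.arange(n_samples) / fs  # time in seconds after stimulus onset

    # Synthetic single-trial EEG: a negative deflection ~170 ms after the
    # stimulus (the "N170"), buried in much larger random noise.
    n170 = -5e-6 * np.exp(-((t - 0.170) / 0.02) ** 2)
    trials = n170 + 2e-5 * rng.standard_normal((n_trials, n_samples))

    # Averaging across time-locked trials is the whole trick: the ERP emerges.
    erp = trials.mean(axis=0)

    # N170 amplitude: the most negative value in a 130-200 ms window. The study
    # compares this amplitude across faces, emoticons, and symbol strings.
    window = (t >= 0.130) & (t <= 0.200)
    print(f"N170 amplitude: {erp[window].min() * 1e6:.1f} microvolts")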

Full study (PDF) here.

###

(via ABC Australia)

On Online-Only Love Affairs

The most commonly written-about topic in 2011 was online-only love affairs. Rather than trying to figure out how to navigate a sexual relationship that excluded emotion, writers were trying to figure out how to navigate an emotional relationship that excluded sex.

A fascinating but alarming revelation from the most recent story published in the Modern Love section of The New York Times.

How Creativity is Becoming an Academic Discipline

A fascinating New York Times piece on how some schools are turning the once-obscure field of creativity into an academic discipline:

Once considered the product of genius or divine inspiration, creativity — the ability to spot problems and devise smart solutions — is being recast as a prized and teachable skill. Pin it on pushback against standardized tests and standardized thinking, or on the need for ingenuity in a fluid landscape.

“The reality is that to survive in a fast-changing world you need to be creative,” says Gerard J. Puccio, chairman of the International Center for Studies in Creativity at Buffalo State College, which has the nation’s oldest creative studies program, having offered courses in it since 1967.

“That is why you are seeing more attention to creativity at universities,” he says. “The marketplace is demanding it.”

Critical thinking has long been regarded as the essential skill for success, but it’s not enough, says Dr. Puccio. Creativity moves beyond mere synthesis and evaluation and is, he says, “the higher order skill.” This has not been a sudden development. Nearly 20 years ago “creating” replaced “evaluation” at the top of Bloom’s Taxonomy of learning objectives. In 2010 “creativity” was the factor most crucial for success found in an I.B.M. survey of 1,500 chief executives in 33 industries. These days “creative” is the most used buzzword in LinkedIn profiles two years running.

Very good point in this last paragraph:

The point of creative studies, says Roger L. Firestien, a Buffalo State professor and author of several books on creativity, is to learn techniques “to make creativity happen instead of waiting for it to bubble up. A muse doesn’t have to hit you.”

Also see the related slide show here; this one is my favorite:

[Image: vitamin_fork_and_spoon]

On the Dangers of Certainty and the Importance of Tolerance

One of the best op-eds I have ever read is by Simon Critchley, recently published in The New York Times under the title “The Dangers of Certainty: A Lesson From Auschwitz.” It’s an absolute must-read:

For Dr. Bronowski, the moral consequence of knowledge is that we must never judge others on the basis of some absolute, God-like conception of certainty. All knowledge, all information that passes between human beings, can be exchanged only within what we might call “a play of tolerance,” whether in science, literature, politics or religion. As he eloquently put it, “Human knowledge is personal and responsible, an unending adventure at the edge of uncertainty.”

The relationship between humans and nature and humans and other humans can take place only within a certain play of tolerance. Insisting on certainty, by contrast, leads ineluctably to arrogance and dogma based on ignorance.

Before you read the rest, watch this powerful video filmed at the Auschwitz concentration camp, in which Dr. Bronowski reflects on the millions of lives extinguished at that location:

Dr. Bronowski was the man who developed the TV series The Ascent of Man, which aired on the BBC in the 1970s. Continuing:

The play of tolerance opposes the principle of monstrous certainty that is endemic to fascism and, sadly, not just fascism but all the various faces of fundamentalism. When we think we have certainty, when we aspire to the knowledge of the gods, then Auschwitz can happen and can repeat itself. Arguably, it has repeated itself in the genocidal certainties of past decades.

The pursuit of scientific knowledge is as personal an act as lifting a paintbrush or writing a poem, and they are both profoundly human. If the human condition is defined by limitedness, then this is a glorious fact because it is a moral limitedness rooted in a faith in the power of the imagination, our sense of responsibility and our acceptance of our fallibility. We always have to acknowledge that we might be mistaken. When we forget that, then we forget ourselves and the worst can happen.

This kind of philosophy has been ingrained in me from a very young age. Be a skeptic. Question assumptions. Do not take anything as absolute truth.

An incredible, must-read piece. I am going to try to find old videos of The Ascent of Man and watch them in my spare time.

Life as a Video Game

A fun, thoughtful post by Oliver Emberton, likening life to a game:

Every decision you have to make costs willpower, and decisions where you have to suppress an appealing option for a less appealing one (e.g. exercise instead of watch TV) require a lot of willpower.

There are various tricks to keep your behaviour in line:

  1. Keep your state high. If you’re hungry, exhausted, or utterly deprived of fun, your willpower will collapse. Ensure you take consistently good care of yourself.
  2. Don’t demand too much willpower from one day. Spread your most demanding tasks over multiple days, and mix them in with less demanding ones.
  3. Attempt the most important tasks first. This makes other tasks more difficult, but makes your top task more likely.
  4. Reduce the need to use willpower by reducing choices. If you’re trying to work on a computer that can access Facebook, you’ll need more willpower because you’re constantly choosing the hard task over the easy one. Eliminate such distractions.

A key part of playing the game is balancing your competing priorities with the state of your body. Just don’t leave yourself on autopilot, or you’ll never get anything done.
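
Emberton's picture of willpower as a depletable daily resource is easy to caricature in code. Here is a toy Python sketch of the tricks above (all task names, costs, and budgets are invented): attempt the hardest tasks first while willpower is high, and note how a poor state inflates the cost of everything.

    # Toy model: tasks cost willpower, and your state scales every cost.
    tasks = [("answer email", 2), ("exercise", 4), ("write report", 5)]

    def run_day(tasks, willpower=12.0, state=1.0):
        """Attempt tasks hardest-first (trick 3). A low state, i.e. hungry,
        exhausted, or fun-deprived, inflates every cost (trick 1)."""
        done = []
        for name, cost in sorted(tasks, key=lambda task: -task[1]):
            effective_cost = cost / state
            if willpower >= effective_cost:
                willpower -= effective_cost
                done.append(name)
        return done

    print(run_day(tasks, state=1.0))  # well-rested: everything gets done
    print(run_day(tasks, state=0.6))  # run-down: exercise is the first casualty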

Worth clicking for the 8-bit character representations alone.

The Jamaican Bobsled Team is Headed to Sochi

This is a cool story (get it, cool?) about the Jamaican bobsled team heading to Sochi for the 2014 Winter Olympics. As the team was unable to cover all of its expenses, the founders of Dogecoin stepped in to help:

The team — led by Winston Watt, a veteran driver who piloted the Olympic team in 2002, and anchored by brakeman Marvin Dixon — has found support among the founders of Dogecoin, the cryptocurrency that is part Bitcoin, part Internet dog meme. Liam Butler, who runs the Dogecoin Foundation, started Dogesled to raise money for the team after hearing that Watt had personally funded its trip to a training session but would be unable to come up with the money needed to fly the team to Sochi, Russia. The Dogesled fund raised almost $25,000 in 12 hours — causing the Dogecoin to Bitcoin exchange rate to spike by about 50 percent — and has reached its $30,000 goal.

No doubt the push to send the team to the Olympics is fueled by the cult following of 1993’s “Cool Runnings,” the quintessential underdog movie. The film was loosely based on the 1988 Jamaican bobsled team that became the country’s first to qualify for the Olympics, an incredible feat for four men from a tropical country with little to no experience on snow. As I’ve noted before, the generation that grew up in the ’90s is and should be a prime target for sports leagues and marketers — we’re more engaged with social media and willing to spend our money on such nostalgia.

Cool Runnings!

On the Strange Transcendence of Flappy Bird

Curious to find out what the big deal was, I downloaded a game called Flappy Bird on my iPhone. I've been playing it for about two hours, and yes, it is very addictive (and difficult!). In a great piece for The Atlantic, video game critic Ian Bogost explains why Flappy Bird is popular, difficult, and addictive:

Flappy Bird is a perversely, oppressively difficult game. Scoring even a single point takes most players a considerable number of runs. After an hour, I’d managed a high score of two. Many, many hours of play later, my high score is 32, a feat that has earned me the game’s gold medal (whatever that means).

There is a tradition of such super-difficult games, sometimes called masocore among the videogame-savvy. Masocore games are normally characterized by trial-and-error gameplay, but split up into levels or areas to create a sense of overall progress. Commercial blockbusters like Mega Man inaugurated the category (even if the term “masocore” appeared long after Capcom first released that title in 1987), and more recent independent titles like I Wanna Be The Guy and Super Meat Boy have further explored the idea of intense difficulty as a primary aesthetic. Combined with repetition and progression, the intense difficulty of masocore games often produces a feeling of profound accomplishment, an underdog’s victory in the dorky medium of underdogs themselves, 2d platformer videogames.

On what makes Flappy Bird so difficult:

Contemporary design practice surely would recommend an “easy” first pipe sequence to get the player started, perhaps a few pipes positioned at the bird’s initial position, or with wider openings for easier passage. More difficult maneuvers, such as quick shifts from high to low pipe openings, would be reserved for later in the game, with difficulty ramping up as the player demonstrates increased expertise.

But Flappy Bird offers no such scaffolding. Instead, every pipe and every point is completely identical: randomly positioned but uniform in every other way. A game of Flappy Bird is a series of identical maneuvers, one after the other. All you have to do is keep responding to them, a task made possible by the game’s predictable and utterly reasonable interactions. Just keep flapping.
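
To make that concrete, here is a hedged sketch of the pipe generator Bogost describes, not the game's actual source code (the screen size, gap size, and margin are invented): every opening is identical, and only its vertical position is random.

    import random

    SCREEN_H = 512  # playfield height in pixels (invented)
    GAP_SIZE = 120  # every pipe has an identical opening: uniform difficulty
    MARGIN = 60     # keep openings away from the screen edges

    def next_pipe():
        """Every pipe is the same maneuver: no easing-in, no difficulty ramp,
        just a randomly placed but otherwise identical obstacle."""
        gap_top = random.randint(MARGIN, SCREEN_H - MARGIN - GAP_SIZE)
        return {"gap_top": gap_top, "gap_bottom": gap_top + GAP_SIZE}

    # An endless, uniform series of the same maneuver. Just keep flapping.
    for _ in range(5):
        print(next_pipe())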

This is a very introspective analysis:

What we appreciate about Flappy Bird is not the details of its design, but the fact that it embodies them with such unflappable nonchalance. The best games cease to be for us (or for anyone) and instead strive to be what they are as much as possible. From this indifference emanates a strange squalor that we can appreciate as beauty.

And then Bogost gets transcendental:

Flappy Bird is not amateurish nor sociopathic. Instead, it is something more unusual. It is earnest. It is exactly what it is, and it is unapologetic. Not even unapologetic—stoic, aloof. Impervious. Like a meteorite that crashed through a desert motel lobby, hot and small and unaware.

I flap, therefore I am.