On the Brief History of the SAT Exam

From this excellent piece about a mother who decided to take the SATs (and, naturally, to write a book about it: The Perfect Score Project: Uncovering the Secrets of the SAT), we learn about the history of the SAT exam:

The SATs were administered for the first time on June 23, 1926. Intelligence testing was a new but rapidly expanding enterprise; during the First World War, the United States Army had given I.Q. tests to nearly two million soldiers to determine who was officer material. (Walter Lippmann dismissed these tests as “quackery in a field where quacks breed like rabbits.”) The SAT’s inventor, a Princeton professor named Carl Campbell Brigham, had worked on the Army’s I.Q. test, and the civilian exam he came up with was a first cousin to the military’s. It contained some questions on math and some on identifying shapes. Mostly, though, it focussed on vocabulary. Brigham intended the test to be administered to students who had already been admitted to college, for the purposes of guidance and counselling. Later, he argued that it was foolish to believe, as he once had, that the test measured “native intelligence.” Rather, he wrote, scores were an index of a person’s “schooling, family background, familiarity with English, and everything else.”

By this point, though, the test had already been adopted for a new purpose. In 1933, James Bryant Conant, a chemist, became the president of Harvard. Conant, the product of a middle-class family, was dismayed by what he saw as the clubbiness of the school’s student body and set out to attract fresh talent. In particular, he wanted to recruit bright young men from public schools in the Midwest, few of whom traditionally applied to Harvard. Conant’s plan was to offer scholarships to ten such students each year. To select them, he decided to employ the SAT. As Nicholas Lemann observes in his book “The Big Test” (1999), this was one of those small decisions “from which great consequences later flow.” Not long after Harvard started using the SAT, Princeton, Columbia, and Yale followed suit. More and more colleges adopted the test until, by the mid-nineteen-fifties, half a million kids a year were taking it.

In the early decades of the test, scores were revealed only to schools, not to students. This made it difficult to assess the claim made by the College Board, the exam’s administrator, that studying for the SATs would serve no purpose. Still, a brash young high-school tutor named Stanley Kaplan concluded, based on the feedback he was getting from his pupils, that the claim was a crock. Kaplan began offering SAT prep classes out of his Brooklyn basement. Accusations that he was a fraud and a “snake oil salesman” failed to deter his clientele; the students just kept on coming. In the nineteen-seventies, Kaplan expanded his operations into cities like Philadelphia, Los Angeles, Chicago, and Miami; this is when the Federal Trade Commission decided to investigate his claims. The commission found that Kaplan was right: tutoring did boost scores, if not by as much as his testing service advertised. The College Board implicitly conceded the point in 1994, when it changed the meaning of the SAT’s central “A”; instead of “aptitude” it came to stand for “assessment.” Then the board took the even more radical step of erasing the meaning of the name altogether. Today, the letters “SAT” stand for nothing more (or less) than the SATs. As the Lord put it to Moses, “I am that I am.”

Read the rest here.


The 2014 Annual Shareholder Letter from Warren Buffett

Fortune Magazine has a sneak peek into the annual shareholder letter that Warren Buffett will soon share with Berkshire Hathaway shareholders. He shares two personal stories from his life and how those investment decisions have paid off over time. He distills his wisdom into the following points:

  • You don’t need to be an expert in order to achieve satisfactory investment returns. But if you aren’t, you must recognize your limitations and follow a course certain to work reasonably well. Keep things simple and don’t swing for the fences. When promised quick profits, respond with a quick “no.”

  • Focus on the future productivity of the asset you are considering. If you don’t feel comfortable making a rough estimate of the asset’s future earnings, just forget it and move on. No one has the ability to evaluate every investment possibility. But omniscience isn’t necessary; you only need to understand the actions you undertake.

  • If you instead focus on the prospective price change of a contemplated purchase, you are speculating. There is nothing improper about that. I know, however, that I am unable to speculate successfully, and I am skeptical of those who claim sustained success at doing so. Half of all coin-flippers will win their first toss; none of those winners has an expectation of profit if he continues to play the game. And the fact that a given asset has appreciated in the recent past is never a reason to buy it.

  • With my two small investments, I thought only of what the properties would produce and cared not at all about their daily valuations. Games are won by players who focus on the playing field — not by those whose eyes are glued to the scoreboard. If you can enjoy Saturdays and Sundays without looking at stock prices, give it a try on weekdays.

  • Forming macro opinions or listening to the macro or market predictions of others is a waste of time. Indeed, it is dangerous because it may blur your vision of the facts that are truly important. (When I hear TV commentators glibly opine on what the market will do next, I am reminded of Mickey Mantle’s scathing comment: “You don’t know how easy this game is until you get into that broadcasting booth.”)

Read the rest here.

A Zircon Crystal on Earth Dated to 4.4 Billion Years Old

From a recently published paper in Nature Geoscience, we learn that the oldest dated piece of Earth's crust is 4.4 billion years old. It is a piece of zircon crystal measuring just 400 micrometers in its longest dimension, a bit larger than a house dust mite, or about the width of four human hairs:

The only physical evidence from the earliest phases of Earth’s evolution comes from zircons, ancient mineral grains that can be dated using the U–Th–Pb geochronometer. Oxygen isotope ratios from such zircons have been used to infer when the hydrosphere and conditions habitable to life were established. Chemical homogenization of Earth’s crust and the existence of a magma ocean have not been dated directly, but must have occurred earlier. However, the accuracy of the U–Pb zircon ages can plausibly be biased by poorly understood processes of intracrystalline Pb mobility. Here we use atom-probe tomography to identify and map individual atoms in the oldest concordant grain from Earth, a 4.4-Gyr-old Hadean zircon with a high-temperature overgrowth that formed about 1 Gyr after the mineral’s core. Isolated nanoclusters, measuring about 10 nm and spaced 10–50 nm apart, are enriched in incompatible elements including radiogenic Pb with unusually high 207Pb/206Pb ratios. We demonstrate that the length scales of these clusters make U–Pb age biasing impossible, and that they formed during the later reheating event. Our tomography data thereby confirm that any mixing event of the silicate Earth must have occurred before 4.4 Gyr ago, consistent with magma ocean formation by an early moon-forming impact about 4.5 Gyr ago.

###

(via CNN)

Humans Now Interpret the Smiley Emoticon :-) as a Real Smile

This is a fascinating new paper, published in Social Neuroscience, that explains how our brains now interpret the smiley emoticon as a real smile. You know, the :-) or perhaps simply the :) ?


Researchers at Australia’s Flinders University showed twenty participants smiley faces, along with real faces and strings of symbols that shouldn’t look like faces, all while recording the signals in the region of the brain that’s primarily activated when we see faces. This signal, called the N170 event-related potential, was highest when people saw actual faces, but it was also high when people saw the standard emoticon :). “This indicates that when upright, emoticons are processed in occipitotemporal sites similarly to faces due to their familiar configuration,” the researchers write.

Full study (PDF) here.

###

(via ABC Australia)

On Online-Only Love Affairs

The most commonly written-about topic in 2011 was online-only love affairs. Rather than trying to figure out how to navigate a sexual relationship that excluded emotion, the writers were trying to figure out how to navigate an emotional relationship that excluded sex.

Fascinating but alarming revelation in the most recent story published in the Modern Love section at The New York Times.

How Creativity is Becoming an Academic Discipline

A fascinating New York Times piece on how some schools are turning the once-obscure field of creativity studies into an academic discipline:

Once considered the product of genius or divine inspiration, creativity — the ability to spot problems and devise smart solutions — is being recast as a prized and teachable skill. Pin it on pushback against standardized tests and standardized thinking, or on the need for ingenuity in a fluid landscape.

“The reality is that to survive in a fast-changing world you need to be creative,” says Gerard J. Puccio, chairman of the International Center for Studies in Creativity at Buffalo State College, which has the nation’s oldest creative studies program, having offered courses in it since 1967.

“That is why you are seeing more attention to creativity at universities,” he says. “The marketplace is demanding it.”

Critical thinking has long been regarded as the essential skill for success, but it’s not enough, says Dr. Puccio. Creativity moves beyond mere synthesis and evaluation and is, he says, “the higher order skill.” This has not been a sudden development. Nearly 20 years ago “creating” replaced “evaluation” at the top of Bloom’s Taxonomy of learning objectives. In 2010 “creativity” was the factor most crucial for success found in an I.B.M. survey of 1,500 chief executives in 33 industries. These days “creative” is the most used buzzword in LinkedIn profiles two years running.

Very good point in this last paragraph:

The point of creative studies, says Roger L. Firestien, a Buffalo State professor and author of several books on creativity, is to learn techniques “to make creativity happen instead of waiting for it to bubble up. A muse doesn’t have to hit you.”

Also see the related slide show here; this one is my favorite:

(Image: vitamin fork and spoon)

On the Dangers of Certainty and the Importance of Tolerance

One of the best op-eds I have ever read is by Simon Critchley, recently published in The New York Times under the title “The Dangers of Certainty: A Lesson From Auschwitz.” It’s an absolute must-read:

For Dr. Bronowski, the moral consequence of knowledge is that we must never judge others on the basis of some absolute, God-like conception of certainty. All knowledge, all information that passes between human beings, can be exchanged only within what we might call “a play of tolerance,” whether in science, literature, politics or religion. As he eloquently put it, “Human knowledge is personal and responsible, an unending adventure at the edge of uncertainty.”

The relationship between humans and nature and humans and other humans can take place only within a certain play of tolerance. Insisting on certainty, by contrast, leads ineluctably to arrogance and dogma based on ignorance.

Before you read the rest, watch this powerful video filmed at the Auschwitz Concentration Camp, in which Dr. Bronowski reflects on the millions of lives extinguished at that location:

Dr. Bronowski was the man who developed the TV show The Ascent of Man, which aired on the BBC in the 1970s. Continuing,

The play of tolerance opposes the principle of monstrous certainty that is endemic to fascism and, sadly, not just fascism but all the various faces of fundamentalism. When we think we have certainty, when we aspire to the knowledge of the gods, then Auschwitz can happen and can repeat itself. Arguably, it has repeated itself in the genocidal certainties of past decades.

The pursuit of scientific knowledge is as personal an act as lifting a paintbrush or writing a poem, and they are both profoundly human. If the human condition is defined by limitedness, then this is a glorious fact because it is a moral limitedness rooted in a faith in the power of the imagination, our sense of responsibility and our acceptance of our fallibility. We always have to acknowledge that we might be mistaken. When we forget that, then we forget ourselves and the worst can happen.

This kind of philosophy has been ingrained in me from the youngest age. Be a skeptic. Question assumptions. Do not take anything as an absolute truth.

An incredible, must-read piece. I am going to try to find old videos of The Ascent of Man and watch them in my spare time.