NASA’s Glorious Recreation of the “Earthrise” Photograph from December 24, 1968

On the morning of December 24, 1968, the onboard cameras on NASA’s Apollo 8 spacecraft were focused on the lunar surface. However, that morning unfolded with a tiny bit of the unexpected. On board the spacecraft were astronauts Frank Borman, James Lovell, and William Anders. They all later recalled that perhaps the most important thing they discovered on their mission was Earth:

The famous image from Apollo 8 now known simply as Earthrise.

In a newly released video by NASA, seen below, NASA scientists use a number of photo mosaics and elevation data from their Lunar Reconnaissance Orbiter (LRO) to reconstruct for the very first time, 45 years later, exactly what these Apollo 8 astronauts saw on that December morning. As you listen to the talk, narrated by Andrew Chaikin (author of A Man on the Moon: The Voyages of the Apollo Astronauts), you’ll understand how this famous image (now known simply, elegantly as Earthrise) almost did not come to exist. Earthrise was captured on colour film with a modified Hasselblad 500 EL at 1/250 of a second at f/11, as you’ll hear in the film. The video, which can be viewed in 1080p HD, is well worth the seven minutes of your time:

Per the caption of the video:

The visualization draws on numerous historical sources, including the actual cloud pattern on Earth from the ESSA-7 satellite and dozens of photographs taken by Apollo 8, and it reveals new, historically significant information about the Earthrise photographs. It has not been widely known, for example, that the spacecraft was rolling when the photos were taken, and that it was this roll that brought the Earth into view. The visualization establishes the precise timing of the roll and, for the first time ever, identifies which window each photograph was taken from.

The key to the new work is a set of vertical stereo photographs taken by a camera mounted in the Command Module’s rendezvous window and pointing straight down onto the lunar surface. It automatically photographed the surface every 20 seconds. By registering each photograph to a model of the terrain based on LRO data, the orientation of the spacecraft can be precisely determined.

A still from NASA’s new visualization of how Earthrise came to be.

NASA’s visualization of Earthrise.

Errol Morris on the Abraham Lincoln Portraits

I spent the better part of the afternoon reading Errol Morris’s fascinating series “The Interminable, Everlasting Lincolns” in The New York Times, in which he sets out to establish how the last known photographs (portraits) of Abraham Lincoln came to be. The prologue sets the tone with a vivid dream that Lincoln presumably had a few days before his assassination, but it’s in Part I where Errol Morris comes out firing:

The story of the crack, along with the original April 9 date, was printed in The New York Times on Feb. 12, 1922. O-118 was captioned: “The President Sat for This Photograph Just Five Days Before Booth Shot Him. The Cracked Negative Caused it To Be Discarded. It Has Only Once Before Been Published, and Then in a Retouched Form.” The accompanying text by James Young read:

Probably no other photograph of Lincoln conveys more clearly the abiding sadness of the face. The lines of time and care are deeply etched, and he has the look of a man bordering upon old age, though he was only 56. Proof that the camera was but a few feet away may be found by scrutiny of this picture… The print has been untouched, and this picture is an exact likeness of the President as he looked in the week of his death. [10]


This is Errol Morris’s motivation for writing the series:

My fascination with the dating and interpretation of photographs is really a fascination with the push-pull of history. Facts vs. beliefs. Our desire to know the origins of things vs. our desire to rework, to reconfigure the past to suit our own beliefs and predilections. Perhaps nothing better illustrates this than two radically different predispositions to objects — the storyteller vs. the collector.


The infamous “crack” photograph of Abraham Lincoln.

For the collector the image with the crack is a damaged piece of goods — the crack potentially undermining the value of the photograph as an artifact, a link to the past. The storyteller doesn’t care about the photograph’s condition, or its provenance, but about its thematic connections with events. To the storyteller, the crack is the beginning of a legend — the legend of a death foretold. The crack seems to anticipate the bullet fired into the back of Lincoln’s head at Ford’s Theater on Good Friday, April 14, 1865.

It should have a name. I call it “the proleptic crack.” 

Errol Morris continues:

Holzer’s enterprise is to weave a context — a story — around photographs and significant events in American history. If Meserve were correct — if Gardner took his photographs of Lincoln on April 10, if the negative cracked just days before Lincoln was shot — it would make for a better story. But that story, like so many “better stories,” isn’t true. 

Part II of the series is here. Parts III and IV will follow soon.


(hat tip: @kottke)

The Origin of Black Friday

Writing in The New Yorker, Amy Merrick profiles the origin of “Black Friday”:

Beginning in the nineteen-thirties, thousands of fans thronged Philadelphia’s Municipal Stadium for the Army-Navy football game. As festive as the mood was inside the stadium, it wasn’t nearly so cheerful for the Philadelphia police officers who had to herd the crowds. The game was frequently held on the Saturday after Thanksgiving, and just as visiting fans were showing up the day before, holiday shoppers also would descend on downtown. On those Fridays after Thanksgiving, the late Joseph P. Barrett, a longtime reporter for the Philadelphia Bulletin, recalled, even members of the police band were called upon to direct traffic. The cops nicknamed the day of gridlock Black Friday, and soon others started to do the same.

Retailers worried the phrase would scare people away. A few weeks after the 1961 game, which President John F. Kennedy had attended, the P.R. pioneer Denny Griswold described in her industry newsletter, Public Relations News, the efforts by Philadelphia merchants and city officials to rebrand the day Big Friday, in reference to the start of the holiday shopping season. (“The media coöperated,” Griswold wrote.) Big Friday didn’t stick, but the idea behind it did, in Philadelphia and, eventually, beyond. A few decades later, when the term came to describe a day when retailers’ ledgers shifted “into the black” for the year—a connotation also pushed by marketers—people assumed that had always been the connotation.

How is your Black Friday shopping going? Or is it?

Vancouver’s Ban on Doorknobs

An interesting report from The Vancouver Sun, noting that the city has passed legislation to do away with the doorknob on all future construction in the city:

Vancouver is the only city in Canada with its own building code, so the changes made here are often chased into the B.C. Building Code and Canada’s National Building Code, and then put into practice in cities and towns across Canada. Vancouver’s influence is wide. And as go the codes, so too goes the construction industry.

Remember the regular toilet? Try to find one. Low-flush is all there is to be had. The incandescent light bulb? Sorry, just energy-saving fluorescent or LED now in most stores.

The change has crept up on us silently and without fanfare. Look at any new condo building. Any new office door. Any door to a public washroom that doesn’t have pneumatic hinges and a push-pad. There they are, these silver, black or brass-coloured levers that can spring a door open with even a forearm when hands are filled.

And, as doorknobs go, so too will go those other ubiquitous knobs, the ones that turn on and off water faucets. For they too are being legislatively upgraded to levers more conducive to the arthritic, gnarled or weakened hands we earn with age.

I actually don’t think that’s a bad idea.

Pennsylvania Newspaper Apologizes 150 Years After Gettysburg Address

In what may be the longest-coming newspaper retraction in American history, The Patriot-News, based in Harrisburg, Pennsylvania, has apologized for lambasting President Abraham Lincoln’s famous Gettysburg Address, delivered 150 years ago (on November 19, 1863):

Seven score and ten years ago, the forefathers of this media institution brought forth to its audience a judgment so flawed, so tainted by hubris, so lacking in the perspective history would bring, that it cannot remain unaddressed in our archives.

We write today in reconsideration of “The Gettysburg Address,” delivered by then-President Abraham Lincoln in the midst of the greatest conflict seen on American soil. Our predecessors, perhaps under the influence of partisanship, or of strong drink, as was common in the profession at the time, called President Lincoln’s words “silly remarks,” deserving “a veil of oblivion,” apparently believing it an indifferent and altogether ordinary message, unremarkable in eloquence and uninspiring in its brevity.

In the fullness of time, we have come to a different conclusion. No mere utterance, then or now, could do justice to the soaring heights of language Mr. Lincoln reached that day. By today’s words alone, we cannot exalt, we cannot hallow, we cannot venerate this sacred text, for a grateful nation long ago came to view those words with reverence, without guidance from this chagrined member of the mainstream media.

The world will little note nor long remember our emendation of this institution’s record – but we must do as conscience demands:

In the editorial about President Abraham Lincoln’s speech delivered Nov. 19, 1863, in Gettysburg, the Patriot & Union failed to recognize its momentous importance, timeless eloquence, and lasting significance. The Patriot-News regrets the error. 

That’s awesome.


(via Time)

A Brief History of “More Cowbell”

How did the cowbell go from a herdsman’s tool to a cultural icon? Modern Farmer has a brief post highlighting its entry into popular culture:

How did the humble cowbell end up on the drum kit of every rock, hair, and heavy metal band, its rhythmic beat infusing the Stones’ “Honky Tonk Women” or feigning the tick-tock of a clock in the Chambers Brothers’ “Time Has Come Today”?

Its path from serenity to the cult cry “I gotta have more cowbell!” from the famous Saturday Night Live skit in which Will Ferrell clangs along to Blue Öyster Cult’s “(Don’t Fear) The Reaper” begins around 1904, according to David Ludwig, a composer and dean of creative programs at the Curtis Institute. That was the year two German composers got cowbell fever. Gustav Mahler used them to create a sense of the country for pastoral movements in his Symphony No. 6 and Richard Strauss used them in Alpine Symphony (see the percussionist jiggle them at 16:13).

Both men had spent time near country pastures in their youth, where locals celebrated the changing seasons with spring and fall cow parades called “Alpabzug” when herdsmen led the flocks through town to and from the mountain fields, their cowbells clanging in unison.


Paul Saffo in 1993: The Written Word Remains

In the May/June 1993 issue of Wired, Paul Saffo reflected on digital media and the proliferation of video and virtual reality. But, as true today as it was twenty years ago, he explained that our core medium remains text:

In fact, the written word doesn’t just remain; it is flourishing like kudzu vines at the boundaries of the digital revolution. The explosion of e-mail traffic on the Internet represents the largest boom in letter writing since the 18th century. Today’s cutting-edge infonauts are flooding cyberspace with gigabyte upon gigabyte of ASCII musings.

But we hardly notice this textual explosion because, mercifully, it is in large part paperless. Vague clouds of electrons flitting to and fro over the Net have replaced pulverized trees lugged by postal carriers. This has spared our landfills, but it has also obscured a critical media shift. Words have been decoupled from paper. Like the stuff of Horace’s affection, text is still comprised of 26 letters, but freed from the entombing, distancing oppression of paper, it has become as novel as the hottest new media.

In fact, our electronic novelties are transforming the word as profoundly as the printing press did half a millennium ago. For starters, we are smashing arbitrary print-centric boundaries among author, editor, and audience. These categories did not exist before the invention of moveable type, and they will not survive this decade. Just as monk scriveners at once wrote, edited, and read, information surfers browsing online services today routinely play all three roles: selectively scanning, absorbing, editing, and creating on-the-fly in real time. The printing press gave life and reach to the word, but at the terrible cost of making text formal and immutable. Printed words became as immobile as flies in amber, and readers knew that they could look, but not change.

Electronic text has become a new medium that combines print’s fixity with a manuscript-like mutability. Flick a key and volumes of text disappear in virtual smoke; flick another and they are replicated over the Net in a flash. Severed from unreliable paper, text has become all but inextinguishable. E-mail passed between Oliver North and his Iran-Contra conspirators survived numerous attempts at expungement, and now resides in the National Security Archives for all to inspect, even as historians naively lament that the switch to electronic media is depriving them of important research fodder. They needn’t worry; paper may be on the skids, but text is eternal.

Immortality may be the least of the surprises that this new medium of electronic text will deliver. Video enthusiasts are quick to argue that images are intrinsically more compelling than words, but they ignore a quality unique to text. While video is received by the eyes, text resonates in the mind. Text invites our minds to complete the word-based images it serves up, while video excludes such mental extensions. Until physical brain-to-machine links become a reality, text will offer the most direct of paths between the mind and the external world.

Video suffers from a deeper problem, one of ever diminishing reliability in the face of ever more capable morphing technologies. By decade’s end, we will look back at 1992 and wonder how a video of police beating a citizen could move Los Angeles to riot. The age of camcorder innocence will evaporate as teenage morphers routinely manipulate the most prosaic of images into vivid, convincing fictions. We will no longer trust our eyes when observing video-mediated reality. Text will emerge as a primary indicator of trustworthiness, and images will transit the Net as multimedia surrounded by a bodyguard of words, just as medieval scholars routinely added textual glosses in the margins of their tomes.

Of course words can be as false as images, but there is something to text that keeps our credulity at bay. Perhaps the intellectual labor required to decode words keeps us mentally alert, while visual stimuli encourage passivity. Studies conducted during the Gulf War hinted at such a possibility: Researchers found that citizens who read about the war’s events in daily publications had a far better grasp of the issues than avid real-time TV news junkies.

Talk about a way-back time machine… And how prescient, no?

Modeling 3,000 Years of Human History

It’s rare to find an interesting paper on history in the Proceedings of the National Academy of Sciences, so it was a pleasant surprise to stumble upon “War, Space, and the Evolution of Old World Complex Societies” by Peter Turchin et al., who developed a model that uses cultural evolution mechanisms to predict where and when the largest-scale complex societies should have arisen in human history.

From their abstract:

How did human societies evolve from small groups, integrated by face-to-face cooperation, to huge anonymous societies of today, typically organized as states? Why is there so much variation in the ability of different human populations to construct viable states? Existing theories are usually formulated as verbal models and, as a result, do not yield sharply defined, quantitative predictions that could be unambiguously tested with data. Here we develop a cultural evolutionary model that predicts where and when the largest-scale complex societies arose in human history. The central premise of the model, which we test, is that costly institutions that enabled large human groups to function without splitting up evolved as a result of intense competition between societies—primarily warfare. Warfare intensity, in turn, depended on the spread of historically attested military technologies (e.g., chariots and cavalry) and on geographic factors (e.g., rugged landscape). The model was simulated within a realistic landscape of the Afroeurasian landmass and its predictions were tested against a large dataset documenting the spatiotemporal distribution of historical large-scale societies in Afroeurasia between 1,500 BCE and 1,500 CE. The model-predicted pattern of spread of large-scale societies was very similar to the observed one. Overall, the model explained 65% of variance in the data. An alternative model, omitting the effect of diffusing military technologies, explained only 16% of variance. Our results support theories that emphasize the role of institutions in state-building and suggest a possible explanation why a long history of statehood is positively correlated with political stability, institutional quality, and income per capita.

The model simulation runs from 1500 B.C.E. to 1500 C.E.—so it encompasses the growth of societies like Mesopotamia, ancient Egypt and the like—and replicates historical trends with 65 percent accuracy.

Smithsonian Magazine summarizes:

Turchin began thinking about applying math to history in general about 15 years ago. “I always enjoyed history, but I realized then that it was the last major discipline which was not mathematized,” he explains. “But mathematical approaches—modeling, statistics, etc.—are an inherent part of any real science.”

In bringing these sorts of tools into the arena of world history and developing a mathematical model, his team was inspired by a theory called cultural multilevel selection, which predicts that competition between different groups is the main driver of the evolution of large-scale, complex societies. To build that into the model, they divided all of Africa and Eurasia into gridded squares which were each categorized by a few environmental variables (the type of habitat, elevation, and whether it had agriculture in 1500 B.C.E.). They then “seeded” military technology in squares adjacent to the grasslands of central Asia, because the domestication of horses—the dominant military technology of the age—likely arose there initially.

Over time, the model allowed for domesticated horses to spread between adjacent squares. It also simulated conflict between various entities, allowing squares to take over nearby squares, determining victory based on the area each entity controlled, and thus growing the sizes of empires. After plugging in these variables, they let the model simulate 3,000 years of human history, then compared its results to actual data, gleaned from a variety of historical atlases.
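The mechanics described above — technology diffusing between adjacent grid squares, and conflicts whose outcomes favor larger polities — can be sketched as a toy simulation. Everything here (the grid size, the diffusion probability, seeding tech along one edge, the area-proportional victory rule) is a simplified assumption for illustration, not the paper’s actual implementation:

```python
import random

random.seed(42)

N = 10  # toy grid side; the real model uses a realistic Afro-Eurasian map

# Each cell starts as its own one-cell polity; military technology is
# seeded along one edge, echoing the paper's seeding near the steppe.
polity = [[r * N + c for c in range(N)] for r in range(N)]
tech = [[r == 0 for _ in range(N)] for r in range(N)]

def neighbors(r, c):
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= r + dr < N and 0 <= c + dc < N:
            yield r + dr, c + dc

def polity_size(pid):
    return sum(row.count(pid) for row in polity)

def step(diffusion_p=0.2):
    # 1) Military technology diffuses to adjacent squares.
    gained = []
    for r in range(N):
        for c in range(N):
            if tech[r][c]:
                for nr, nc in neighbors(r, c):
                    if not tech[nr][nc] and random.random() < diffusion_p:
                        gained.append((nr, nc))
    for r, c in gained:
        tech[r][c] = True
    # 2) Conflict: a random tech-holding cell attacks a neighbor; the
    #    attacker wins with probability proportional to its polity's area.
    r, c = random.randrange(N), random.randrange(N)
    if tech[r][c]:
        nr, nc = random.choice(list(neighbors(r, c)))
        a, d = polity[r][c], polity[nr][nc]
        total = polity_size(a) + polity_size(d)
        if a != d and random.random() < polity_size(a) / total:
            polity[nr][nc] = a  # conquest: the square changes hands

for _ in range(500):
    step()

largest = max(polity_size(p) for row in polity for p in row)
print("largest polity controls", largest, "cells")
```

Even this crude version reproduces the qualitative story: polities near the seeded technology snowball, because each conquest increases the area term in their next victory probability.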

Click here to see a movie of the model in action.

Of particular interest to me was the discussion of the model’s limitations (the 100-year sampling interval and the exclusion of the Greek city-states):

Due to the nature of the question addressed in our study, there are inevitably several sources of error in historical and geographical data we have used. Our decision to collect historical data only at 100-year time-slices means that the model ‘misses’ peaks of some substantial polities such as the Empire of Alexander the Great, or Attila’s Hunnic Empire. This could be seen as a limitation for traditional historical analyses because we have not included a few polities known to be historically influential. However, for the purposes of our analyses this is actually a strength. Using a regular sampling strategy allows us to collect data in a systematic way independent of the hypothesis being tested rather than cherry-picking examples that support our ideas.

We have also only focused on the largest polities, i.e., those that were approximately greater than 100,000 km2. This means that some complex societies, such as the Ancient Greek city states, are not included in our database. The focus on territorial extent is also a result of our attempt to be systematic and minimize bias, and this large threshold was chosen for practical considerations. Historical information about the world varies partly in the degree to which modern societies can invest in uncovering it. Our information about the history of western civilization, thus, is disproportionately good compared to some other parts of the world. Employing a relatively large cut-off minimizes the risk of “missing” polities with large populations in less well-documented regions and time-frames, because the larger the polity the more likely it is to have left some trace in the historical record. At a smaller threshold there are simply too many polities about which we have very little information, including their territories, and the effects of a bias in our access to the historical record are increased.

Overall, I think the supporting information for the model is actually a much more interesting read than the paper itself.

David Block, the Baseball Archaeologist

This is a fascinating story in Grantland about David Block and his quest to find the origins of baseball:

Block was coming to the subject of baseball’s paternity not as a historian but as a book collector. “Historians are driven by story and issue,” said Thorn. “David was driven by artifact.” As he scoured eBay in the late ’90s — back before anyone knew what their junk was worth — it was Block’s brainstorm to bypass books about baseball. He was looking for books that mentioned baseball, books historians might have missed. “I always liked to go where no one else was looking,” Block said. His collection grew big enough that he decided to write a bibliography of early texts. The bibliography became a proper book.

In 2001, Block got ahold of a copy of a 1796 German book with the ungainly title of Spiele zur Uebung und Erholung des Körpers und Geistes für die Jugend, ihre Erzieher und alle Freunde Unschuldiger Jugendfreuden. His copy has green and white marbled boards and brown binder’s tape on the spine. An inside page carries the stamp “D. Schaller,” a previous owner. Block ran his finger down the table of contents when he saw a reference:

                    3. Ball mit Freystäten, das engl. Base-ball

A translation confirmed what Block suspected. Here was a reference to baseball 32 years before the first literary reference to rounders. And the German book, by J.C.F. Gutsmuths, wasn’t the only example. The 1744 A Little Pretty Pocket-Book mentioned baseball. So did a letter of one Lady Hervey of England, from 1748. Even Jane Austen included the word “baseball” in her novel Northanger Abbey, which was published in 1818. If baseball had descended from rounders, Block wondered, then why did baseball keep popping up in the historical record before rounders?

Block began to get a little nervous. The historian Thomas Altherr, who talked to Block during this period, said Block was worried he was imposing on the work of others. For Block had confirmed that the Doubleday theory was bunk. But he had also discovered that the rounders theory was bunk. Everything we knew about baseball’s parentage was wrong.

A reference to baseball, according to Block, can be traced as early as 1755:

In 2007, Block was on a computer terminal in the British Library in London. He came across a comic novel called The Card, by John Kidgell, which was published in 1755. He found this passage:

… the younger Part of the Family, perceiving Papa not inclined to enlarge upon the Matter, retired to an interrupted Party at Base-Ball, (an infant Game which as it advances in its Teens, improves into Fives, and in its State of Manhood, is called Tennis.)

On English baseball: 

Block offered an alternative proposal for baseball’s paternity. It was both simpler and more complex than any previous theory. First, Block said that baseball had descended from … baseball. What the authors of the BA’SEBALL dictionary entry and John Kidgell and William Bray and Jane Austen were describing was a primitive version of the game played in English fields. Block calls this English baseball.

And how was this English baseball played? Block offers that there were no bats (players used their hands), and that the game was social rather than competitive/athletic:

There were bases of some unknown counting. The pitcher threw to the batter underhanded. The fielders tried to catch the ball on the fly or retrieve the ball and throw it and strike the runner when he was off base.

Fascinating throughout.


Note: If this topic piques your interest, Block wrote a book called Baseball Before We Knew It that has stellar reviews on Amazon.

When They Can’t Lay You Off, Employers in Japan Send You to Boredom Rooms

What happens if you’re working in Japan and a company wants to lay you off, and offers you a lucrative early retirement or severance deal? Well, if you choose not to accept the terms, the company has no right to fire you. So what they’ll do instead is send you to work in a so-called “Boredom Room.”

In Japan, lifetime employment has long been the norm and large-scale layoffs remain a social taboo, at least at Japan’s largest corporations like Sony. The New York Times profiles one man who’s chosen to go into the Boredom Room and spend his workday there: reading college textbooks, surfing the Internet, and who knows what else.

Sony said it was not doing anything wrong in placing employees in what it calls Career Design Rooms. Employees are given counseling to find new jobs in the Sony group, or at another company, it said. Sony also said that it offered workers early retirement packages that are generous by American standards: in 2010, it promised severance payments equivalent to as much as 54 months of pay. But the real point of the rooms is to make employees feel forgotten and worthless — and eventually so bored and shamed that they just quit, critics say.

Labor practices in Japan contrast sharply with those in the United States, where companies are quick to lay off workers when demand slows or a product becomes obsolete. It is cruel to the worker, but it usually gives the overall economy agility. 

However, and this is a point worth emphasizing: critics say the real point of the boredom rooms is to make employees feel forgotten and worthless — and eventually get so bored and shamed that they just quit.

Read the entire story here.