There is No Such Thing as Average Body Size

Below is a very powerful excerpt on why an “average body type” or “average body size” does not exist, taken from Todd Rose’s The End of Average: How We Succeed in a World That Values Sameness:

The End of Average

In the late 1940s, the United States air force had a serious problem: its pilots could not keep control of their planes. Although this was the dawn of jet-powered aviation and the planes were faster and more complicated to fly, the problems were so frequent and involved so many different aircraft that the air force had an alarming, life-or-death mystery on its hands. “It was a difficult time to be flying,” one retired airman told me. “You never knew if you were going to end up in the dirt.” At its worst point, 17 pilots crashed in a single day.

The two government designations for these noncombat mishaps were incidents and accidents, and they ranged from unintended dives and bungled landings to aircraft-obliterating fatalities. At first, the military brass pinned the blame on the men in the cockpits, citing “pilot error” as the most common reason in crash reports. This judgment certainly seemed reasonable, since the planes themselves seldom malfunctioned. Engineers confirmed this time and again, testing the mechanics and electronics of the planes and finding no defects. Pilots, too, were baffled. The only thing they knew for sure was that their piloting skills were not the cause of the problem. If it wasn’t human or mechanical error, what was it?

After multiple inquiries ended with no answers, officials turned their attention to the design of the cockpit itself. Back in 1926, when the army was designing its first-ever cockpit, engineers had measured the physical dimensions of hundreds of male pilots (the possibility of female pilots was never a serious consideration), and used this data to standardize the dimensions of the cockpit. For the next three decades, the size and shape of the seat, the distance to the pedals and stick, the height of the windshield, even the shape of the flight helmets were all built to conform to the average dimensions of a 1926 pilot.

Now military engineers began to wonder if the pilots had gotten bigger since 1926. To obtain an updated assessment of pilot dimensions, the air force authorized the largest study of pilots that had ever been undertaken. In 1950, researchers at Wright Air Force Base in Ohio measured more than 4,000 pilots on 140 dimensions of size, including thumb length, crotch height, and the distance from a pilot’s eye to his ear, and then calculated the average for each of these dimensions. Everyone believed this improved calculation of the average pilot would lead to a better-fitting cockpit and reduce the number of crashes — or almost everyone. One newly hired 23-year-old scientist had doubts.

Lt. Gilbert S. Daniels was not the kind of person you would normally associate with the testosterone-drenched culture of aerial combat. He was slender and wore glasses. He liked flowers and landscaping and in high school was president of the Botanical Garden Club. When he joined the Aero Medical Laboratory at Wright Air Force Base straight out of college, he had never even been in a plane before. But it didn’t matter. As a junior researcher, his job was to measure pilots’ limbs with a tape measure.

It was not the first time Daniels had measured the human body. The Aero Medical Laboratory hired Daniels because he had majored in physical anthropology, a field that specialized in the anatomy of humans, as an undergraduate at Harvard. During the first half of the 20th century, this field focused heavily on trying to classify the personalities of groups of people according to their average body shapes — a practice known as “typing.” For example, many physical anthropologists believed a short and heavy body was indicative of a merry and fun-loving personality, while receding hairlines and fleshy lips reflected a “criminal type.”

Daniels was not interested in typing, however. Instead, his undergraduate thesis consisted of a rather plodding comparison of the shape of 250 male Harvard students’ hands. The students Daniels examined were from very similar ethnic and socio-cultural backgrounds (namely, white and wealthy), but, unexpectedly, their hands were not similar at all. Even more surprising, when Daniels averaged all his data, the average hand did not resemble any individual’s measurements. There was no such thing as an average hand size. “When I left Harvard, it was clear to me that if you wanted to design something for an individual human being, the average was completely useless,” Daniels told me.

So when the air force put him to work measuring pilots, Daniels harboured a private conviction about averages that rejected almost a century of military design philosophy. As he sat in the Aero Medical Laboratory measuring hands, legs, waists and foreheads, he kept asking himself the same question: How many pilots really were average?

He decided to find out. Using the size data he had gathered from 4,063 pilots, Daniels calculated the average of the 10 physical dimensions believed to be most relevant for design, including height, chest circumference and sleeve length. These formed the dimensions of the “average pilot,” which Daniels generously defined as someone whose measurements were within the middle 30 per cent of the range of values for each dimension. So, for example, even though the precise average height from the data was five foot nine, he defined the height of the “average pilot” as ranging from five-seven to five-11. Next, Daniels compared each individual pilot, one by one, to the average pilot.

Before he crunched his numbers, the consensus among his fellow air force researchers was that the vast majority of pilots would be within the average range on most dimensions. After all, these pilots had already been pre-selected because they appeared to be average sized. (If you were, say, six foot seven, you would never have been recruited in the first place.) The scientists also expected that a sizable number of pilots would be within the average range on all 10 dimensions. But even Daniels was stunned when he tabulated the actual number.

Zero.

Out of 4,063 pilots, not a single airman fit within the average range on all 10 dimensions. One pilot might have a longer-than-average arm length, but a shorter-than-average leg length. Another pilot might have a big chest but small hips. Even more astonishing, Daniels discovered that if you picked out just three of the ten dimensions of size — say, neck circumference, thigh circumference and wrist circumference — less than 3.5 per cent of pilots would be average sized on all three dimensions. Daniels’s findings were clear and incontrovertible. There was no such thing as an average pilot. If you’ve designed a cockpit to fit the average pilot, you’ve actually designed it to fit no one.
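A back-of-the-envelope check (mine, not Daniels’s actual computation) shows why the zero is less surprising than it first appears: even if the ten dimensions were statistically independent, the chance of landing in the middle 30 per cent on every one of them is vanishingly small. Real body measurements are positively correlated, which raises the odds somewhat, but the conclusion survives:

```python
# Assume, generously, that the ten body dimensions are independent and
# that "average" means falling in the middle 30 per cent of the range.
P_ONE_DIM = 0.30
N_PILOTS = 4063

p_all_ten = P_ONE_DIM ** 10   # chance of being average on all 10 dimensions
expected = N_PILOTS * p_all_ten
p_three = P_ONE_DIM ** 3      # chance of being average on any 3 dimensions

print(f"P(average on all 10): {p_all_ten:.2e}")        # ≈ 5.90e-06
print(f"expected pilots out of 4,063: {expected:.3f}")  # ≈ 0.024
print(f"P(average on 3 dims): {p_three:.1%}")           # 2.7%
```

Under independence you would expect about 0.02 such pilots in the whole sample, and the three-dimension figure (2.7 per cent) sits comfortably under the "less than 3.5 per cent" Daniels actually observed.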

Daniels’ revelation was the kind of big idea that could have ended one era of basic assumptions about individuality and launched a new one. But even the biggest of ideas requires the correct interpretation. We like to believe that facts speak for themselves, but they most assuredly do not. After all, Gilbert Daniels was not the first person to discover there was no such thing as an average person.

Seven years earlier, the Cleveland Plain Dealer announced on its front page a contest co-sponsored with the Cleveland Health Museum and in association with the Academy of Medicine of Cleveland, the School of Medicine and the Cleveland Board of Education. Winners of the contest would get $100, $50, and $25 war bonds, and 10 additional lucky women would get $10 worth of war stamps. The contest? To submit body dimensions that most closely matched the typical woman, “Norma,” as represented by a statue on display at the Cleveland Health Museum.

Norma was the creation of a well-known gynecologist, Dr. Robert L. Dickinson, and his collaborator Abram Belskie, who sculpted the figure based on size data collected from 15,000 young adult women. Dr. Dickinson was an influential figure in his day: chief of obstetrics and gynecology at the Brooklyn Hospital, president of the American Gynecological Society and chairman of obstetrics at the American Medical Association. He was also an artist — the “Rodin of obstetrics,” as one colleague put it — and throughout his career he used his talents to draw sketches of women, their various sizes and shapes, to study correlations of body types and behaviour.

Like many scientists of his day, Dickinson believed the truth of something could be determined by collecting and averaging a massive amount of data. “Norma” represented such a truth. For Dickinson, the thousands of data points he had averaged revealed insight into a typical woman’s physique — someone normal.

In addition to displaying the sculpture, the Cleveland Health Museum began selling miniature reproductions of Norma, promoting her as the “Ideal Girl,” launching a Norma craze. A notable physical anthropologist argued that Norma’s physique was “a kind of perfection of bodily form,” artists proclaimed her beauty an “excellent standard” and physical education instructors used her as a model for how young women should look, suggesting exercise based on a student’s deviation from the ideal. A preacher even gave a sermon on her presumably normal religious beliefs. By the time the craze had peaked, Norma was featured in Time magazine, in newspaper cartoons, and on an episode of a CBS documentary series, This American Look, where her dimensions were read aloud so the audience could find out if they, too, had a normal body.

On Nov. 23, 1945, the Plain Dealer announced its winner, a slim brunette theatre cashier named Martha Skidmore. The newspaper reported that Skidmore liked to dance, swim, and bowl — in other words, that her tastes were as pleasingly normal as her figure, which was held up as the paragon of the female form.

Before the competition, the judges assumed most entrants’ measurements would be pretty close to the average, and that the contest would come down to a question of millimetres. The reality turned out to be nothing of the sort. Fewer than 40 of the 3,864 contestants were average size on just five of the nine dimensions, and none of the contestants — not even Martha Skidmore — came close on all nine. Just as Daniels’ study revealed there was no such thing as an average-size pilot, the Norma Look-Alike contest demonstrated that average-size women did not exist either.

But while Daniels and the contest organizers ran up against the same revelation, they came to a markedly different conclusion about its meaning. Most doctors and scientists of the era did not interpret the contest results as evidence that Norma was a misguided ideal. Just the opposite: many concluded that American women, on the whole, were unhealthy and out of shape. One of those critics was the physician Bruno Gebhard, head of the Cleveland Health Museum, who lamented that postwar women were largely unfit to serve in the military, chiding them by insisting “the unfit are both bad producers and bad consumers.” His solution was a greater emphasis on physical fitness.

Daniels’ interpretation was the exact opposite. “The tendency to think in terms of the ‘average man’ is a pitfall into which many persons blunder,” Daniels wrote in 1952. “It is virtually impossible to find an average airman not because of any unique traits in this group but because of the great variability of bodily dimensions which is characteristic of all men.”

Rather than suggesting that people should strive harder to conform to an artificial ideal of normality, Daniels’ analysis led him to a counterintuitive conclusion that serves as the cornerstone of this book: any system designed around the average person is doomed to fail.

Daniels published his findings in a 1952 Air Force Technical Note entitled The “Average Man”? In it, he contended that if the military wanted to improve the performance of its soldiers, including its pilots, it needed to change the design of any environments in which those soldiers were expected to perform. The recommended change was radical: the environments needed to fit the individual rather than the average.

Amazingly — and to their credit — the air force embraced Daniels’ arguments. “The old air force designs were all based on finding pilots who were similar to the average pilot,” Daniels explained to me. “But once we showed them the average pilot was a useless concept, they were able to focus on fitting the cockpit to the individual pilot. That’s when things started getting better.”

By discarding the average as their reference standard, the air force initiated a quantum leap in its design philosophy, centred on a new guiding principle: individual fit. Rather than fitting the individual to the system, the military began fitting the system to the individual. In short order, the air force demanded that all cockpits needed to fit pilots whose measurements fell within the 5-per-cent to 95-per-cent range on each dimension.
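Percentile bands of this kind are straightforward to derive from measurement data. A minimal sketch, using synthetic heights (the sample size, 175 cm mean, and 7 cm spread are my stand-ins, not the 1950 survey data):

```python
import random
import statistics

random.seed(1)
# Hypothetical anthropometric sample in centimetres.
heights = [random.gauss(175, 7) for _ in range(4000)]

# statistics.quantiles with n=20 returns 19 cut points: the 5th, 10th,
# ..., 95th percentiles. The outermost pair bounds the design band that
# an adjustable seat or pedal must cover.
cuts = statistics.quantiles(heights, n=20)
lo, hi = cuts[0], cuts[-1]
print(f"design band: {lo:.1f} cm to {hi:.1f} cm")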

When airplane manufacturers first heard this new mandate, they balked, insisting it would be too expensive and take years to solve the relevant engineering problems. But the military refused to budge, and then — to everyone’s surprise — aeronautical engineers rather quickly came up with solutions that were both cheap and easy to implement. They designed adjustable seats, technology now standard in all automobiles. They created adjustable foot pedals. They developed adjustable helmet straps and flight suits.

Once these and other design solutions were put into place, pilot performance soared, and the U.S. air force became the most dominant air force on the planet. Soon, every branch of the American military published guides decreeing that equipment should fit a wide range of body sizes instead of being standardized around the average.

Why was the military willing to make such a radical change so quickly? Because changing the system was not an intellectual exercise — it was a practical solution to an urgent problem. When pilots flying faster than the speed of sound were required to perform tough manoeuvres using a complex array of controls, they couldn’t afford to have a gauge just out of view or a switch barely out of reach. In a setting where split-second decisions meant the difference between life and death, pilots were forced to perform in an environment that was already stacked against them.

Bibliotherapy: Can Reading Make You Happier?

This is a fascinating piece in The New Yorker about bibliotherapy: reading books to deal with life’s ailments. Per the piece, the most common ailments people tend to bring to a bibliotherapist (who recommends books on various topics outside the self-help genre) are life-juncture transitions: being stuck in a rut in your career, feeling depressed in your relationship, or suffering bereavement.

Berthoud and Elderkin trace the method of bibliotherapy all the way back to the Ancient Greeks, “who inscribed above the entrance to a library in Thebes that this was a ‘healing place for the soul.’ ” The practice came into its own at the end of the nineteenth century, when Sigmund Freud began using literature during psychoanalysis sessions. After the First World War, traumatized soldiers returning home from the front were often prescribed a course of reading. “Librarians in the States were given training on how to give books to WWI vets, and there’s a nice story about Jane Austen’s novels being used for bibliotherapeutic purposes at the same time in the U.K.,” Elderkin says. Later in the century, bibliotherapy was used in varying ways in hospitals and libraries, and has more recently been taken up by psychologists, social and aged-care workers, and doctors as a viable mode of therapy.

Bibliotherapy, if it existed at all, tended to be based within a more medical context, with an emphasis on self-help books. But we were dedicated to fiction as the ultimate cure because it gives readers a transformational experience.

If you’re interested in learning more, perhaps check out The Novel Cure: 751 Books to Cure What Ails You by Ella Berthoud and Susan Elderkin, the therapists mentioned in the piece.

Book Review: Ian Leslie’s Curious—The Desire To Know and Why Your Future Depends on It

Everyone is born curious. But only a proportion of the human population retains the habits of exploring, learning, and discovering as they grow older. So why are so many of us allowing our curiosity to wane, when there is evidence that those who are curious tend to be more creative, more intelligent, and more successful?

In Curious: The Desire to Know and Why Your Future Depends on It, Ian Leslie makes a compelling case for the cultivation of our “desire to know.” I’ve had the chance to read the book in advance of its publication date (full disclosure: I received a complimentary advance copy of the book from Basic Books, the publisher of Curious), and this review provides my impressions of the book and highlights some notable passages.

The book is divided into three parts: How Curiosity Works, The Curiosity Divide, and Staying Curious. In the introduction to the book, the case is made for why being curious is vital:

The truly curious will be increasingly in demand. Employers are looking for people who can do more than follow procedures competently or respond to requests, who have a strong, intrinsic desire to learn, solve problems, and ask penetrating questions. They may be difficult to manage at times, these individuals, for their interests and enthusiasms can take them along unpredictable paths, and they don’t respond well to being told what to think. But for the most part, they will be worth the difficulty.

Another assessment of what this book is about is presented in the introduction:

If you allow yourself to become incurious, your life will be drained of color, interest, and pleasure. You will be less likely to achieve your potential at work or in creative life. While barely noticing it, you’ll become a little duller, a little dimmer. You may not think it could happen to you, but it can. It can happen to any of us. To stop it from happening, you need to understand what feeds curiosity and what starves it. That’s what this book is about.

Something worth pondering over:

Curiosity is contagious. So is incuriosity.

Something that caught my attention in Part I of the book was the evolutionary advantage of becoming or staying curious. Here, Leslie cites the research of Stephen Kaplan, an evolutionary psychologist at the University of Michigan:

The more information about her environment a human acquired, the more likely she would be to survive and pass on her genes. Gathering that knowledge meant venturing out into the unknown, to spot new sources of water or edible plants. But doing so meant risking one’s survival; you might become vulnerable to predators or get lost. The individuals more likely to survive would have been those adept at striking a balance between knowledge gathering and self-preservation.

Perhaps as an incentive to take a few risks in the pursuit of new information, evolution tied the act of curiosity to pleasure. Leslie writes how the caudate nucleus, located deep within the human brain, is packed with neurons that traffic in dopamine. As the brain evolved, it seems to have bootstrapped the urge for intellectual investigation onto the same pathway as our primal pleasures (for sex or food). This research was done at the California Institute of Technology by asking undergraduates questions whilst they were in a brain scanner. (I need to read this study in depth: Caltech undergrads are naturally some of the most curious individuals on the planet, so we have a potential selection bias at work here.)

In a chapter titled “How Curiosity Begins,” Leslie points out how babies respond to curiosity:

Babbling, like pointing, is a sign of readiness to learn, and babies are also more likely to use it as such if adults, rather than ignoring them, try to answer whatever they think the baby’s unintelligible question might be. If a baby looks at an apple and says “Da da da!” and the adult says nothing, the baby not only fails to learn the name of that round greenish object, but also starts to think this whole babbling business might be a waste of time.

One interesting bit about curiosity: we aren’t allocated a fixed amount of it at birth. Instead, we inherit a mercurial quality that rises and falls throughout the day and throughout our lives. Leslie points out that an important input into our curiosity is the behavior of the people around us: if our curiosity is ignited, it grows; if it is squashed, it may wane over the long term.

In a chapter titled “Puzzles and Mysteries,” Leslie describes how curiosity may naturally wane as we grow older:

Computer scientists talk about the differences between exploring and exploiting—a system will learn more if it explores many possibilities, but it will be more effective if it simply acts on the most likely one. As babies grow into children and then into adults, they begin to do more exploiting of whatever knowledge they have acquired. As adults, however, we have a tendency to err too far toward exploitation—we become content to fall back on the stock of knowledge and mental habits we built up when we were young, rather than adding to or revising it. We get lazy.
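The explore/exploit trade-off Leslie invokes can be made concrete with a toy two-armed bandit and an epsilon-greedy strategy (my illustration, not from the book; the payout rates and the 10 per cent exploration budget are arbitrary). An agent that only exploits can lock onto an inferior option forever:

```python
import random

random.seed(42)
# Toy two-armed bandit: the agent doesn't know these payout rates.
TRUE_PAYOUTS = [0.3, 0.7]

def epsilon_greedy(epsilon, n_steps=10_000):
    """Return average reward; epsilon is the fraction of steps spent exploring."""
    counts = [0, 0]      # pulls per arm
    values = [0.0, 0.0]  # running mean reward per arm
    total = 0.0
    for _ in range(n_steps):
        if random.random() < epsilon:
            arm = random.randrange(2)                  # explore: random arm
        else:
            arm = 0 if values[0] >= values[1] else 1   # exploit: best-looking arm
        reward = 1.0 if random.random() < TRUE_PAYOUTS[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]
        total += reward
    return total / n_steps

avg_exploit = epsilon_greedy(0.0)  # never explores: stays on arm 0 (~0.30)
avg_mixed = epsilon_greedy(0.1)    # explores 10% of the time (~0.68)
print(avg_exploit, avg_mixed)
```

With no exploration the agent keeps pulling the first arm it tries and earns roughly the inferior 0.3 rate; a small exploration budget is enough to discover the better arm — the same laziness trap Leslie describes in adults.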

The so-called curiosity zone is a function of surprise, knowledge, and confidence. Curiosity is highest when the violation of an expectation is more than tiny but less than enormous. When violations are minor, we are quick to ignore them. When they’re massive, we often refuse to acknowledge them because we may be scared of what they imply. The less knowledge you have about something, the less likely you are to pursue getting to know it better. Conversely, if you are an expert in a particular subject area, your capacity to stay very curious about it may have peaked. The curiosity zone is a concave function, with maximum curiosity in the middle. Finally, it is important to have an environment that is conducive to curious thinking. Curiosity requires an edge of uncertainty to thrive; with too much uncertainty, it freezes.
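That inverted-U shape can be caricatured in a few lines. The quadratic below is my own toy model, chosen only to make the shape concrete, not a formula from the book:

```python
# Curiosity peaks at intermediate familiarity and vanishes when a subject
# is either totally unknown or fully mastered.
def curiosity(familiarity: float) -> float:
    """familiarity in [0, 1] -> unitless curiosity score in [0, 1]."""
    return 4.0 * familiarity * (1.0 - familiarity)

scores = {f: curiosity(f) for f in (0.0, 0.25, 0.5, 0.75, 1.0)}
print(scores)   # {0.0: 0.0, 0.25: 0.75, 0.5: 1.0, 0.75: 0.75, 1.0: 0.0}
```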

A good anecdote is presented in the “Puzzles and Mysteries” chapter on why The Wire was such a great TV show:

One way of describing the achievement of the TV series The Wire was that it took a genre, the police procedural, which is conventionally based on puzzles, in the form of crimes that are solved each week, and turned it into a mystery—the mystery of Baltimore’s crime problem.

So while routine police work may be classified as solving puzzles (with a definitive answer), The Wire showcased it as something more akin to a mystery: multilayered, shifting, nuanced (in Leslie’s words). The Wire, to this day, is in my top 3 all-time favourite TV shows, so I was glad to see its incorporation in the book.

What’s the one company that is doing everything it can to deprive you of the itch of curiosity? Answer: Google. According to Google’s founders, Larry Page and Sergey Brin, the company is working toward the ambition of incorporating search into people’s brains. All information gaps will be closed. I don’t take as black-and-white a stand that the proliferation of Google will make more people incurious, but I do understand Leslie’s perspective. In general, if you were to ask someone “Is the Internet making us stupid or more intelligent?”, Leslie’s response would be a simple “Yes.” He writes:

The Internet presents us with more opportunities to learn than ever before and also allows us not to bother. It is a boon to those with a desire to deepen their understanding of the world, and also to those who are only too glad not to have to make the effort…If you’re incurious—or, like most of us, a little lazy—then you will use the Internet to look at pictures of cats and get into arguments with strangers.

Ian Leslie does a good job of assimilating related research into Curious. For instance, what matters in students are character traits such as attitude toward learning and conscientiousness, as well as persistence, self-discipline, and what the psychologist Angela Duckworth termed “grit”—the ability to deal with failure, overcome setbacks, and focus on long-term goals. In a chapter titled “The Power of Questions,” Leslie quotes the former CEO of Dow Chemical, Mike Parker:

A lot of bad leadership comes from an inability or unwillingness to ask questions. I have watched talented people—people with much higher IQs than mine—who have failed as leaders. They can talk brilliantly, with a great breadth of knowledge, but they’re not very good at asking questions. So while they know a lot at a high level, they don’t know what’s going on way down in the system. Sometimes they are afraid of asking questions, but what they don’t realize is that the dumbest questions can be very powerful. They can unlock a conversation.

In what I think is the most important chapter of the book, “The Importance of Knowing,” Leslie highlights the importance of epistemic knowledge and provides evidence to debunk parts of the “twenty-first-century” mindset. Leslie presents three misapprehensions about learning, common among supporters of “curiosity-driven” education:

  • Children don’t need teachers to instruct them. Those who think the natural curiosity of children is stifled by pedagogical instruction overlook something fundamental about human nature—as a species, we have always depended on the epistemic endowment of our elders and ancestors. As Leslie writes, every scientist stands on the shoulders of giants; every artist works within or against a tradition. The unusually long period for which children are dependent on adults is a clue that humans are designed to learn from others, rather than merely through their own explorations. Traditional teaching—the transmission of information from adults to children—is highly effective when skillfully executed. Citing the research of John Hattie, Leslie notes that the three most powerful teacher factors (those that lead to student success) are feedback, quality of instruction, and direct instruction.
  • Facts kill creativity. At the most basic level, all of our new ideas are somehow linked to old ones. The more existing ideas you have in your head, the more varied and rich and blossoming will be your novel combinations of them, and the greater your store of reference points and analogies. Per Leslie: “a fact is a particular class of idea about the world, and it can be put to work in a lot of different ways.” In this section, Leslie refers to Sir Ken Robinson’s famous 2006 talk on educational reform, “Do Schools Kill Creativity?”, and then proceeds to argue that Robinson’s claims about creativity are almost entirely baseless.
  • Schools should teach thinking skills instead of knowledge. Different thinking skills grow organically out of specific knowledge of specific domains—that is, facts. The wider your knowledge, the more widely your intelligence can range and the more purchase it gets on new information. This is why the argument that schools ought to prioritize learning skills over knowledge makes no sense, argues Leslie: the very foundation for such skills is memorized knowledge. The more we know, the better we are at thinking.

On how knowledge gives curiosity the staying power, Leslie writes:

This is why curiosity, like other thinking skills, cannot be nurtured, or taught, in the abstract. Rather than being stifled by factual knowledge, it depends on it. Until a child has been taught the basic information she needs to start thinking more deeply about a particular subject, it’s hard to develop her initial (diversive) curiosity into enduring (epistemic) curiosity, to get her to the stage where she is hungry for more knowledge…Sir Ken Robinson has it precisely the wrong way around when he says that the natural appetite for learning begins to dissipate once children start to be educated. The curiosity of children dissipates when it doesn’t get fed by knowledge, imparted by parents and teachers.

In short, background knowledge is vital kindling for curiosity. From personal experience, I happen to think there is also a positive feedback loop in place: the more you know, the more curious you become; and the more curious you are, the more knowledge you gain over time.

In the last part of the book, Leslie outlines seven ways to stay curious. They are as follows:

  1. Stay foolish. Echoing Steve Jobs’s memorable commencement address, in which Jobs advised Stanford graduates to “Stay hungry, stay foolish,” Ian Leslie points out how Jobs’s curiosity was crucial to his ability to invent and reinvent the businesses in which he was involved (Apple, Pixar).
  2. Build the database. The idea behind this premise is that any project or task that requires deep creative thought will be better addressed by someone who has deep knowledge of the task at hand and general background knowledge of the culture in which it and its users (or readers, or viewers) live. Leslie writes:

    Highly curious people, who have carefully cultivated their long-term memories, live in a kind of augmented reality; everything they see is overlaid with additional layers of meaning and possibility, unavailable to ordinary observers.

  3. Forage like a foxhog. In the words of the Greek poet Archilochus: “The fox knows many things, but the hedgehog knows one big thing.” The fox evades predators via a variety of techniques, while the hedgehog adopts one trusted technique (hunkering down and relying on its spikes to thwart a predator). The thinkers best positioned to thrive today and in the future are likely a hybrid of the fox and the hedgehog: the foxhog. You need to be specialized in one or two subject areas (what are known as SMEs, or subject matter experts) but also be a voracious consumer of knowledge from other fields. In short, combine breadth and depth in your skill set.
  4. Ask the big why. In a useful anecdote from the book Negotiation Genius by Harvard Business School professors Deepak Malhotra and Max H. Bazerman, Leslie points out how asking “why” is a critical component of the negotiation process. If two parties negotiate from their preagreed positions, the negotiation becomes a trade-off where one side necessarily loses with respect to the other, which gains. The key, then, is to really try to understand what’s motivating the other party’s interests, and this involves asking the probing, penetrating questions that can be summarized with the why. There is an interesting diversion in this point on the Big Data movement. One of its proponents, Chris Anderson (formerly editor of Wired), has made the extreme case for asking the Big What instead of the Big Why. The premise is that with enough data, you can glean behavior from the patterns that are observed. But I don’t think it’s that simple. In fact, the more data you collect, the more likely you are to start forming false narratives (Nassim Nicholas Taleb makes a great point of this in his excellent book, Antifragile). When we have a lot of data to work with, we get things like spurious correlations.
  5. Be a thinkerer. A portmanteau of “think” and “tinker,” the verb “to thinker” has an unknown origin. Leslie mentions that he was introduced to the term by Paola Antonelli of the Museum of Modern Art in New York City, who traced it to a 2007 presentation given by John Seely Brown (formerly the director of the Xerox Palo Alto Research Center). The idea is enunciated well by Peter Thiel:

    A fundamental challenge—in business as in life—is to integrate the micro and macro such that all things make sense. Humanities majors may well learn a great deal about the world. But they don’t really learn career skills through their studies. Engineering majors, conversely, learn in great technical detail. But they might not learn why, how, or where they should apply their skills in the workforce. The best students, workers, and thinkers will integrate these questions into a cohesive narrative.

  6. Question your teaspoons. The idea is to become aware of and curious about your daily surroundings. Parking garage roofs, hand dryers, milk, paint catalogs, and bus routes: they sound mundane, but if you dig deeper, you find out how complex and intricate they really are. This is what led James Ward to found the Boring Conference (which is a lot more interesting than it sounds!). Leslie points out a good example: Laura McInerney, who used to work at McDonald’s. Her shift involved making the daily breakfast by breaking four hundred eggs, a mind-numbing ordeal on a day-to-day basis. But then she started asking questions: How do the proteins in an egg change as it is heated? Is it ethically right to take eggs from chickens? And which came first, the chicken or the egg?
  7. Turn puzzles into mysteries. The premise here is simple: a puzzle is something that commands our curiosity only until we have solved it. A mystery, by contrast, is something that never stops inviting further inquiry. The way to stay curious, then, is to remember that behind every puzzle we come across in our daily lives, there may be an underlying mystery worth exploring.

In the afterword of Curious, Leslie highlights one of my all-time favourite commencement speeches, the one David Foster Wallace gave to the graduating class of 2005. In it, Wallace argues that we are inherently self-centered, because the world we experience is in front of and behind us, above and below us, and it is immediate. It is only through exercising our curiosity about others that we can free ourselves from our hard-wired self-obsession. We should be curious about others not just because it is virtuous, but because curiosity is also a coping mechanism for the routine, petty frustrations of day-to-day life.

The really important kind of freedom involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day.

That is real freedom. That is being educated, and understanding how to think. The alternative is unconsciousness, the default setting, the rat race, the constant gnawing sense of having had, and lost, some infinite thing.

Ian Leslie’s Curious: The Desire to Know and Why Your Future Depends on It is a well-researched book that cites a number of relevant scientific studies, frames concepts related to knowledge and curiosity with interesting anecdotes, and offers a solid bibliography for curious readers to dive into after finishing the book.

I highly recommend the book. It is available on Amazon (in hardcover or for the Kindle) and from your favourite bookseller beginning today, August 26, 2014.

The Language of Food: A Linguist Reads The Menu

The Atlantic has an interesting preview of Dan Jurafsky’s The Language of Food: A Linguist Reads the Menu, coming out in September: 

You needn’t be a linguist to note changes in the language of menus, but Stanford’s Dan Jurafsky has written a book doing just that. In The Language of Food: A Linguist Reads the Menu, Jurafsky describes how he and some colleagues analyzed a database of 6,500 restaurant menus describing 650,000 dishes from across the U.S. Among their findings: fancy restaurants, not surprisingly, use fancier—and longer—words than cheaper restaurants do (think accompaniments and decaffeinated coffee, not sides and decaf). Jurafsky writes that “every increase of one letter in the average length of words describing a dish is associated with an increase of 69 cents in the price of that dish.” Compared with inexpensive restaurants, the expensive ones are “three times less likely to talk about the diner’s choice” (your way, etc.) and “seven times more likely to talk about the chef’s choice.”

Lower-priced restaurants, meanwhile, rely on “linguistic fillers”: subjective words like delicious, flaky, and fluffy. These are the empty calories of menus, less indicative of flavor than of low prices. Cheaper establishments also use terms like ripe and fresh, which Jurafsky calls “status anxiety” words. Thomas Keller’s Per Se, after all, would never use fresh—that much is taken for granted—but Subway would. Per Se does, however, engage in the trendy habit of adding provenance to descriptions of ingredients (Island Creek oysters, Frog Hollow’s peaches). According to Jurafsky, very expensive restaurants “mention the origins of the food more than 15 times as often as inexpensive restaurants.”

Putting this book on my to-read list.

On France and Independent Bookstores

Pamela Druckerman, who lives in Paris, provides some insight into how the French still value books and independent bookstores. The secret? Fixed book prices across the entire country.

France, meanwhile, has just unanimously passed a so-called anti-Amazon law, which says online sellers can’t offer free shipping on discounted books. (“It will be either cheese or dessert, not both at once,” a French commentator explained.) The new measure is part of France’s effort to promote “biblio-diversity” and help independent bookstores compete. Here, there’s no big bookseller with the power to suddenly turn off the spigot. People in the industry estimate that Amazon has a 10 or 12 percent share of new book sales in France. Amazon reportedly handles 70 percent of the country’s online book sales, but just 18 percent of books are sold online.

The French secret is deeply un-American: fixed book prices. Its 1981 “Lang law,” named after former Culture Minister Jack Lang, says that no seller can offer more than 5 percent off the cover price of new books. That means a book costs more or less the same wherever you buy it in France, even online. The Lang law was designed to make sure France continues to have lots of different books, publishers and booksellers.

And this point, beyond the economics of book pricing, is very important:

What underlies France’s book laws isn’t just an economic position — it’s also a worldview. Quite simply, the French treat books as special. Some 70 percent of French people said they read at least one book last year; the average among French readers was 15 books. Readers say they trust books far more than any other medium, including newspapers and TV. The French government classifies books as an “essential good,” along with electricity, bread and water. (A French friend of mine runs a charity, Libraries Without Borders, which brings books to survivors of natural disasters.) “We don’t force French people to go to bookstores,” explains Vincent Montagne, head of the French Publishers Association. “They go to bookstores because they read.”

Sherlock Holmes is Now in the United States Public Domain

Hear ye, hear ye. A federal judge has ruled that Sir Arthur Conan Doyle’s Sherlock Holmes stories and (most of) the related characters are now in the public domain. The New York Times arts beat blog reports:

A federal judge has issued a declarative judgment stating that Holmes, Watson, 221B Baker Street, the dastardly Professor Moriarty and other elements included in the 50 Holmes works that Arthur Conan Doyle published before Jan. 1, 1923, are no longer covered by United States copyright law, and can therefore be freely used by others without paying any licensing fee to the writer’s estate.

The ruling came in response to a civil complaint filed in February by Leslie S. Klinger, the editor of the three-volume, nearly 3,000-page “New Annotated Sherlock Holmes” and a number of other Holmes-related books. The complaint stemmed from “In the Company of Sherlock Holmes,” a collection of new Holmes stories written by different authors and edited by Mr. Klinger and Laurie R. King, herself the author of a mystery series featuring Mary Russell, Holmes’s wife.

But there is a minor twist, because elements introduced in the later stories remain under copyright:

Chief Judge Rubén Castillo of the United States District Court of the Northern District of Illinois, Eastern Division, stated that elements introduced in Holmes stories published after 1923 — such as the fact that Watson played rugby for Blackheath, or had a second wife — remain under copyright in the United States. (All of the Holmes stories are already in the public domain in Britain.)

I haven’t seen the show Elementary, but I have enjoyed episodes of the British show Sherlock.

What Mike Tyson Is Reading

Writing in The Wall Street Journal, Mike Tyson wants you to be aware of his erudite side:

I’m currently reading “The Quotable Kierkegaard,” edited by Gordon Marino, a collection of awesome quotes from that great Danish philosopher. (He wanted his epitaph to read: “In yet a little while / I shall have won; / Then the whole fight / Will all at once be done.”) I love reading philosophy. Most philosophers are so politically incorrect—challenging the status quo, even challenging God. Nietzsche’s my favorite. He’s just insane. You have to have an IQ of at least 300 to truly understand him. Apart from philosophy, I’m always reading about history. Someone very wise once said the past is just the present in funny clothes. I read everything about Alexander, so I downloaded “Alexander the Great: The Macedonian Who Conquered the World” by Sean Patrick. Everyone thinks Alexander was this giant, but he was really a runt. “I would rather live a short life of glory than a long one of obscurity,” he said. I so related to that, coming from Brownsville, Brooklyn.

What did I have to look forward to—going in and out of prison, maybe getting shot and killed, or just a life of scuffling around like a common thief? Alexander, Napoleon, Genghis Khan, even a cold pimp like Iceberg Slim—they were all mama’s boys. That’s why Alexander kept pushing forward. He didn’t want to have to go home and be dominated by his mother. In general, I’m a sucker for collections of letters. You think you’ve got deep feelings? Read Napoleon’s love letters to Josephine. It’ll make you think that love is a form of insanity. Or read Virginia Woolf’s last letter to her husband before she loaded her coat up with stones and drowned herself in a river. I don’t really do any light reading, just deep, deep stuff. I’m not a light kind of guy.

I prefer to read the deep, deep stuff as well. Mike Tyson, you have (marginally) redeemed yourself.