Book Review: Ian Leslie’s Curious—The Desire to Know and Why Your Future Depends on It

Everyone is born curious. But only some of us retain the habits of exploring, learning, and discovering as we grow older. So why do so many of us allow our curiosity to wane, when there is evidence that those who stay curious tend to be more creative, more intelligent, and more successful?

In Curious: The Desire to Know and Why Your Future Depends on It, Ian Leslie makes a compelling case for the cultivation of our “desire to know.” I’ve had the chance to read the book in advance of its publication date (full disclosure: I received a complimentary advance copy of the book from Basic Books, the publisher of Curious), and this review provides my impressions of the book and highlights some notable passages.

The book is divided into three parts: How Curiosity Works, The Curiosity Divide, and Staying Curious. In the introduction to the book, the case is made for why being curious is vital:

The truly curious will be increasingly in demand. Employers are looking for people who can do more than follow procedures competently or respond to requests, who have a strong, intrinsic desire to learn, solve problems, and ask penetrating questions. They may be difficult to manage at times, these individuals, for their interests and enthusiasms can take them along unpredictable paths, and they don’t respond well to being told what to think. But for the most part, they will be worth the difficulty.

The introduction offers another assessment of what the book is about:

If you allow yourself to become incurious, your life will be drained of color, interest, and pleasure. You will be less likely to achieve your potential at work or in creative life. While barely noticing it, you’ll become a little duller, a little dimmer. You may not think it could happen to you, but it can. It can happen to any of us. To stop it from happening, you need to understand what feeds curiosity and what starves it. That’s what this book is about.

Something worth pondering:

Curiosity is contagious. So is incuriosity.

Something that caught my attention in Part I of the book was the evolutionary advantage of becoming or staying curious. Here, Leslie cites the research of Stephen Kaplan, an evolutionary psychologist at the University of Michigan:

The more information about her environment a human acquired, the more likely she would be to survive and pass on her genes. Gathering that knowledge meant venturing out into the unknown, to spot new sources of water or edible plants. But doing so meant risking one’s survival; you might become vulnerable to predators or get lost. The individuals more likely to survive would have been those adept at striking a balance between knowledge gathering and self-preservation.

Perhaps as an incentive to take a few risks in the pursuit of new information, evolution tied curiosity to pleasure. Leslie writes that the caudate nucleus, located deep within the human brain, is packed with neurons that traffic in dopamine. As the brain evolved, it seems to have bootstrapped the urge for intellectual investigation onto the same pathway as our primal pleasures (for sex or food). The research he cites was done at the California Institute of Technology by asking undergraduates questions while they were in a brain scanner. (I need to read this study in depth: Caltech undergrads are naturally some of the most curious individuals on the planet, so we have a potential selection bias at work here.)

In a chapter titled “How Curiosity Begins,” Leslie points out how babies express curiosity:

Babbling, like pointing, is a sign of readiness to learn, and babies are also more likely to use it as such if, rather than ignoring them, adults try to answer whatever they think the baby’s unintelligible question might be. If a baby looks at an apple and says “Da da da!” and the adult says nothing, the baby not only fails to learn the name of that round greenish object, but also starts to think this whole babbling business might be a waste of time.

One interesting bit about curiosity: we aren’t allocated a fixed amount of it at birth. Instead, we inherit a mercurial quality that rises and falls throughout the day and throughout our lives. Leslie points out that an important input is the behavior of the people around us: if our curiosity is ignited, it grows; if it is squashed, it may wane over the long term.

In a chapter titled “Puzzles and Mysteries,” Leslie describes how curiosity may naturally wane as we grow older:

Computer scientists talk about the differences between exploring and exploiting—a system will learn more if it explores many possibilities, but it will be more effective if it simply acts on the most likely one. As babies grow into children and then into adults, they begin to do more exploiting of whatever knowledge they have acquired. As adults, however, we have a tendency to err too far toward exploitation—we become content to fall back on the stock of knowledge and mental habits we built up when we were young, rather than adding to or revising it. We get lazy.
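
The explore/exploit trade-off Leslie refers to is a staple of reinforcement learning. As a concrete illustration (my own sketch, not something from the book), here is a minimal epsilon-greedy bandit in Python: with probability epsilon the agent explores a random option; otherwise it exploits the option with the best estimated payoff. Set epsilon to zero and the agent often locks onto a mediocre choice early, which is exactly the adult failure mode Leslie describes.

```python
import random

def epsilon_greedy(true_rewards, epsilon=0.1, steps=1000):
    """Minimal epsilon-greedy bandit: balance exploring vs. exploiting."""
    n_arms = len(true_rewards)
    estimates = [0.0] * n_arms   # running estimate of each arm's payoff
    counts = [0] * n_arms        # how often each arm has been tried
    total = 0.0
    for _ in range(steps):
        if random.random() < epsilon:
            arm = random.randrange(n_arms)           # explore: try anything
        else:
            arm = estimates.index(max(estimates))    # exploit: best so far
        reward = random.gauss(true_rewards[arm], 1.0)  # noisy payoff
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total += reward
    return total

# Pure exploitation (epsilon=0) can lock onto a mediocre arm early;
# a little exploration usually earns more over the long run.
print(epsilon_greedy([1.0, 2.0, 1.5], epsilon=0.0))
print(epsilon_greedy([1.0, 2.0, 1.5], epsilon=0.1))
```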

The so-called curiosity zone is a function of surprise, knowledge, and confidence. Curiosity is highest when the violation of an expectation is more than tiny but less than enormous. When violations are minor, we are quick to ignore them. When they’re massive, we often refuse to acknowledge them because we may be scared of what they imply. The less knowledge you have about something, the less likely you are to pursue getting to know it better. Alternatively, if you are an expert in a particular subject area, your curiosity about that area may already have peaked. The curiosity zone is thus a concave function, with maximum curiosity in the middle. Finally, it is important to have an environment that is conducive to curious thinking: curiosity requires an edge of uncertainty to thrive, but with too much uncertainty, it freezes.
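
To make the “concave function” idea concrete, here is a toy model (mine, not Leslie’s; the quadratic form is an arbitrary choice): curiosity as an inverted U over how surprising a violation of expectation is, zero at the extremes and maximal in the middle.

```python
def curiosity(surprise):
    """Toy inverted-U curve: curiosity peaks at moderate surprise.

    `surprise` is a value in [0, 1]; the quadratic 4*s*(1-s) is 0 at the
    extremes (too familiar, too alien) and maximal at s = 0.5.
    """
    return 4 * surprise * (1 - surprise)

for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"surprise={s:.2f} -> curiosity={curiosity(s):.2f}")
```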

A good anecdote is presented in the “Puzzles and Mysteries” chapter on why The Wire was such a great TV show:

One way of describing the achievement of the TV series The Wire was that it took a genre, the police procedural, which is conventionally based on puzzles, in the form of crimes that are solved each week, and turned it into a mystery—the mystery of Baltimore’s crime problem.

So while routine police work may be classified as solving puzzles (with definitive answers), The Wire showcased it as something more akin to a mystery: multilayered, shifting, nuanced (in Leslie’s words). The Wire, to this day, is in my top three all-time favourite TV shows, so I was glad to see its incorporation in the book.

What’s the one company that is doing everything it can to deprive you of the itch of curiosity? Answer: Google. According to Google’s founders, Larry Page and Sergey Brin, the company is working toward the ambition of incorporating search directly into people’s brains, at which point all information gaps will be closed. I don’t take so black-and-white a stand (that the proliferation of Google will make more people incurious), but I do understand Leslie’s perspective. In general, if you were to ask someone “Is the Internet making us stupid or more intelligent?”, Leslie’s response would be a simple “Yes.” He writes:

The Internet presents us with more opportunities to learn than ever before and also allows us not to bother. It is a boon to those with a desire to deepen their understanding of the world, and also to those who are only too glad not to have to make the effort…If you’re incurious—or, like most of us, a little lazy—then you will use the Internet to look at pictures of cats and get into arguments with strangers.

Ian Leslie does a good job of assimilating related research into Curious. For instance, what matters in students are character traits such as attitude toward learning and conscientiousness, as well as persistence, self-discipline, and what the psychologist Angela Duckworth termed “grit”: the ability to deal with failure, overcome setbacks, and focus on long-term goals. In a chapter titled “The Power of Questions,” Leslie quotes the former CEO of Dow Chemical, Mike Parker:

A lot of bad leadership comes from an inability or unwillingness to ask questions. I have watched talented people—people with much higher IQs than mine—who have failed as leaders. They can talk brilliantly, with a great breadth of knowledge, but they’re not very good at asking questions. So while they know a lot at a high level, they don’t know what’s going on way down in the system. Sometimes they are afraid of asking questions, but what they don’t realize is that the dumbest questions can be very powerful. They can unlock a conversation.

In what I think is the most important chapter of the book, “The Importance of Knowing,” Leslie highlights the importance of epistemic knowledge and provides evidence to debunk parts of the “twenty-first-century” mindset. He presents three misapprehensions about learning that are common among supporters of “curiosity-driven” education:

  • Children don’t need teachers to instruct them. Those who think the natural curiosity of children is stifled by pedagogical instruction overlook something fundamental about human nature—as a species, we have always depended on the epistemic endowment of our elders and ancestors. As Leslie writes, every scientist stands on the shoulders of giants; every artist works within or against a tradition. The unusually long period for which children are dependent on adults is a clue that humans are designed to learn from others, rather than merely through their own explorations. Traditional teaching—the transmission of information from adults to children—is highly effective when skillfully executed. Citing the research of John Hattie, Leslie notes that the three most powerful teacher factors (those that lead to student success) are feedback, quality of instruction, and direct instruction.
  • Facts kill creativity. At the most basic level, all of our new ideas are somehow linked to old ones. The more existing ideas you have in your head, the more varied and rich and blossoming will be your novel combinations of them, and the greater your store of reference points and analogies. Per Leslie: “a fact is a particular class of idea about the world, and it can be put to work in a lot of different ways.” In this section, Leslie refers to Sir Ken Robinson’s famous 2006 TED talk on educational reform, “Do Schools Kill Creativity?”, and then proceeds to argue that Robinson’s claims about creativity are almost entirely baseless.
  • Schools should teach thinking skills instead of knowledge. Thinking skills grow organically out of specific knowledge of specific domains—that is, facts. The wider your knowledge, the more widely your intelligence can range and the more purchase it gets on new information. This is why the argument that schools ought to prioritize learning skills over knowledge makes no sense, argues Leslie: the very foundation for such skills is memorized knowledge. The more we know, the better we are at thinking.

On how knowledge gives curiosity its staying power, Leslie writes:

This is why curiosity, like other thinking skills, cannot be nurtured, or taught, in the abstract. Rather than being stifled by factual knowledge, it depends on it. Until a child has been taught the basic information she needs to start thinking more deeply about a particular subject, it’s hard to develop her initial (diversive) curiosity into enduring (epistemic) curiosity, to get her to the stage where she is hungry for more knowledge…Sir Ken Robinson has it precisely the wrong way around when he says that the natural appetite for learning begins to dissipate once children start to be educated. The curiosity of children dissipates when it doesn’t get fed by knowledge, imparted by parents and teachers.

In short, background knowledge is vital kindling for curiosity. From personal experience, I happen to think there is also a positive feedback loop in place: the more you know, the more curious you become; and the more curious you are, the more knowledgeable you become over time, because your curiosity drives you to seek out more knowledge.

In the last part of the book, Leslie outlines seven ways to stay curious. They are as follows:

  1. Stay foolish. Echoing Steve Jobs’s memorable commencement address, in which Jobs advised Stanford graduates to “Stay hungry, stay foolish,” Ian Leslie points out how Jobs’s curiosity was crucial to his ability to invent and reinvent the businesses in which he was involved (Apple, Pixar).
  2. Build the database. The premise is that any project or task requiring deep creative thought will be better addressed by someone who has deep knowledge of the task at hand and general background knowledge of the culture in which it and its users (or readers, or viewers) live. Leslie writes:

    Highly curious people, who have carefully cultivated their long-term memories, live in a kind of augmented reality; everything they see is overlaid with additional layers of meaning and possibility, unavailable to ordinary observers.

  3. Forage like a foxhog. In the words of the Greek poet Archilochus: “The fox knows many things, but the hedgehog knows one big thing.” The fox evades predators via a variety of techniques, while the hedgehog adopts one trusted technique (hunkering down and relying on its spikes to thwart a predator). The thinkers best positioned to thrive today and in the future are likely a hybrid of the fox and the hedgehog: the foxhog. You need to be specialized in one or two subject areas (what are known as SMEs, or subject matter experts) but also be a voracious consumer of knowledge from other fields. In short, combine breadth and depth in your skill set.
  4. Ask the big why. In a useful anecdote from the book Negotiation Genius by Harvard Business School professors Deepak Malhotra and Max H. Bazerman, Leslie points out how asking “why” is a critical component of the negotiation process. If two parties negotiate from their preagreed positions, the negotiation becomes a trade-off in which one side necessarily loses with respect to the other. The key, then, is to really try to understand what’s motivating the other party’s interests, and this involves asking the probing, penetrating questions that can be summarized with the why. There is an interesting diversion in this point on the Big Data movement. One of its proponents, Chris Anderson (formerly editor of Wired), has made the extreme case for asking the Big What instead of the Big Why. The premise is that with enough data, you can glean behavior from the patterns that are observed. But I don’t think it’s that simple. In fact, the more data you collect, the more likely you are to start forming false narratives (Nassim Nicholas Taleb makes this point well in his excellent book Antifragile). When we have a lot of data to work with, we get things like spurious correlations (see the short simulation after this list).
  5. Be a thinkerer. “Thinkerer” is a portmanteau of “think” and “tinker,” and the origin of the verb “to thinker” is unknown. Leslie mentions that he was introduced to the term by Paola Antonelli of the Museum of Modern Art in New York City, who traced it to a 2007 presentation given by John Seely Brown (formerly the director of the Xerox Palo Alto Research Center). The idea is enunciated well by Peter Thiel:

    A fundamental challenge—in business as in life—is to integrate the micro and macro such that all things make sense. Humanities majors may well learn a great deal about the world. But they don’t really learn career skills through their studies. Engineering majors, conversely, learn in great technical detail. But they might not learn why, how, or where they should apply their skills in the workforce. The best students, workers, and thinkers will integrate these questions into a cohesive narrative.

  6. Question your teaspoons. The idea is to become aware of, and curious about, your daily surroundings. Parking garage roofs, hand dryers, milk, paint catalogs, and bus routes sound mundane, but if you dig deeper, you can find out how complex and intricate they really are. This is what led James Ward to found the Boring Conference (which is a lot more interesting than it sounds!). Leslie points out a good example: Laura McInerney, who used to work at McDonald’s. Her shift involved making the daily breakfast by breaking four hundred eggs, a mind-numbing day-to-day ordeal. But then she started asking questions: how do the proteins in an egg change as it is heated? Is it ethically right to take eggs from a chicken? And which came first, the egg or the chicken?
  7. Turn puzzles into mysteries. The premise here is simple: a puzzle is something that commands our curiosity until we have solved it. A mystery, by contrast, is something that never stops inviting further inquiry. The way to stay curious, then, is to be cognizant, for every puzzle we come across in our daily lives, that there may be an underlying mystery behind it worth exploring.
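
On the spurious-correlations point from item 4, a quick simulation (my own illustration, not from the book) shows how easily they arise: generate enough unrelated random series and some pair will appear strongly correlated purely by chance.

```python
import random

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

random.seed(0)
# 40 completely unrelated random series, 20 points each.
series = [[random.gauss(0, 1) for _ in range(20)] for _ in range(40)]

best = max(
    (abs(pearson(series[i], series[j])), i, j)
    for i in range(len(series)) for j in range(i + 1, len(series))
)
# With 780 pairs to choose from, a "strong" correlation shows up by luck.
print(f"strongest |r| = {best[0]:.2f} between series {best[1]} and {best[2]}")
```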

In the Afterword of Curious, Leslie highlights one of my all-time favourite commencement speeches, the one David Foster Wallace gave to Kenyon College’s graduating class of 2005. In it, Wallace argues that we are inherently self-centered (because the world we experience is in front of and behind us, above and below us, and it is immediate). It is only through the exercise of curiosity about others that we can free ourselves from our hard-wired self-obsession. We should be curious about others not just because it is virtuous, but because curiosity is also a coping mechanism for the routine, petty frustrations of day-to-day life.

The really important kind of freedom involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day.

That is real freedom. That is being educated, and understanding how to think. The alternative is unconsciousness, the default setting, the rat race, the constant gnawing sense of having had, and lost, some infinite thing.

Ian Leslie’s Curious: The Desire to Know and Why Your Future Depends on It is a well-researched book that cites a number of relevant scientific studies, frames concepts related to knowledge and curiosity with interesting anecdotes, and has a solid bibliography for curious readers who want to dive further after finishing Curious.

I highly recommend the book. It is available on Amazon (in hardcover or for the Kindle) or from your favourite bookseller beginning today, August 26, 2014.

How the Brain is Like a Muscle

Salman Khan, founder of the Khan Academy, reflects in a personal blog post on the learning myth:

Researchers have known for some time that the brain is like a muscle; that the more you use it, the more it grows. They’ve found that neural connections form and deepen most when we make mistakes doing difficult tasks rather than repeatedly having success with easy ones. What this means is that our intelligence is not fixed, and the best way that we can grow our intelligence is to embrace tasks where we might struggle and fail.

However, not everyone realizes this. Dr. Carol Dweck of Stanford University has been studying people’s mindsets towards learning for decades. She has found that most people adhere to one of two mindsets: fixed or growth. People with fixed mindsets mistakenly believe that people are either smart or not, that intelligence is fixed by genes. People with growth mindsets correctly believe that capability and intelligence can be grown through effort, struggle, and failure. Dweck found that those with a fixed mindset tended to focus their effort on tasks where they had a high likelihood of success and avoided tasks where they might have had to struggle, which limited their learning. People with a growth mindset, however, embraced challenges and understood that tenacity and effort could change their learning outcomes. As you can imagine, this correlated with the latter group more actively pushing themselves and growing intellectually.

The Language of Food: A Linguist Reads The Menu

The Atlantic has an interesting preview of Dan Jurafsky’s The Language of Food: A Linguist Reads the Menu, coming out in September: 

You needn’t be a linguist to note changes in the language of menus, but Stanford’s Dan Jurafsky has written a book doing just that. In The Language of Food: A Linguist Reads the Menu, Jurafsky describes how he and some colleagues analyzed a database of 6,500 restaurant menus describing 650,000 dishes from across the U.S. Among their findings: fancy restaurants, not surprisingly, use fancier—and longer—words than cheaper restaurants do (think accompaniments and decaffeinated coffee, not sides and decaf). Jurafsky writes that “every increase of one letter in the average length of words describing a dish is associated with an increase of 69 cents in the price of that dish.” Compared with inexpensive restaurants, the expensive ones are “three times less likely to talk about the diner’s choice” (your way, etc.) and “seven times more likely to talk about the chef’s choice.”

Lower-priced restaurants, meanwhile, rely on “linguistic fillers”: subjective words like delicious, flaky, and fluffy. These are the empty calories of menus, less indicative of flavor than of low prices. Cheaper establishments also use terms like ripe and fresh, which Jurafsky calls “status anxiety” words. Thomas Keller’s Per Se, after all, would never use fresh—that much is taken for granted—but Subway would. Per Se does, however, engage in the trendy habit of adding provenance to descriptions of ingredients (Island Creek oysters, Frog Hollow’s peaches). According to Jurafsky, very expensive restaurants “mention the origins of the food more than 15 times as often as inexpensive restaurants.”

Putting this book on my to-read list.

Bill Gates on the Future of College

At the National Association of College and University Business Officers Annual Meeting on July 21, 2014, Bill Gates delivered an address on the “Future of College” in America. A transcription is on Mr. Gates’s blog.

Looking at the individual level of opportunity, do people have equal opportunity? The data we see shows that, unless you’re given the preparation and access to higher education, and unless you have a successful completion of that higher education, your economic opportunity is greatly, greatly reduced. There’s a lot of data recently talking about the premium in salaries for people with four-year college degrees. In 2013, people with four-year college degrees earned 98 percent more per hour, on average, than people without degrees. That differential has gone up a lot. A generation ago, it was only 64 percent.

If you look at the numbers more closely, you will also see that unemployment, partial employment, is primarily in people without four-year degrees. Our economy already is near full employment for people with four-year degrees. And, so, the uncertainty, the difficulty, the challenges, faced, if you haven’t been able to get a higher education degree, are very difficult already today. And, with changes coming in the economy, with more automation, more globalization, that divide will become even more stark in the years ahead.

So, if we’re really serious about all lives having equal value, we need to make sure that the higher education system, both access, completion, and excellence, are getting the attention they need.

It is unfortunate that, although the US does quite well in the percentage of kids going into higher education, we’ve actually dropped, quite dramatically, in the percentage who complete higher education. We have, amongst developed countries, the highest dropout rate of kids who start. And, understanding why that happens is very, very important. For many of those kids, that experience is not only financially debilitating, being left with loans that are hard to pay off, but, also, psychologically, very debilitating, that they expected to complete, they tried to complete. And, whether it was math or getting the right courses, or the scheduling, somehow, they weren’t able to do that, which is a huge setback.

Worth reading in its entirety.

Why Photography Matters: an Airbnb Case Study

This is a superb read on one of my favorite start-ups, Airbnb, and how the company was able to double its revenues after a critical decision was made: get professional-looking photos of the listings.

At the time, Airbnb was part of Y Combinator. One afternoon, the team was poring over their search results for New York City listings with Paul Graham, trying to figure out what wasn’t working, why they weren’t growing. After spending time on the site using the product, Gebbia had a realization. “We noticed a pattern. There’s some similarity between all these 40 listings. The similarity is that the photos sucked. The photos were not great photos. People were using their camera phones or using their images from classified sites.  It actually wasn’t a surprise that people weren’t booking rooms because you couldn’t even really see what it is that you were paying for.”

Graham tossed out a completely non-scalable and non-technical solution to the problem: travel to New York, rent a camera, spend some time with customers listing properties, and replace the amateur photography with beautiful high-resolution pictures. The three-man team grabbed the next flight to New York and upgraded all the amateur photos to beautiful images. There wasn’t any data to back this decision originally. They just went and did it. A week later, the results were in: improving the pictures doubled the weekly revenue to $400 per week. This was the first financial improvement that the company had seen in over eight months. They knew they were onto something.

This was the turning point for the company. Gebbia shared that the team initially believed that everything they did had to be ‘scalable.’ It was only when they gave themselves permission to experiment with non-scalable changes to the business that they climbed out of what they called the ‘trough of sorrow.’

Here’s the takeaway:

Gebbia’s experience with upgrading photographs proved that code alone can’t solve every problem that customers have. While computers are powerful, there’s only so much that software alone can achieve. Silicon Valley entrepreneurs tend to become comfortable in their roles as keyboard jockeys. However, going out to meet customers in the real world is almost always the best way to wrangle their problems and come up with clever solutions. 

Read the rest here.


IBM’s SyNAPSE Chip Moves Closer to Brain-Like Computing

This week, scientists at IBM Research unveiled a brain-inspired computer chip and ecosystem. From the press release on the so-called SyNAPSE chip:

Scientists from IBM unveiled the first neurosynaptic computer chip to achieve an unprecedented scale of one million programmable neurons, 256 million programmable synapses and 46 billion synaptic operations per second per watt. At 5.4 billion transistors, this fully functional and production-scale chip is currently one of the largest CMOS chips ever built, yet, while running at biological real time, it consumes a minuscule 70mW—orders of magnitude less power than a modern microprocessor.
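
As a back-of-the-envelope sanity check (my arithmetic, not IBM’s), the efficiency figure and the power draw together imply a few billion synaptic operations per second:

```python
# Back-of-the-envelope check of the press-release figures.
sops_per_watt = 46e9  # synaptic operations per second, per watt
power_watts = 0.070   # 70 mW power draw

total_sops = sops_per_watt * power_watts
print(f"{total_sops:.1e} synaptic ops/sec")  # ~3.2e9
```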

MIT Technology Review has a good summary as well:

IBM’s SyNAPSE chip processes information using a network of just over one million “neurons,” which communicate with one another using electrical spikes—as actual neurons do. The chip uses the same basic components as today’s commercial chips—silicon transistors. But its transistors are configured to mimic the behavior of both neurons and the connections—synapses—between them.

The SyNAPSE chip breaks with a design known as the von Neumann architecture that has underpinned computer chips for decades. Although researchers have been experimenting with chips modeled on brains—known as neuromorphic chips—since the late 1980s, until now all have been many times less complex, and not powerful enough to be practical (see “Thinking in Silicon”). Details of the chip were published today in the journal Science.

The new chip is not yet a product, but it is powerful enough to work on real-world problems. In a demonstration at IBM’s Almaden research center, MIT Technology Review saw one recognize cars, people, and bicycles in video of a road intersection. A nearby laptop that had been programmed to do the same task processed the footage 100 times slower than real time, and it consumed 100,000 times as much power as the IBM chip. IBM researchers are now experimenting with connecting multiple SyNAPSE chips together, and they hope to build a supercomputer using thousands.

I think this kind of experimentation is fascinating. You can read more at Science Magazine (subscription required to view full text).


College Football in America: Athletics over Academics

This is an unsettling piece in The New York Times on the biggest college football conferences (the SEC, the ACC, the Pacific-12, the Big Ten, and the Big 12) vying to become more autonomous:

This is a portrait of life in the wealthiest districts of college sports.

The denizens of these rarefied quarters, universities like Alabama and Louisiana State, are still institutions of higher education. But athletics have become ever more central to their missions, and their bottom lines, thanks to the juggernaut programs that generate hundreds of millions of dollars a year.

Recruiters fly on private planes, athletes train on top-of-the-line equipment, and teams compete in mammoth stadiums that are the envy of many professional teams. It is not uncommon for a university’s athletic budget to exceed $60 million.

I went to an ACC school known for its academic rigor: Georgia Tech. But even there, I felt that athletics often overshadowed academics. Those who attended the university on an athletic scholarship had their priorities in the following order: 1) the sport and/or team they competed for, and 2) academics.

The new rules will likely tilt the balance toward athletics, and away from academics, even further. Sad.