Book Review: Ian Leslie’s Curious—The Desire To Know and Why Your Future Depends on It

Everyone is born curious. But only a proportion of the human population retains the habits of exploring, learning, and discovering as they grow older. So why are so many of us allowing our curiosity to wane, when there is evidence that those who are curious tend to be more creative, more intelligent, and more successful?

In Curious: The Desire to Know and Why Your Future Depends on It, Ian Leslie makes a compelling case for the cultivation of our “desire to know.” I’ve had the chance to read the book in advance of its publication date (full disclosure: I received a complimentary advance copy of the book from Basic Books, the publisher of Curious), and this review provides my impressions of the book and highlights some notable passages.

The book is divided into three parts: How Curiosity Works, The Curiosity Divide, and Staying Curious. In the introduction to the book, the case is made for why being curious is vital:

The truly curious will be increasingly in demand. Employers are looking for people who can do more than follow procedures competently or respond to requests, who have a strong, intrinsic desire to learn, solve problems, and ask penetrating questions. They may be difficult to manage at times, these individuals, for their interests and enthusiasms can take them along unpredictable paths, and they don’t respond well to being told what to think. But for the most part, they will be worth the difficulty.

 Another assessment of what this book is about is presented in the introduction: 

If you allow yourself to become incurious, your life will be drained of color, interest, and pleasure. You will be less likely to achieve your potential at work or in creative life. While barely noticing it, you’ll become a little duller, a little dimmer. You may not think it could happen to you, but it can. It can happen to any of us. To stop it from happening, you need to understand what feeds curiosity and what starves it. That’s what this book is about.

Something worth pondering over:

Curiosity is contagious. So is incuriosity.

Something that caught my attention in Part I of the book was the evolutionary advantage of becoming or staying curious. Here, Leslie cites the research of Stephen Kaplan, an evolutionary psychologist at the University of Michigan:

The more information about her environment a human acquired, the more likely she would be to survive and pass on her genes. Gathering that knowledge meant venturing out into the unknown, to spot new sources of water or edible plants. But doing so meant risking one’s survival; you might become vulnerable to predators or get lost. The individuals more likely to survive would have been those adept at striking a balance between knowledge gathering and self-preservation.

Perhaps as an incentive to take a few risks in the pursuit of new information, evolution tied the act of curiosity to pleasure. Leslie writes how the caudate nucleus, located deep within the human brain, is packed with neurons that traffic in dopamine. As the brain evolved, it seems to have bootstrapped the urge for intellectual investigation onto the same pathway as our primal pleasures (for sex or food). This research was done at the California Institute of Technology by asking undergraduates questions while they were in a brain scanner. (I need to read this study in depth, because Caltech undergrads are naturally some of the most curious individuals on the planet, so we have a potential selection bias at work here.)

In a chapter titled “How Curiosity Begins,” Leslie points out how babies respond to curiosity:

Babbling, like pointing, is a sign of readiness to learn, and babies are also more likely to use it as such if, rather than ignoring them, adults try to answer whatever they think the baby’s unintelligible question might be. If a baby looks at an apple and says “Da da da!” and the adult says nothing, the baby not only fails to learn the name of that round greenish object, but also starts to think this whole babbling business might be a waste of time.

One interesting bit about curiosity: we don’t get allocated a fixed amount of it at birth. Instead, we inherit a mercurial quality that rises and falls throughout the day and throughout our lives. Leslie points out that an important input into curiosity is the behavior of the people around us: if our curiosity is ignited, it grows; if it is squashed, it may wane over the long term.

In a chapter titled “Puzzles and Mysteries,” Leslie describes how curiosity may naturally wane as we grow older:

Computer scientists talk about the differences between exploring and exploiting—a system will learn more if it explores many possibilities, but it will be more effective if it simply acts on the most likely one. As babies grow into children and then into adults, they begin to do more exploiting of whatever knowledge they have acquired. As adults, however, we have a tendency to err too far toward exploitation—we become content to fall back on the stock of knowledge and mental habits we built up when we were young, rather than adding to or revising it. We get lazy.
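
The explore/exploit trade-off Leslie borrows from computer science is easy to see in a toy multi-armed-bandit simulation. This is my own sketch, not from the book; the arms and payout probabilities are invented:

```python
import random

# A toy multi-armed bandit: each "arm" pays out 1 with some probability.
# epsilon controls how often we explore instead of exploiting the best
# estimate so far.
def epsilon_greedy(payouts, epsilon=0.1, trials=10000, seed=0):
    rng = random.Random(seed)
    counts = [0] * len(payouts)      # pulls per arm
    totals = [0.0] * len(payouts)    # accumulated reward per arm
    reward = 0.0
    for _ in range(trials):
        if rng.random() < epsilon or all(c == 0 for c in counts):
            arm = rng.randrange(len(payouts))        # explore: try anything
        else:
            estimates = [t / c if c else 0.0 for t, c in zip(totals, counts)]
            arm = estimates.index(max(estimates))    # exploit: best known arm
        r = 1.0 if rng.random() < payouts[arm] else 0.0
        counts[arm] += 1
        totals[arm] += r
        reward += r
    return reward / trials

# A little exploration reliably finds the better arm; with epsilon=0 the
# agent may lock onto whatever it tried first -- it "gets lazy."
print(epsilon_greedy([0.3, 0.7], epsilon=0.1))
print(epsilon_greedy([0.3, 0.7], epsilon=0.0))
```

The parallel to Leslie's point: pure exploitation is efficient only if what you already know happens to be the best there is.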

The so-called curiosity zone is a function of surprise, knowledge, and confidence. Curiosity is highest when the violation of an expectation is more than tiny but less than enormous. When violations are minor, we are quick to ignore them. When they’re massive, we often refuse to acknowledge them, because we may be scared of what they imply. The less knowledge you have about something, the less likely you are to pursue getting to know it better. Conversely, if you are an expert in a particular subject area, your capacity to stay very curious about it may have already peaked. The curiosity zone is thus a concave function, with maximum curiosity in the middle. Finally, it is important to have an environment that is conducive to curious thinking: curiosity requires an edge of uncertainty to thrive, but with too much uncertainty, it freezes.

A good anecdote is presented in the “Puzzles and Mysteries” chapter on why The Wire was such a great TV show:

One way of describing the achievement of the TV series The Wire was that it took a genre, the police procedural, which is conventionally based on puzzles, in the form of crimes that are solved each week, and turned it into a mystery—the mystery of Baltimore’s crime problem.

So while routine police work may be classified as solving puzzles (with definitive answers), The Wire showcased it as something more akin to a mystery: multilayered, shifting, nuanced (in Leslie’s words). The Wire is, to this day, in my top three all-time favourite TV shows, so I was glad to see its inclusion in the book.

What’s the one company that is doing everything it can to deprive you of the itch of curiosity? Answer: Google. According to Google’s founders, Larry Page and Sergey Brin, they are working toward the ambition of incorporating search directly into people’s brains; all information gaps will be closed. I don’t take as black-and-white a stand, that the proliferation of Google will make more people incurious, but I do understand Leslie’s perspective. In general, if you were to ask someone “Is the Internet making us stupid or more intelligent?,” Leslie’s response would be a simple “Yes.” He writes:

The Internet presents us with more opportunities to learn than ever before and also allows us not to bother. It is a boon to those with a desire to deepen their understanding of the world, and also to those who are only too glad not to have to make the effort…If you’re incurious—or, like most of us, a little lazy—then you will use the Internet to look at pictures of cats and get into arguments with strangers.

Ian Leslie does a good job of assimilating related research into Curious. For instance, what matters in students are character traits such as attitude toward learning and conscientiousness, as well as persistence, self-discipline, and what the psychologist Angela Duckworth termed “grit”—the ability to deal with failure, overcome setbacks, and focus on long-term goals. In a chapter titled “The Power of Questions,” Leslie quotes the former CEO of Dow Chemical, Mike Parker:

A lot of bad leadership comes from an inability or unwillingness to ask questions. I have watched talented people—people with much higher IQs than mine—who have failed as leaders. They can talk brilliantly, with a great breadth of knowledge, but they’re not very good at asking questions. So while they know a lot at a high level, they don’t know what’s going on way down in the system. Sometimes they are afraid of asking questions, but what they don’t realize is that the dumbest questions can be very powerful. They can unlock a conversation.

In what I think is the most important chapter of the book, “The Importance of Knowing,” Leslie highlights the importance of epistemic knowledge, and provides evidence to debunk some of the “twenty-first century” mindset. Leslie presents three misapprehensions about learning, common to the supporters of “curiosity-driven” education:

  • Children don’t need teachers to instruct them. Those who think the natural curiosity of children is stifled by pedagogical instruction overlook something fundamental about human nature—as a species, we have always depended on the epistemic endowment of our elders and ancestors. As Leslie writes, every scientist stands on the shoulders of giants; every artist works within or against a tradition. The unusually long period for which children are dependent on adults is a clue that humans are designed to learn from others, rather than merely through their own explorations. Traditional teaching—the transmission of information from adults to children—is highly effective when skillfully executed. Citing the research of John Hattie, Leslie notes that the three most powerful teacher factors (those that lead to student success) are feedback, quality of instruction, and direct instruction.
  • Facts kill creativity. At the most basic level, all of our new ideas are somehow linked to old ones. The more existing ideas you have in your head, the more varied and rich your novel combinations of them will be, and the greater your store of reference points and analogies. Per Leslie: “a fact is a particular class of idea about the world, and it can be put to work in a lot of different ways.” In this section, Leslie refers to Sir Ken Robinson’s famous 2006 TED talk on educational reform, “Do Schools Kill Creativity?,” and then proceeds to argue that Robinson’s claims about creativity are almost entirely baseless.
  • Schools should teach thinking skills instead of knowledge. Thinking skills grow organically out of specific knowledge of specific domains—that is, facts. The wider your knowledge, the more widely your intelligence can range and the more purchase it gets on new information. This is why the argument that schools ought to prioritize learning skills over knowledge makes no sense, argues Leslie: the very foundation for such skills is memorized knowledge. The more we know, the better we are at thinking.

On how knowledge gives curiosity its staying power, Leslie writes:

This is why curiosity, like other thinking skills, cannot be nurtured, or taught, in the abstract. Rather than being stifled by factual knowledge, it depends on it. Until a child has been taught the basic information she needs to start thinking more deeply about a particular subject, it’s hard to develop her initial (diversive) curiosity into enduring (epistemic) curiosity, to get her to the stage where she is hungry for more knowledge…Sir Ken Robinson has it precisely the wrong way around when he says that the natural appetite for learning begins to dissipate once children start to be educated. The curiosity of children dissipates when it doesn’t get fed by knowledge, imparted by parents and teachers.

In short, background knowledge is vital to kindling curiosity. From personal experience, I happen to think there is also a positive feedback loop in place: the more you know, the more curious you become, and the more knowledgeable you become over time, because you seek more knowledge through your curiosity.

In the last part of the book, Leslie outlines seven ways to stay curious. They are as follows:

  1. Stay foolish. Echoing Steve Jobs’s memorable commencement address, in which Jobs advised Stanford graduates to “Stay hungry, stay foolish,” Ian Leslie points out how Jobs’s curiosity was crucial to his ability to invent and reinvent the businesses in which he was involved (Apple, Pixar).
  2. Build the database. The idea behind this premise is that any project or task that requires deep creative thought will be better addressed by someone who has deep knowledge of the task at hand and general background knowledge of the culture in which it and its users (or readers, or viewers) live. Leslie writes:

    Highly curious people, who have carefully cultivated their long-term memories, live in a kind of augmented reality; everything they see is overlaid with additional layers of meaning and possibility, unavailable to ordinary observers.

  3. Forage like a foxhog. In the words of the Greek poet Archilochus: “The fox knows many things, but the hedgehog knows one big thing.” The fox evades predators via a variety of techniques, while the hedgehog relies on one trusted technique (hunkering down and using its spikes to thwart a predator). The thinkers best positioned to thrive today and in the future are likely a hybrid of the fox and the hedgehog: the foxhog. You need to be specialized in one or two subject areas (to be what is known as an SME, or subject matter expert) but also be a voracious consumer of knowledge from other fields. In short, combine breadth and depth in your skill set.
  4. Ask the big why. In a useful anecdote from the book Negotiation Genius by Harvard Business School professors Deepak Malhotra and Max H. Bazerman, Leslie points out how asking “why” is a critical component of the negotiation process. If two parties negotiate from their preagreed positions, the negotiation becomes a trade-off where one side necessarily loses with respect to the other, which gains. The key, then, is to try to understand what’s motivating the other party’s interests, and this involves asking the probing, penetrating questions that can be summarized with the why. There is an interesting diversion in this point on the Big Data movement. One of its proponents, Chris Anderson (formerly editor of Wired), has made the extreme case for asking the Big What instead of the Big Why: with enough data, the premise goes, you can glean behavior from the patterns that are observed. But I don’t think it’s that simple. In fact, the more data you collect, the more likely you are to start forming false narratives (Nassim Nicholas Taleb makes a great point of this in his excellent book Antifragile). When we have a lot of data to work with, we get things like spurious correlations.
  5. Be a thinkerer. A portmanteau of “think” and “tinker,” the exact origin of the verb “to thinker” is unknown. Leslie mentions that he was introduced to the term by Paola Antonelli of the Museum of Modern Art in New York City, who traced it to a 2007 presentation given by John Seely Brown (formerly the director of the Xerox Palo Alto Research Center). The idea is enunciated well by Peter Thiel:

    A fundamental challenge—in business as in life—is to integrate the micro and macro such that all things make sense. Humanities majors may well learn a great deal about the world. But they don’t really learn career skills through their studies. Engineering majors, conversely, learn in great technical detail. But they might not learn why, how, or where they should apply their skills in the workforce. The best students, workers, and thinkers will integrate these questions into a cohesive narrative.

  6. Question your teaspoons. The idea is to become aware of, and curious about, your daily surroundings. Parking garage roofs, hand dryers, milk, paint catalogs, and bus routes: they sound mundane, but if you dig deeper, you can find out how complex and intricate they really are. This is what led James Ward to found the Boring Conference (which is a lot more interesting than it sounds!). Leslie points out a good example: Laura McInerney, who used to work at McDonald’s. Her shift would begin with breaking four hundred eggs for the daily breakfast, a mind-numbing day-to-day ordeal. But then she started asking questions about how the proteins in an egg change as it is heated, reflecting on whether it was ethically right to take eggs from a chicken, and wondering whether the egg or the chicken came first.
  7. Turn puzzles into mysteries. The premise here is simple: a puzzle is something that commands our curiosity until we have solved it. A mystery, by contrast, never stops inviting further inquiry. The way to stay curious, then, is to be cognizant that behind every puzzle we come across in our daily lives, there may be an underlying mystery worth exploring.
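
On the spurious-correlations point in “Ask the big why” above: a quick simulation (my own toy example, not from the book) shows that the more unrelated data series you compare, the stronger the best “discovered” correlation tends to be. Pure noise starts to look like signal:

```python
import random

# Generate n_series of pure Gaussian noise and report the strongest
# correlation found between any pair of them.
def max_pairwise_corr(n_series, length=50, seed=1):
    rng = random.Random(seed)
    series = [[rng.gauss(0, 1) for _ in range(length)] for _ in range(n_series)]

    def corr(x, y):
        # Pearson correlation coefficient, computed from scratch.
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    return max(abs(corr(series[i], series[j]))
               for i in range(n_series)
               for j in range(i + 1, n_series))

print(max_pairwise_corr(5))    # a handful of series: modest best correlation
print(max_pairwise_corr(60))   # many series: some pair "correlates" strongly
```

None of these series have anything to do with each other, yet a big enough pile of them will always yield a pair that looks meaningfully related, which is exactly how false narratives get started.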

In the Afterword of Curious, Leslie highlights one of my all-time favourite commencement speeches, the one David Foster Wallace gave to the graduating class of 2005. In it, Wallace argues that we are inherently self-centered (because the world we experience is in front of and behind us, above and below us, and it is immediate). It is only through the exercise of our curiosity about others that we can free ourselves from our hard-wired self-obsession. We should be curious about others not just because it is virtuous, but because it is also a coping mechanism for the routine, petty frustrations of day-to-day life.

The really important kind of freedom involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day.

That is real freedom. That is being educated, and understanding how to think. The alternative is unconsciousness, the default setting, the rat race, the constant gnawing sense of having had, and lost, some infinite thing.

Ian Leslie’s Curious: The Desire to Know and Why Your Future Depends on It is a well-researched book that cites a number of relevant scientific studies, frames concepts related to knowledge and curiosity with interesting anecdotes, and has a solid bibliography for curious readers to dive into after finishing Curious.

I highly recommend the book. It is available on Amazon (hardcover or for the Kindle) or your favourite bookseller beginning today, August 26, 2014. 

The Incredible Story of How the iPhone Came to Be

The best thing I’ve read today is this fascinating New York Times Magazine piece on how the iPhone was developed. From concept to prototype to Steve Jobs’s unveiling of the revolutionary device, this piece has it all. It is so much better than the section devoted to the iPhone in Walter Isaacson’s biography of Steve Jobs. Absolutely a must-read.

On how incredibly secretive Steve Jobs tried to keep the announcement of the iPhone:

Jobs was so obsessed with leaks that he tried to have all the contractors Apple hired — from people manning booths and doing demos to those responsible for lighting and sound — sleep in the building the night before his presentation. Aides talked him out of it.

Now this is a great way to phrase it:

Ponder the individual impacts of the book, the newspaper, the telephone, the radio, the tape recorder, the camera, the video camera, the compass, the television, the VCR and the DVD, the personal computer, the cellphone, the video game and the iPod. The smartphone is all those things, and it fits in your pocket.

On the initial gamble of the iPhone and the buggy versions that existed at launch:

It’s hard to overstate the gamble Jobs took when he decided to unveil the iPhone back in January 2007. Not only was he introducing a new kind of phone — something Apple had never made before — he was doing so with a prototype that barely worked. Even though the iPhone wouldn’t go on sale for another six months, he wanted the world to want one right then. In truth, the list of things that still needed to be done was enormous. A production line had yet to be set up. Only about a hundred iPhones even existed, all of them of varying quality. Some had noticeable gaps between the screen and the plastic edge; others had scuff marks on the screen. And the software that ran the phone was full of bugs.

This bit about the compromises that Apple took to make the demo iPhone work is phenomenal:

The software in the iPhone’s Wi-Fi radio was so unstable that Grignon and his team had to extend the phones’ antennas by connecting them to wires running offstage so the wireless signal wouldn’t have to travel as far. And audience members had to be prevented from getting on the frequency being used. “Even if the base station’s ID was hidden” — that is, not showing up when laptops scanned for Wi-Fi signals — “you had 5,000 nerds in the audience,” Grignon says. “They would have figured out how to hack into the signal.” The solution, he says, was to tweak the AirPort software so that it seemed to be operating in Japan instead of the United States. Japanese Wi-Fi uses some frequencies that are not permitted in the U.S.

There were multiple versions of the iPhone built near launch:

Many executives and engineers, riding high from their success with the iPod, assumed a phone would be like building a small Macintosh. Instead, Apple designed and built not one but three different early versions of the iPhone in 2005 and 2006. One person who worked on the project thinks Apple then made six fully working prototypes of the device it ultimately sold — each with its own set of hardware, software and design tweaks. 

The first iPhone prototype in 2005 had a wheel (like the iPod):

From the start of the project, Jobs hoped that he would be able to develop a touch-screen iPhone running OS X similar to what he ended up unveiling. But in 2005 he had no idea how long that would take. So Apple’s first iPhone looked very much like the joke slide Jobs put up when introducing the real iPhone — an iPod with an old-fashioned rotary dial on it. The prototype really was an iPod with a phone radio that used the iPod click wheel as a dialer. “It was an easy way to get to market, but it was not cool like the devices we have today,” Grignon says.

On how stressful the environment was:

The pressure to meet Jobs’s deadlines was so intense that normal discussions quickly devolved into shouting matches. Exhausted engineers quit their jobs — then came back to work a few days later once they had slept a little. Forstall’s chief of staff, Kim Vorrath, once slammed her office door so hard it got stuck and locked her in, and co-workers took more than an hour to get her out. “We were all standing there watching it,” Grignon says. “Part of it was funny. But it was also one of those moments where you step back and realize how [expletive] it all is.”

And that ending to the piece? What a tear jerker. It put a huge smile on my face.

Seriously, you should read the whole thing.

Go Against Your Instinct

A friend of Dustin Curtis recently went into cardiac arrest during a session at the gym. The event forced Dustin to evaluate his reason for being. The post is excellent:

Humans are by default hopeful and optimistic creatures. We usually think about the future as though it will occur for us with absolute certainty, and that makes it hard to imagine death as a motivation for living. But knowing that my friend could potentially never wake up forced me, unexpectedly, to contemplate my personal drive for existence. Why do I do the things I do every day? Am I honestly acting out my dreams and aspirations? What’s my purpose? For a long time, when I was younger, I waited to discover my purpose. It was only very recently that I realized purpose is something you are supposed to create for yourself.

After my own comparatively minor brush with death a few years ago, when I was 22, I pledged to live my life as fully as possible, as though I had nothing to lose. For a few months afterward, I consciously tried to fight against the status quo. It’s so easy to get stuck in the waiting place, putting things off until later, even when those things are vitally important to making your dreams come true. But the truth is that, in order to make progress, you need to physically and mentally fight against the momentum of ordinary events. The default state of any new idea is failure. It’s the execution–the fight against inertia–that matters. You have to remember to go against your instinct, to confront the ordinary, and to put up a fight.

Does the statement below ring a bell for you?

It used to confuse and fascinate me how so many people with great dreams and great visions of the future can live such ordinary, repetitive lives. But now I know. I’ve experienced it. Doing something remarkable with your life is tough work, and it helps to remember one simple, motivating fact: in a blink, you could be gone.

Complement Dustin’s essay with Steve Jobs’s vision for the world.

Apple’s Tribute to Steve Jobs, One Year Later

Apple.com has a beautiful tribute to Steve Jobs, who died one year ago today. Click on the screenshot below to watch the video.

Remembering Steve Jobs, one year after his death.

Here is what I wrote one year ago today after I learned of Steve’s passing. Here are all the Steve Jobs posts on this blog. Here is the video on YouTube (unless it gets pulled).

Apple after Steve Jobs

In an interview with Wall Street Journal columnists Kara Swisher and Walt Mossberg, Apple CEO Tim Cook offers a few answers on Apple after Steve Jobs:

Q: How is Apple different with you as the CEO? What did you learn from Steve?

Cook: I learned that focus is key, not just in running a company but in your personal life. You can only do so many things great, and you should cast aside everything else. Another thing that Steve taught us all was to not focus on the past. If you’ve done something great or terrible, forget it and go on and create the next thing. When I say that I’m not going to witness or permit the change, I’m talking about the thing that’s most important in Apple—the culture of Apple. Am I going to change anything? Of course.

Q: At any one time, there is only one new iPhone. That’s not the way you did it with the iPod; that’s not the way you did it with the Mac. Why don’t you have more than one iPhone, and why don’t you have more than one iPad?

Cook: Our North Star is to make the best product. Our objective isn’t to make this design for this kind of price point or make this design for this arbitrary schedule or line up other things or have X number of phones. I think one of our advantages is that we’re not fragmented. We have one app store, so you know what app store to go to. We have one phone with one screen size with one resolution, so it’s pretty simple if you’re a developer developing for this platform.

Perhaps the most succinct point that Cook tries to make in the interview: Apple is still about making great products. It’s not about becoming a trillion dollar company. By making great products, other good things will follow.

Money Can’t Buy Taste

Marco Arment offers an excellent rebuttal to this Seeking Alpha article about Apple’s eventual downfall. Marco has two major points: time and taste. This was my favorite part of his argument:

Most people don’t have great taste. (And they don’t care, so it doesn’t matter to them.) They usually like tasteful, well-designed products, but often don’t recognize why, or care more about other factors when making buying decisions.

People who naturally recognize tasteful, well-designed products are a small subset of the population. But people who can create them are a much smaller subset.

Taste in product creation overlaps a lot with design: doing it well requires it to be valued, rewarded, and embedded in the company’s culture and upper leadership. If it’s not, great taste can’t guide product decisions, and great designers leave.

No amount of money, and no small amount of time, can buy taste.

Spot on.

How to Become Creative

In the Saturday essay in The Wall Street Journal, Jonah Lehrer writes about the creative process. He argues that creativity is not something that is passed in the genes; it is something that requires practice. We can work to become more creative.

This ability to calculate progress is an important part of the creative process. When we don’t feel that we’re getting closer to the answer—we’ve hit the wall, so to speak—we probably need an insight. If there is no feeling of knowing, the most productive thing we can do is forget about work for a while. But when those feelings of knowing are telling us that we’re getting close, we need to keep on struggling.

Of course, both moment-of-insight problems and nose-to-the-grindstone problems assume that we have the answers to the creative problems we’re trying to solve somewhere in our heads. They’re both just a matter of getting those answers out. Another kind of creative problem, though, is when you don’t have the right kind of raw material kicking around in your head. If you’re trying to be more creative, one of the most important things you can do is increase the volume and diversity of the information to which you are exposed.

Steve Jobs famously declared that “creativity is just connecting things.” Although we think of inventors as dreaming up breakthroughs out of thin air, Mr. Jobs was pointing out that even the most far-fetched concepts are usually just new combinations of stuff that already exists. Under Mr. Jobs’s leadership, for instance, Apple didn’t invent MP3 players or tablet computers—the company just made them better, adding design features that were new to the product category.

And it isn’t just Apple. The history of innovation bears out Mr. Jobs’s theory. The Wright Brothers transferred their background as bicycle manufacturers to the invention of the airplane; their first flying craft was, in many respects, just a bicycle with wings. Johannes Gutenberg transformed his knowledge of wine presses into a printing machine capable of mass-producing words. Or look at Google: Larry Page and Sergey Brin came up with their famous search algorithm by applying the ranking method used for academic articles (more citations equals more influence) to the sprawl of the Internet.
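
The citation-style ranking idea behind Page and Brin’s algorithm can be sketched in a few lines of power iteration. This is a toy simplification of PageRank (my own sketch; the link graph is invented):

```python
# Iteratively redistribute "reputation" along links: a page is important
# if important pages link to it, just as a paper is influential if
# influential papers cite it.
def pagerank(links, damping=0.85, iterations=100):
    pages = sorted(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p in pages:
            outs = links[p] or pages   # dangling pages spread rank everywhere
            for q in outs:
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Page "a" is linked to by nearly everyone, so it accumulates the most
# "citations" and ends up ranked highest.
links = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a", "b"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # -> a
```

The real algorithm handles billions of pages with sparse matrices and many refinements, but the core recombination Lehrer describes, academic citation counting applied to web links, really is this small an idea.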

Don’t miss the bottom of the post which provides ten ways to become more creative, which I summarize below. A lot of these have been tested in an artificial setting (think undergraduates in a lab), so take these with a grain of salt:

1. Surround yourself with the color blue.

2. Do creative things when you’re groggy.

3. Daydream more.

4. Think like a child — imagine what you would do as a five year old.

5. Laugh more.

6. Imagine that you are far away.

7. Keep it generic. When the verbs are extremely specific, people think in narrow terms. In contrast, the use of more generic verbs (say, “moving” instead of “driving”) can help us solve creative problems.

8. Don’t work in a cubicle!

9. See the world. Travel.

10. Move from a small city to a metropolis.

Tim Cook on the Apple Culture

Earlier this week, the CEO of Apple, Tim Cook, spoke at a conference put on by Goldman Sachs. For his final question during the Q&A session, Cook was asked how his leadership might change Apple, and what aspects of the culture he might try to preserve. Here’s what he had to say:

Apple is a unique culture and unique company. You can’t replicate it. I’m not going to witness or permit the slow undoing of it. I believe in it so deeply.

Steve grilled in all of us, over many years, that the company should revolve around great products. We should stay extremely focused on a few things, rather than try to do so many that we did nothing well. We should only go into markets where we can make a significant contribution to society, not just sell a lot of products.

These things, along with keeping excellence as an expectation of everything at Apple. These are the things that I focus on because I think those are the things that make Apple a magical place that really smart people want to work in and do, not just their life’s work, but their life’s best work.

And so we’re always focused on the future. We don’t sit and think about how great things were yesterday. I love that trait because I think it’s the thing that drives us all forward. Those are the things I’m holding onto. It’s a privilege to be a part of it.

###

(via Dustin Curtis; full audio here)

John Gruber’s Critique of Walter Isaacson’s Biography of Steve Jobs

John Gruber has an excellent critique of Walter Isaacson’s biography of Steve Jobs. I read the biography last year, but I couldn’t have made an informed critique like this:

There is much that is wrong with Walter Isaacson’s biography of Jobs, but its treatment of software is the most profound of the book’s flaws. Isaacson doesn’t merely neglect or underemphasize Jobs’s passion for software and design, but he flat-out paints the opposite picture.

Isaacson makes it seem as though Jobs was almost solely interested in hardware, and even there, only in what the hardware looked like. Superficial aesthetics.

In Chapter 26, “Design Principles: The Studio of Jobs and Ive”, Isaacson writes (p. 344 in the hardcover print edition):

“Before Steve came back, engineers would say ‘Here are the guts’ — processor, hard drive — and then it would go to the designers to put it in a box,” said Apple’s marketing chief Phil Schiller.
“When you do it that way, you come up with awful products.” But when Jobs returned and forged his bond with Ive, the balance was again tilted toward the designers. “Steve kept impressing on us that the design was integral to what would make us great,” said Schiller. “Design once again dictated the engineering, not just vice versa.”

On occasion this could backfire, such as when Jobs and Ive insisted on using a solid piece of brushed aluminum for the edge of the iPhone 4 even when the engineers worried that it would compromise the antenna. But usually the distinctiveness of its designs — for the iMac, the iPod, the iPhone, and the iPad — would set Apple apart and lead to its triumphs in the years after Jobs returned.

Isaacson clearly believes that design is merely how a product looks and feels, and that “engineering” is how it actually works.

Jobs, in an interview with Rob Walker for his terrific 2003 New York Times Magazine profile on the creation of the iPod, said:

“Most people make the mistake of thinking design is what it looks like. People think it’s this veneer — that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.”

That quote is absent from Isaacson’s book, despite the book’s frequent use of existing source material.

The entire post is worth reading, especially if you’ve read Steve Jobs.

The FBI File on Steve Jobs

Late last year, Michael Morisy contacted the FBI with a request under the Freedom of Information Act to obtain the FBI file on Steve Jobs, former CEO of Apple. His full correspondence appears here.

“Several individuals questioned Mr. Jobs’ honesty stating that Mr. Jobs will twist the truth and distort reality in order to achieve his goals,” according to the report released today by the FBI.

The FBI interviewed Jobs and people who knew him as part of a background check for a possible appointment by former President George H. W. Bush. Interviews were conducted with unnamed associates of Jobs to judge his character, drug use and potential prejudices, according to the file. Near the end, there is a mention of a bomb threat.

The FBI report on Steve Jobs is decades old, and a large portion of the material is redacted, but it still makes for an interesting read. The full file is here.