On Slowing Down

Some startling statistics about our obsession with technology:

The average American spends at least eight and a half hours a day in front of a screen, Nicholas Carr notes in his eye-opening book “The Shallows,” in part because the number of hours American adults spent online doubled between 2005 and 2009 (and the number of hours spent in front of a TV screen, often simultaneously, is also steadily increasing).

The average American teenager sends or receives 75 text messages a day, though one girl in Sacramento managed to handle an average of 10,000 every 24 hours for a month. Since luxury, as any economist will tell you, is a function of scarcity, the children of tomorrow, I heard myself tell the marketers in Singapore, will crave nothing more than freedom, if only for a short while, from all the blinking machines, streaming videos and scrolling headlines that leave them feeling empty and too full all at once.

That’s from the op-ed “The Joy of Quiet” by Pico Iyer, who also notes that some hotels cite the lack of internet and television access as a selling point:

I noticed that those who part with $2,285 a night to stay in a cliff-top room at the Post Ranch Inn in Big Sur pay partly for the privilege of not having a TV in their rooms; the future of travel, I’m reliably told, lies in “black-hole resorts,” which charge high prices precisely because you can’t get online in their rooms.

In barely one generation we’ve moved from exulting in the time-saving devices that have so expanded our lives to trying to get away from them — often in order to make more time. The more ways we have to connect, the more many of us seem desperate to unplug. Like teenagers, we appear to have gone from knowing nothing about the world to knowing too much all but overnight.

In 2011, I had the chance to unwind and go internet-free for a few days, on at least several separate occasions. One of my resolutions for the coming year is to have more days where I unwind and slow down.

Predicting the Future of Computing

The New York Times has a fascinating interactive post where you, the reader, can submit snippets on what you expect to happen in the field of computing in the near (and distant) future.

Here are some of the predictions as of this viewing (readers can vote each prediction up or down in terms of when they think the event will happen):

2015: The price and availability of computers will be such that more than half of the world’s people will have one.

2017: Mobile web browsing on phones surpasses desktop browsing.

2019: Irrefutable evidence will show that computers consistently make more accurate diagnoses than specialists in all branches of medicine, including psychiatry.

2023: The most common forms of cancer will be treated with a personalized therapy based on genetic sequencing. A patient’s therapy will be retargeted every six months as a result of resequencing the cancer to track its inevitable evolution.

2029: Your entire medical history from birth till death will be collectively combined in one universal system and available to all your different doctors.

2035: Most people will own and use a Personal Life Recorder which will store full video and audio of their daily lives. This will be a fully searchable archive that will radically augment a person’s effective memory.

2041: Cash will become illegal, replaced with electronic currency. [Bizarre prediction!]

2071: Humans will be able to implant their dogs’ brains with a neurological device such that the images of what the dog is thinking appear on a special contact lens the dog owner is wearing.

2106: Medical advances will permit the first human to live for a period of 200 years or more.

2154: Humans will become so integrated with electronics that more people will die from computer viruses in a year than from biological viruses.

2180: Old knowledge will not have to be learned; only new knowledge will need to be created. Learning will become obsolete. All known knowledge will be contained on a supercomputer. Individuals can download all known knowledge pertaining to any subject directly to the brain as desired.

2225: Artificial Intelligence is awarded full citizenship.

2283: Abundance happens. Digital and physical sciences produce abundance so great that wealth becomes meaningless as a difference between people.

2416: Thought-based communication surpasses spoken and typed communication.

What do you think is the coolest prediction from above? Which one is unlikely to happen in the next 400 years?

If you have a New York Times account, you can submit your own predictions. You can vote up/down predictions without an account. It’s a fascinating experiment!

Peter Thiel on Technology, Science, Politics

Peter Thiel, a co-founder of PayPal, in his piece “The End of the Future,” offers excellent food for thought regarding technology, science, innovation, politics, and the economy.

The state of true science is the key to knowing whether something is truly rotten in the United States. But any such assessment encounters an immediate and almost insuperable challenge. Who can speak about the true health of the ever-expanding universe of human knowledge, given how complex, esoteric, and specialized the many scientific and technological fields have become? When any given field takes half a lifetime of study to master, who can compare and contrast and properly weight the rate of progress in nanotechnology and cryptography and superstring theory and 610 other disciplines? Indeed, how do we even know whether the so-called scientists are not just lawmakers and politicians in disguise, as some conservatives suspect in fields as disparate as climate change, evolutionary biology, and embryonic-stem-cell research, and as I have come to suspect in almost all fields?

I’m not so sure about this statement. Nuclear engineering remains a strong major at Georgia Tech, for example:

One cannot in good conscience encourage an undergraduate in 2011 to study nuclear engineering as a career.

On the big pharmaceutical companies today:

In the next three years, the large pharmaceutical companies will lose approximately one-third of their current revenue stream as patents expire, so, in a perverse yet understandable response, they have begun the wholesale liquidation of the research departments that have borne so little fruit in the last decade and a half.

I think this is Thiel’s most important point in the piece. Read it carefully:

If meaningful scientific and technological progress occurs, then we reasonably would expect greater economic prosperity (though this may be offset by other factors). And also in reverse: If economic gains, as measured by certain key indicators, have been limited or nonexistent, then perhaps so has scientific and technological progress. Therefore, to the extent that economic growth is easier to quantify than scientific or technological progress, economic numbers will contain indirect but important clues to our larger investigation.

The single most important economic development in recent times has been the broad stagnation of real wages and incomes since 1973, the year when oil prices quadrupled. To a first approximation, the progress in computers and the failure in energy appear to have roughly canceled each other out. Like Alice in the Red Queen’s race, we (and our computers) have been forced to run faster and faster to stay in the same place.

One interesting anecdote, in which Thiel quotes from the 1967 bestseller The American Challenge by Jean-Jacques Servan-Schreiber:

In 30 years America will be a post-industrial society. . . . There will be only four work days a week of seven hours per day. The year will be comprised of 39 work weeks and 13 weeks of vacation. With weekends and holidays this makes 147 work days a year and 218 free days a year. All this within a single generation.
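As a quick aside — and this is my own back-of-the-envelope arithmetic, not something from the book or from Thiel’s piece — the quoted figures nearly, but not exactly, add up: 4 work days over 39 weeks gives 156 work days, so the quoted 147 presumably nets out a handful of public holidays, while 147 + 218 does cover a full 365-day year.

```python
# My own back-of-the-envelope check of the 1967 figures (not from the book or Thiel's essay).
work_days_per_week = 4
work_weeks = 39

raw_work_days = work_days_per_week * work_weeks   # 4 * 39 = 156
quoted_work_days = 147
quoted_free_days = 218

print(raw_work_days)                              # 156
print(quoted_work_days + quoted_free_days)        # 365 -- the quoted split covers a full year
print(raw_work_days - quoted_work_days)           # 9 -- roughly a year's worth of public holidays
```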

And what does Thiel really think of John Maynard Keynes?

The most common name for a misplaced emphasis on macroeconomic policy is “Keynesianism.” Despite his brilliance, John Maynard Keynes was always a bit of a fraud, and there is always a bit of clever trickery in massive fiscal stimulus and the related printing of paper money. 

And I strongly agree with Thiel here. It’s a shame how science and engineering get passed over by our politicians:

Most of our political leaders are not engineers or scientists and do not listen to engineers or scientists. Today a letter from Einstein would get lost in the White House mail room, and the Manhattan Project would not even get started; it certainly could never be completed in three years. I am not aware of a single political leader in the U.S., either Democrat or Republican, who would cut health-care spending in order to free up money for biotechnology research — or, more generally, who would make serious cuts to the welfare state in order to free up serious money for major engineering projects.

Where will the United States be in a year? In five years? In ten?

###
(via Tyler Cowen)

Technology’s Gang of Four

From The London Review of Books, we have this gem:

This spring, the billionaire Eric Schmidt announced that there were only four really significant technology companies: Apple, Amazon, Facebook and Google, the company he had until recently been running. People believed him. What distinguished his new ‘gang of four’ from the generation it had superseded – companies like Intel, Microsoft, Dell and Cisco, which mostly exist to sell gizmos and gadgets and innumerable hours of expensive support services to corporate clients – was that the newcomers sold their products and services to ordinary people. Since there are more ordinary people in the world than there are businesses, and since there’s nothing that ordinary people don’t want or need, or can’t be persuaded they want or need when it flashes up alluringly on their screens, the money to be made from them is virtually limitless…

An interesting analogy, carrying the label of the original Gang of Four over to technology companies. Do you agree?

The Origin of Cyberspace

William Gibson, the author of Neuromancer, is credited with coining the term cyberspace.

In a recent issue of The Paris Review, he reveals how he came up with the term:

Gibson: I was walking around Vancouver, aware of that need, and I remember walking past a video arcade, which was a new sort of business at that time, and seeing kids playing those old-fashioned console-style plywood video games. The games had a very primitive graphic representation of space and perspective. Some of them didn’t even have perspective but were yearning toward perspective and dimensionality. Even in this very primitive form, the kids who were playing them were so physically involved, it seemed to me that what they wanted was to be inside the games, within the notional space of the machine. The real world had disappeared for them—it had completely lost its importance. They were in that notional space, and the machine in front of them was the brave new world.

The only computers I’d ever seen in those days were things the size of the side of a barn. And then one day, I walked by a bus stop and there was an Apple poster. The poster was a photograph of a businessman’s jacketed, neatly cuffed arm holding a life-size representation of a real-life computer that was not much bigger than a laptop is today. Everyone is going to have one of these, I thought, and everyone is going to want to live inside them. And somehow I knew that the notional space behind all of the computer screens would be one single universe.

Interviewer: And you knew at that point you had your arena?

Gibson: I sensed that it would more than meet my requirements, and I knew that there were all sorts of things I could do there that I hadn’t even been able to imagine yet. But what was more important at that point, in terms of my practical needs, was to name it something cool, because it was never going to work unless it had a really good name. So the first thing I did was sit down with a yellow pad and a Sharpie and start scribbling—infospace, dataspace. I think I got cyberspace on the third try, and I thought, Oh, that’s a really weird word. I liked the way it felt in the mouth—I thought it sounded like it meant something while still being essentially hollow… I made up a whole bunch of things that happened in cyberspace, or what you could call cyberspace, and so I filled in my empty neologism. But because the world came along with its real cyberspace, very little of that stuff lasted. What lasted was the neologism.

I love how Gibson describes the moment of insight: the way the word felt in his mouth, as though he were synesthetic.

###
Hat tip for this post: Paul Kedrosky.

Related: how Haruki Murakami became a writer