The 2014 Edge Question: What Scientific Idea is Ready for Retirement?

Every year since 1998, Edge.org editor John Brockman has posed one thought-provoking question to some of the world’s greatest thinkers across a variety of disciplines, then compiled the responses into an annual anthology. Last year, he published This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works, which collects a number of these responses in a single volume.

For 2014, the annual Edge.org question is: What Scientific Idea is Ready for Retirement? I’ll be reading responses for the next few weeks, but for now, I wanted to link to the main page and highlight a few notable responses:

1) Nassim Taleb, one of my all-time favourite thinkers and authors, who argues for throwing out standard deviation as a measure (I’ve reproduced his arithmetic in code after the excerpt):

The notion of standard deviation has confused hordes of scientists; it is time to retire it from common use and replace it with the more effective one of mean deviation. Standard deviation, STD, should be left to mathematicians, physicists and mathematical statisticians deriving limit theorems. There is no scientific reason to use it in statistical investigations in the age of the computer, as it does more harm than good—particularly with the growing class of people in social science mechanistically applying statistical tools to scientific problems.

Say someone just asked you to measure the “average daily variations” for the temperature of your town (or for the stock price of a company, or the blood pressure of your uncle) over the past five days. The five changes are: (-23, 7, -3, 20, -1). How do you do it?

Do you take every observation: square it, average the total, then take the square root? Or do you remove the sign and calculate the average? For there are serious differences between the two methods. The first produces an average of 15.7, the second 10.8. The first is technically called the root mean square deviation. The second is the mean absolute deviation, MAD. It corresponds to “real life” much better than the first—and to reality. In fact, whenever people make decisions after being supplied with the standard deviation number, they act as if it were the expected mean deviation.

It is all due to a historical accident: in 1893, the great Karl Pearson introduced the term “standard deviation” for what had been known as “root mean square error”. The confusion started then: people thought it meant mean deviation. The idea stuck: every time a newspaper has attempted to clarify the concept of market “volatility”, it defined it verbally as mean deviation yet produced the numerical measure of the (higher) standard deviation.

But it is not just journalists who fall for the mistake: I recall seeing official documents from the department of commerce and the Federal Reserve partaking of the conflation, even regulators in statements on market volatility. What is worse, Goldstein and I found that a high number of data scientists (many with PhDs) also get confused in real life.
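
Reproducing Taleb’s two numbers is instructive, and it surfaces a wrinkle: a plain average of the squared changes gives about 14.1, so the 15.7 in the excerpt corresponds to the n − 1 “sample” convention that statistical software typically applies. A minimal sketch in Python, treating the five changes as deviations from zero:

```python
import math

# Taleb's five daily changes, treated as deviations from zero
changes = [-23, 7, -3, 20, -1]
n = len(changes)

# Mean absolute deviation (MAD): drop the sign, then average
mad = sum(abs(x) for x in changes) / n
print(mad)  # 10.8

# Root mean square deviation with the n - 1 "sample" convention,
# which reproduces the excerpt's 15.7; dividing by n instead
# gives about 14.1
std = math.sqrt(sum(x ** 2 for x in changes) / (n - 1))
print(round(std, 1))  # 15.7
```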

2) Jay Rosen, who argues we should retire the concept of “information overload” (I’ve added a small entropy sketch after the excerpt):

Here’s the best definition of information that I know of: information is a measure of uncertainty reduced. It’s deceptively simple. In order to have information, you need two things: an uncertainty that matters to us (we’re having a picnic tomorrow, will it rain?) and something that resolves it (weather report.) But some reports create the uncertainty that is later to be solved.

Suppose we learn from news reports that the National Security Agency “broke” encryption on the Internet. That’s information! It reduces uncertainty about how far the U.S. government was willing to go. (All the way.) But the same report increases uncertainty about whether there will continue to be a single Internet, setting us up for more information when that larger picture becomes clearer. So information is a measure of uncertainty reduced, but also of uncertainty created. Which is probably what we mean when we say: “well, that raises more questions than it answers.”
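
Rosen’s definition is, at bottom, Claude Shannon’s: a message carries information to the extent that it shrinks the entropy of what we believe. As a minimal sketch of that bookkeeping (the rain probabilities below are invented for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical picnic example: before the weather report, rain feels
# like a coin flip; the report leaves us 95% sure it will stay dry.
before = entropy([0.5, 0.5])   # 1.0 bit of uncertainty
after = entropy([0.95, 0.05])  # ~0.29 bits remain

print(before - after)  # ~0.71 bits of information gained
```

Rosen’s twist is that the same report can raise the entropy of a larger question even as it lowers this one.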

3) Richard Dawkins thinks “essentialism” should be retired:

Essentialism—what I’ve called “the tyranny of the discontinuous mind”—stems from Plato, with his characteristically Greek geometer’s view of things. For Plato, a circle, or a right triangle, were ideal forms, definable mathematically but never realised in practice. A circle drawn in the sand was an imperfect approximation to the ideal Platonic circle hanging in some abstract space. That works for geometric shapes like circles, but essentialism has been applied to living things and Ernst Mayr blamed this for humanity’s late discovery of evolution—as late as the nineteenth century. If, like Aristotle, you treat all flesh-and-blood rabbits as imperfect approximations to an ideal Platonic rabbit, it won’t occur to you that rabbits might have evolved from a non-rabbit ancestor, and might evolve into a non-rabbit descendant. If you think, following the dictionary definition of essentialism, that the essence of rabbitness is “prior to” the existence of rabbits (whatever “prior to” might mean, and that’s a nonsense in itself) evolution is not an idea that will spring readily to your mind, and you may resist when somebody else suggests it.

Paleontologists will argue passionately about whether a particular fossil is, say, Australopithecus or Homo. But any evolutionist knows there must have existed individuals who were exactly intermediate. It’s essentialist folly to insist on the necessity of shoehorning your fossil into one genus or the other. There never was an Australopithecus mother who gave birth to a Homo child, for every child ever born belonged to the same species as its mother. The whole system of labelling species with discontinuous names is geared to a time slice, the present, in which ancestors have been conveniently expunged from our awareness (and “ring species” tactfully ignored). If by some miracle every ancestor were preserved as a fossil, discontinuous naming would be impossible. Creationists are misguidedly fond of citing “gaps” as embarrassing for evolutionists, but gaps are a fortuitous boon for taxonomists who, with good reason, want to give species discrete names. Quarrelling about whether a fossil is “really” Australopithecus or Homo is like quarrelling over whether George should be called “tall”. He’s five foot ten, doesn’t that tell you what you need to know?

4) Kevin Kelly, who argues that “fully random mutations” should be retired from thought (this is something I’ve known for a while, having taken a number of courses in molecular biology):

What is commonly called “random mutation” does not in fact occur in a mathematically random pattern. The process of genetic mutation is extremely complex, with multiple pathways, involving more than one system. Current research suggests most spontaneous mutations occur as errors in the repair process for damaged DNA. Neither the damage nor the errors in repair have been shown to be random in where they occur, how they occur, or when they occur. Rather, the idea that mutations are random is simply a widely held assumption by non-specialists and even many teachers of biology. There is no direct evidence for it.

On the contrary, there’s much evidence that genetic mutations vary in patterns. For instance, it is pretty much accepted that mutation rates increase or decrease as stress on the cells increases or decreases. These variable rates of mutation include mutations induced by stress from an organism’s predators and competition, as well as increased mutations brought on by environmental and epigenetic factors. Mutations have also been shown to have a higher chance of occurring near a place in DNA where mutations have already occurred, creating mutation hotspot clusters—a non-random pattern.

5) Ian Bogost, a professor at my alma mater, Georgia Tech, who thinks “science” should be retired:

Beyond encouraging people to see science as the only direction for human knowledge and absconding with the subject of materiality, the rhetoric of science also does a disservice to science itself. It makes science look simple, easy, and fun, when science is mostly complex, difficult, and monotonous.

A case in point: the popular Facebook page “I f*cking love science” posts quick-take variations on the “science of x” theme, mostly images and short descriptions of unfamiliar creatures like the pink fairy armadillo, or illustrated birthday wishes to famous scientists like Stephen Hawking. But as the science fiction writer John Skylar rightly insisted in a fiery takedown of the practice last year, most people don’t f*cking love science, they f*cking love photography—pretty images of fairy armadillos and renowned physicists. The pleasure derived from these pictures obviates the public’s need to understand how science actually gets done—slowly and methodically, with little acknowledgement and modest pay in unseen laboratories and research facilities.

The rhetoric of science has consequences. Things that have no particular relation to scientific practice must increasingly frame their work in scientific terms to earn any attention or support. The sociology of Internet use suddenly transformed into “web science.” Long accepted practices of statistical analysis have become “data science.” Thanks to shifting educational and research funding priorities, anything that can’t claim that it is a member of a STEM (science, technology, engineering, and math) field will be left out in the cold. Unfortunately, the rhetoric of science offers the most tactical response to such new challenges. Unless humanists reframe their work as “literary science,” they risk getting marginalized, defunded and forgotten.

When you’re selling ideas, you have to sell the ideas that will sell. But in a secular age in which the abstraction of “science” risks replacing all other abstractions, a watered-down, bland, homogeneous version of science is all that will remain if the rhetoric of science is allowed to prosper.

We need not choose between God and man, science and philosophy, interpretation and evidence. But ironically, in its quest to prove itself as the supreme form of secular knowledge, science has inadvertently elevated itself into a theology. Science is not a practice so much as it is an ideology. We don’t need to destroy science in order to bring it down to earth. But we do need to bring it down to earth again, and the first step in doing so is to abandon the rhetoric of science that has become its most popular devotional practice.

If you want to get smarter today, go here and spend a few hours reading through the contributions.

The New York Times Treatment of Bistro at Villard Michel Richard

Food critic Pete Wells at The New York Times has just come out with a scathing review of the Bistro at Villard Michel Richard, the fancy new restaurant at the newly renovated New York Palace in Midtown Manhattan. It’s worth reading in its entirety, but these two paragraphs are the best:

Think of everything that’s great about fried chicken. Now take it all away. In its place, right between dried-out strands of gray meat and a shell of fried bread crumbs, imagine a gummy white paste about a quarter-inch deep. This unidentifiable paste coats your mouth until you can’t perceive textures or flavors. It is like edible Novocain.

What Villard Michel Richard’s $28 fried chicken does to Southern cooking, its $40 veal cheek blanquette does to French. A classic blanquette is a gentle, reassuring white stew of sublimely tender veal. In this version, the veal cheeks had the dense, rubbery consistency of overcooked liver. Slithering around the meat was a terrifying sauce the color of jarred turkey gravy mixed with cigar ashes. If soldiers had killed Escoffier’s family in front of him and then forced him to make dinner, this is what he would have cooked.

Mmm, delicious.

The Secrets of Snow-Diving Foxes

This is a super interesting article by NPR’s Robert Krulwich, who summarizes research on why foxes dive into snow the way they do when hunting prey:

When they looked at each other’s notes, the researchers saw a pattern: For some reason, Czech foxes prefer to jump in a particular direction — toward the northeast. (To be more precise, it’s about 20 degrees off “magnetic north” — the “N” on your compass.) As the video above says, most of the time, most foxes miss their targets and emerge covered in snow and (one presumes) a little embarrassed. But when they pointed in that particular northeasterly direction, Ed writes, “they killed on 73 percent of their attacks.” If they reversed direction, and jumped exactly the opposite way, they killed 60 percent of the time. But in all other directions — east, south, west, whatever — they sucked. Only 18 percent of those jumps were successful.

Here’s a video of a hunting fox in action:

Statistical Stylometry: Quantifying Elements of Writing Style that Differentiate Successful Fiction

Can good writing be differentiated from bad writing through some kind of algorithm? Many have tried to answer this research question. The latest news in this realm comes from Stony Brook University, where a group of researchers:

…[T]ook 1000 sentences from the beginning of each book. They performed systematic analyses based on lexical and syntactic features that have been proven effective in Natural Language Processing (NLP) tasks such as authorship attribution, genre detection, gender identification, and native language detection.

“To the best of our knowledge, our work is the first that provides quantitative insights into the connection between the writing style and the success of literary works,” Choi says. “Previous work has attempted to gain insights into the ‘secret recipe’ of successful books. But most of these studies were qualitative, based on a dozen books, and focused primarily on high-level content—the personalities of protagonists and antagonists and the plots. Our work examines a considerably larger collection—800 books—over multiple genres, providing insights into lexical, syntactic, and discourse patterns that characterize the writing styles commonly shared among the successful literature.”

I had no idea there was a name for this kind of research. Statistical stylometry is the statistical analysis of variations in literary style between one writer or genre and another. This study reports, for the first time, that the discipline can be effective in distinguishing highly successful literature from its less successful counterpart, achieving accuracy rates as high as 84%.
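
As a rough illustration of what such a pipeline looks like, here is a sketch that extracts lexical features from each book’s opening text and trains a linear classifier on success labels. The corpus, labels, and unigram features are stand-ins of my own, not the Stony Brook setup, which used richer lexical and syntactic features:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical toy corpus: the opening text of each book, labeled
# 1 for commercially successful and 0 for not
openings = [
    "She remembered the harbor at dawn and recognized the ship...",
    "He wanted to win, took the prize, and cried with joy...",
]
labels = [1, 0]

model = make_pipeline(
    # Word unigrams stand in for the study's feature set (which also
    # included part-of-speech and parse-rule statistics)
    TfidfVectorizer(lowercase=True),
    LogisticRegression(),
)
model.fit(openings, labels)
print(model.predict(["They remembered and recognized each other..."]))
```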

The best book on writing that I’ve read is Stephen King’s On Writing, and the researchers’ findings echo his advice about description:

[T]he less successful books also rely on verbs that explicitly describe actions and emotions (“wanted”, “took”, “promised”, “cried”, “cheered”), while more successful books favor verbs that describe thought-processing (“recognized”, “remembered”) and verbs that simply serve the purpose of quotes (“say”).

The Reader/Author Transaction Model

A very interesting reading tip from Bret Victor, found in the sidebar of this page:

Carver Mead describes a physical theory in which atoms exchange energy by resonating with each other. Before the energy transaction can happen, the two atoms must be phase-matched, oscillating in almost perfect synchrony with each other.

I sometimes think about resonant transactions as a metaphor for getting something out of a piece of writing. Before the material can resonate, before energy can be exchanged between the author and reader, the reader must already have available a mode of vibration at the author’s frequency. (This doesn’t mean that the reader is already thinking the author’s thought; it means the reader is capable of thinking it.)

People often describe written communication in terms of transmission (the author explained the concept well, or poorly) and/or absorption (the reader does or doesn’t have the background or skill to understand the concept). But I think of it more like a transaction — the author and the reader must be matched with each other. The author and reader must share a close-enough worldview, viewpoint, vocabulary, set of mental models, sense of aesthetics, and set of goals. For any particular concept in the material, if not enough of these are sufficiently matched, no resonance will occur and no energy will be exchanged.

Perhaps, as a reader, one way to get more out of more material is to collect and cultivate a diverse set of resonators, to increase the probability of a phase-match.

My brief thought: I wouldn’t be so pessimistic as to say that no energy is ever exchanged in a reader-author relationship. I think there’s always a chance to learn something new.

###

(via Jason Kottke)

The Photography of Adam Magyar

In a long piece titled “Einstein’s Camera,” Joshua Hammer profiles the photography of Adam Magyar:

In a growing body of photographic and video art done over the past decade, Magyar bends conventional representations of time and space, stretching milliseconds into minutes, freezing moments with a resolution that the naked eye could never have perceived. His art evokes such variegated sources as Albert Einstein, Zen Buddhism, even the 1960s TV series The Twilight Zone. The images—sleek silver subway cars, solemn commuters lost in private worlds—are beautiful and elegant, but also produce feelings of disquiet. “These moments I capture are meaningless, there is no story in them, and if you can catch the core, the essence of being, you capture probably everything,” Magyar says in one of the many cryptic comments about his work that reflect both their hypnotic appeal and their elusiveness. There is a sense of stepping into a different dimension, of inhabiting a space between stillness and movement, a time-warp world where the rules of physics don’t apply.

Magyar’s work represents a fruitful cross-fertilization of technology and art, two disciplines—one objective and mathematical, the other entirely subjective—that weren’t always regarded as harmonious or compatible. Yet the two are intertwined, and breakthroughs in technology have often made new forms of art possible. Five thousand years ago, Egyptian technicians heated desert sand, limestone, potash, and copper carbonate in kilns to make a synthetic pigment known as “Egyptian blue,” which contributed to the highly realistic yet stylized portraiture of the Second and Third Dynasties.


Fascinating profile. Definitely worth clicking over to see Adam’s photography, especially the Stainless photo and video series.

Why Everyone Seems to Have Cancer

A thoughtful take by George Johnson in The New York Times comparing heart disease and cancer:

Half a century ago, the story goes, a person was far more likely to die from heart disease. Now cancer is on the verge of overtaking it as the No. 1 cause of death.

Troubling as this sounds, the comparison is unfair. Cancer is, by far, the harder problem — a condition deeply ingrained in the nature of evolution and multicellular life. Given that obstacle, cancer researchers are fighting and even winning smaller battles: reducing the death toll from childhood cancers and preventing — and sometimes curing — cancers that strike people in their prime. But when it comes to diseases of the elderly, there can be no decisive victory. This is, in the end, a zero-sum game.

As people age their cells amass more potentially cancerous mutations. Given a long enough life, cancer will eventually kill you — unless you die first of something else. That would be true even in a world free from carcinogens and equipped with the most powerful medical technology.

The author is keen to point out that the future of medicine will be focused on prevention rather than treatment.

On Hard-Won Lessons in Your Single, Solitary Years

Happy New Year!

I’ve cast this blog aside for the first few days of the year, spending some vacation time in Florida and working on building new relationships. A lot of what I’ve been reading online seems to have come my way as something I was meant to read, confirming my beliefs and values. Perhaps the best example is this Modern Love story in The New York Times, on what being single for many years teaches you:

Being a single person searching for love teaches you that not everything is under your control. You can’t control whether the person you’ve fallen for will call. You can’t force yourself to have feelings for the nice guy your best friend fixed you up with. You have no way to know whether attending this or that event — a co-worker’s art opening, a neighbor’s housewarming — will lead to the chance encounter that will forever alter your life. You simply learn to do your best, and leave it at that.

Ringing endorsement here:

Relationships are work, but so is being single, and I became pretty good at it.

The perspective in the story comes from someone older than me, but I sympathize with this:

Most important, I’ve realized I never needed a long boyfriend résumé for the experience. In the 20 years before I met Mark, I learned a lot of hard lessons: how to be a self-respecting adult in a world that often treats single people like feckless teenagers; how to stand at cocktail parties while my friends’ in-laws asked me if I had a boyfriend; how to have warm, friendly dinners with strangers I had met online as we delicately tried to determine whether we could possibly share our lives together; and how to come home to an empty apartment after a rotten day at work.

A wonderful way to start 2014.

###

(hat tip: @jennydeluxe)

James Franco on the Importance of the Selfie

Oxford Dictionaries chose “selfie” as its word of the year for 2013. I haven’t been one to post selfies on my Instagram account, and I have done very few self-portraits on my (now defunct) photoblog. But when I read James Franco’s op-ed in The New York Times titled “The Meanings of the Selfie,” I mulled over what he had written and gained a renewed appreciation for the phenomenon. Franco writes:

But a well-stocked collection of selfies seems to get attention. And attention seems to be the name of the game when it comes to social networking. In this age of too much information at a click of a button, the power to attract viewers amid the sea of things to read and watch is power indeed. It’s what the movie studios want for their products, it’s what professional writers want for their work, it’s what newspapers want — hell, it’s what everyone wants: attention. Attention is power. And if you are someone people are interested in, then the selfie provides something very powerful, from the most privileged perspective possible.

The perspective here is misguided, however. His central premise is that we, as humans, must persistently seek some kind of validation for what we do. For Franco, apparently that comes from getting lots of comments and faves on his Instagram account.

Franco goes on to differentiate between the celebrity selfie and the non-celebrity selfie, and this is where his essay picks up some pace:

Now, while the celebrity selfie is most powerful as a pseudo-personal moment, the noncelebrity selfie is a chance for subjects to glam it up, to show off a special side of themselves — dressing up for a special occasion, or not dressing, which is a kind of preening that says, “There is something important about me that clothes hide, and I don’t want to hide.”

Of course, the self-portrait is an easy target for charges of self-involvement, but, in a visual culture, the selfie quickly and easily shows, not tells, how you’re feeling, where you are, what you’re doing.

But it was the way Franco ended the essay that really captured my attention:

I am actually turned off when I look at an account and don’t see any selfies, because I want to know whom I’m dealing with. In our age of social networking, the selfie is the new way to look someone right in the eye and say, “Hello, this is me.”

I am still not 100% in agreement: there are amazing photographers on Instagram who never post selfies. But I would agree that, between two people with similar followings on Instagram, the more revealing one, the one saying “Yes, this is who I am,” is the one posting those selfies.

With some luck, I will change my mind and actually start posting selfies in 2014.

Tess Vigeland on Her Remarkable Leap Year

Something I haven’t previously blogged about, but which has profoundly influenced how I think about the world around me, is Tess Vigeland. You see, her speech at the World Domination Summit had a theme that would make most people uncomfortable: leaving a cushy position at a major radio station to do…well, she didn’t know what she was going to do. But if you read the text of her speech, you’ll see that Tess explains that it felt okay. For anyone out there who is still searching for what they want out of life, it shouldn’t be this massive burden. It may be difficult to let go, but what we need to remember is that it will be okay, and perhaps even turn into the remarkable:

Why do I care what other people think? I KNOW I’m not supposed to care – but I do.

How do I get back to remarkable?

The ONLY way… is by redefining it.

And I think this is an exercise that’s going to take some time. We all know we’re not supposed to define ourselves, and our success, by money, by page views, by Twitter followers, by fan mail, by audience size. But if you have a job, it does define you in many ways. You spend a good chunk of your day at that job – whether that’s at home or in an office or out in the field. Your lifestyle is sometimes determined by how much that job compensates you. I’m on track right now to make one third of what I made last year. One third. I know that doesn’t define me… but it does contribute to how I see my own value. I like what money allows me to do in my life.

So I need to redefine what success means to me. I don’t know how to define that without an audience. I don’t know how to define that without strangers recognizing my voice in an elevator. If that sounds egotistical – well you don’t go into broadcasting without some amount of ego – it’s a performance, after all. And if I end up doing something that can’t or won’t feed it… how do I know if I’m succeeding? How do I know if I’m remarkable?

But I guess what I would tell you – wherever you are on your career timeline – wherever you are in your relationship with this thing you do for a living – is that you have to give yourself permission to grieve the end of something. And sometimes you have to work really, really hard to find what’s next. 

At the end of the year, in her most recent post, titled “My Leap Year,” Tess elaborates on how that fateful July 7 morning in Portland went, when she gave this talk to a crowd of 3,000 people at the World Domination Summit:

And here I was about to tell a bunch of strangers about my failure, about my self-doubt and recriminations, about my discomfort with uncertainty, about my sad lack of a life dream, about how I no longer knew who I was because I could no longer describe what I do.

Do not throw up. Do not throw up. Do not throw up.

I walked on stage and told the oldest joke in the radio book: “Hey! You all don’t look a thing like I thought you would!” When people find out you’re a radio person they know, it’s what they say. Guaranteed. So I told the joke. I know… it’s lame. A few people laughed. And I started to tell my story.

And I started to feel something in the room – in that huge performance hall. To this day, I can’t describe it. They were listening. Really listening. They laughed in places I didn’t expect them to laugh. They shouted out from the audience, answering my rhetorical questions with actual answers.

“Will anyone want to listen to me now that I’m not some famous national correspondent anymore?”

“YES!!! Woooooo!!!”

At one point in the speech I talked about the rollercoaster I’d been experiencing, the ups and downs of leaving a career to strike out alone, feeling successful one day and like a complete fraud and unmoored the next. They hopped right on that ride and joined me from one moment to the next. I was no longer afraid of throwing up. What I was afraid of was that I’d burst into tears right there on stage, because of this overwhelming sense of support, this indescribable empathy that I felt from the audience.

Thank you, Tess, for bringing such vulnerability to your life and sharing it with others. I, along with thousands of others, cannot wait to see how 2014 unfolds for you.