An Ultra-Orthodox Jew on Being Adrift, Searching for a Navigator

Writing in The New York Times, Leah Vincent reflects on her upbringing as one of eleven children in the ultra-Orthodox Jewish community and on finding romance on the Q train:

I had been raised in the ultra-Orthodox Jewish community. As a girl, my life had revolved around modesty, obedience and dreams of an arranged marriage at 18, followed by a dozen children. Popular movies were irrelevant and forbidden.

But it had been six years since my parents ostracized me for having written letters to a boy, wanting to go to college, complaining when my father, a prominent rabbi, used a slur to describe an African-American person, and wearing an immodest sweater that highlighted my curves. Six years of wrestling with my forbidden desires.

I had finally given up on God and decided I would allow myself to be liberated from the confines of my faith. Now I was catching up on what I had missed.

I loved this paragraph:

Luke was different from those random flings. We talked like old friends about books, Brooklyn, our lives. The next morning, when we hugged goodbye, it felt as if we had already shed a layer of defenses that usually took months to peel away. Our connection seemed deep and profound. On my way home, I let myself imagine what it might be like to wake up every morning with him. I tried on his last name. I wondered what kind of father he might be.

There’s a twist in the story that I wasn’t expecting, but I liked the optimism of her final sentence.

If you enjoyed the piece, you might want to purchase Vincent’s upcoming book: Cut Me Loose: Sin and Salvation After My Ultra-Orthodox Girlhood.

John Siracusa on Geekdom

This is a wonderful post by John Siracusa on how he became a geek. Near the end, he offers some sage advice:

You don’t have to be a geek about everything in your life—or anything, for that matter. But if geekdom is your goal, don’t let anyone tell you it’s unattainable. You don’t have to be there “from the beginning” (whatever that means). You don’t have to start when you’re a kid. You don’t need to be a member of a particular social class, race, sex, or gender.

Geekdom is not a club; it’s a destination, open to anyone who wants to put in the time and effort to travel there. And if someone lacks the opportunity to get there, we geeks should help in any way we can. Take a new friend to a meetup or convention. Donate your old games, movies, comics, and toys. Be welcoming. Sharing your enthusiasm is part of being a geek.

I actually classify myself as more of a nerd than a geek in the areas that interest me, but I can see how the broader label of “geek” fits someone who grew up with computers from such an early age.

Siracusa’s reviews are always a pleasure to read. I spent a few hours perusing his deep dive into OS X Mavericks last fall.

What Apps and Services Does Barack Obama Think Young People Use?

A young associate editor at The Atlantic, Robinson Meyer, reflects on how he accidentally met Barack Obama at a local cafe. What young people actually use (apps, services) seems to be in constant flux, but based on the subtitle of the piece, the takeaway seems to be that “even Obama knows young people don’t use Facebook anymore”:

Obama sat down at the head of the table. There was a brief photo op at the opposite end of the table. I surreptitiously took a picture to remember what being on the other side of a wall of cameras felt like, but now it seems more remarkable that I can see the president’s undershirt.

He had come to my local cafe to meet with five young people. According to White House background, provided to me after he left, they met to discuss how to get more 18-34 year-olds to sign up for the coverage under the Affordable Care Act. (The law depends on 18-34 year-olds signing up for healthcare.) One of the five was a navigator, someone employed to help families sign up; another helped explain the law at a mall over the holidays.

They talked about health care stuff for the first 20 minutes. The five shared their experiences, and some of them spoke quietly, so I couldn’t hear them that well.

At one point the president said, “Now, this isn’t public yet.” I perked up.

“Thirty percent of somethingsomethingsomething is mumblemumble,” he said.

I didn’t hear. I had failed as a journalist, so I went to the bathroom.

Failure

When I got back, they were talking about music. Circumstantial evidence indicates that, while I was in the bathroom, they talked about Beyoncé. 

The conversation moved on. They talked about cell phones, and Obama mentioned how Malia did not receive one until she was 16. One of the young people pointed out that, unlike most parents, the president could always argue that he’d know where she was.

They segued to talking about social media (I couldn’t hear their exact words). Now, I thought. Now I could do tech journalism.

The president said something—I could not hear all of it—about new social media apps that were for messaging, new apps that only somethingsomething’d for eight seconds.

“Snapchat,” said one of the young people.

The president made a comment about how different apps were now popular. Someone—it might have been the president—said the word “Instagram.” 

I guess that they were talking about the difficulty of doing political outreach on Snapchat or one of this newer, less textual ilk? I’m not sure. Then the president drops this:

“It seems like they don’t use Facebook anymore,” he said.

Facebook is so uncool even the president of the United States knows it.

I’ve been saying this for a while, but I’ve grown to dislike using Facebook over the last year or two. I prefer Twitter and Instagram.

The story is worth the click simply for that Snapchat photo at the end.

An Obituary for Mae Young, Unladylike Wrestler

We’re not even a month into 2014, but this obituary for the unladylike wrestler Mae Young is surely going to be one of the most interesting of the year:

Mae Young — make that the Great Mae Young — who pulled hair and took cheap shots, who preferred actually fighting to pretending, who was, by her own account and that of many other female wrestlers, the greatest and dirtiest of them all, died on Tuesday in Columbia, S.C. She was 90, and her last round in the ring was in 2010.

Mae Young, on the right, doing her thing.

You have to love her bravado:

“Anybody can be a baby face, what we call a clean wrestler,” she said in “Lipstick & Dynamite: The First Ladies of Wrestling,” a 2004 documentary. “They don’t have to do nothing. It’s the heel that carries the whole show. I’ve always been a heel, and I wouldn’t be anything else but.”

“This is a business that you have to love, and if you love it you live it.”  —Mae Young, RIP.

On the Morality and Self-Awareness of Cards Against Humanity

This is an excellent post arguing that the infamous Cards Against Humanity is not a “morally corrosive” game (as argued in this post) but simply a distasteful and provocative one:

Cards Against Humanity is a type of humor-oriented carnival space in which norms about appropriate discussion, and appropriate topics of humor, are reversed. It may be acceptable to relax the rules within this space, but there is little danger of what Leah fears is a “leakage” of these rules into everyday life, just as there is little danger that a jester would seriously try to become a pope in everyday life. The fact that a theology school would defend such orgies is a testament to the fact that they serve to uphold the establishment.

It is key that Cards Against Humanity is a highly self-aware game. This is apparent in the tagline (“A free party game for horrible people”) and descriptions: “Unlike most of the party games you’ve played before, Cards Against Humanity is as despicable and awkward as you and your friends.” By pairing the game and its brand of humor with words like “horrible,” “despicable,” and “awkward,” it shows, again, that these are things we should not laugh about, despite doing so anyway. This self-awareness is at the heart of every “I know I shouldn’t find this funny, but…” statement. “Virginia Tech Massacre” is funny in this “Opposite Day” world. It’s really not funny in other contexts or in the “real world.” This is also why it’s generally OK for Jews to make Holocaust jokes when it is more frowned upon for others to do the same—it is far more likely that the non-Jew would have less awareness of the consequences of the Holocaust than the Jew, and therefore the lack of self-awareness makes the attempt at humor far less palpable.

I welcomed 2014 with a game of Cards Against Humanity. While certain cards make me uncomfortable, as noted in the post, I don’t take the view that the game has corrupted me or is able to.

The 2014 Edge Question: What Scientific Idea is Ready for Retirement?

Every year since 1998, Edge.org editor John Brockman has posed one thought-provoking question to some of the world’s greatest thinkers across a variety of disciplines, then compiled the responses into an annual anthology. Last year, he published a book, This Explains Everything: Deep, Beautiful, and Elegant Theories of How the World Works, which collects responses to one of these questions in a single volume.

For 2014, the annual Edge.org question is: What Scientific Idea is Ready for Retirement? I’ll be reading responses for the next few weeks, but for now I wanted to link to the main page and highlight a few notable responses:

1) Nassim Taleb, one of my all-time favourite thinkers and authors, who argues for throwing out standard deviation as a measure:

The notion of standard deviation has confused hordes of scientists; it is time to retire it from common use and replace it with the more effective one of mean deviation. Standard deviation, STD, should be left to mathematicians, physicists and mathematical statisticians deriving limit theorems. There is no scientific reason to use it in statistical investigations in the age of the computer, as it does more harm than good—particularly with the growing class of people in social science mechanistically applying statistical tools to scientific problems.

Say someone just asked you to measure the “average daily variations” for the temperature of your town (or for the stock price of a company, or the blood pressure of your uncle) over the past five days. The five changes are: (-23, 7, -3, 20, -1). How do you do it?

Do you take every observation: square it, average the total, then take the square root? Or do you remove the sign and calculate the average? For there are serious differences between the two methods. The first produces an average of 15.7, the second 10.8. The first is technically called the root mean square deviation. The second is the mean absolute deviation, MAD. It corresponds to “real life” much better than the first—and to reality. In fact, whenever people make decisions after being supplied with the standard deviation number, they act as if it were the expected mean deviation.

It is all due to a historical accident: in 1893, the great Karl Pearson introduced the term “standard deviation” for what had been known as “root mean square error”. The confusion started then: people thought it meant mean deviation. The idea stuck: every time a newspaper has attempted to clarify the concept of market “volatility”, it defined it verbally as mean deviation yet produced the numerical measure of the (higher) standard deviation.

But it is not just journalists who fall for the mistake: I recall seeing official documents from the department of commerce and the Federal Reserve partaking of the conflation, even regulators in statements on market volatility. What is worse, Goldstein and I found that a high number of data scientists (many with PhDs) also get confused in real life.
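Taleb’s arithmetic is easy to reproduce. Here is a minimal Python sketch (my own illustration, not from the essay); note that his 15.7 figure appears to use the sample convention of dividing by n − 1, while dividing by n gives roughly 14.1:

```python
import math

# Taleb's example: five daily changes (temperature, stock price, blood pressure...)
changes = [-23, 7, -3, 20, -1]
n = len(changes)

# Mean absolute deviation (MAD): drop the signs, average the magnitudes
mad = sum(abs(x) for x in changes) / n  # 10.8

# Root mean square deviation, i.e. what gets reported as "standard deviation":
# square, average, take the square root.
rms_sample = math.sqrt(sum(x ** 2 for x in changes) / (n - 1))  # ~15.7
rms_population = math.sqrt(sum(x ** 2 for x in changes) / n)    # ~14.1

print(f"MAD: {mad}")
print(f"RMS (n-1): {rms_sample:.1f}")
print(f"RMS (n):   {rms_population:.1f}")
```

The gap between the two numbers is exactly Taleb’s point: people are shown the larger root-mean-square figure but reason about it as if it were the intuitive mean deviation.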

2) Jay Rosen, who argues we should retire the concept of “information overload”:

Here’s the best definition of information that I know of: information is a measure of uncertainty reduced. It’s deceptively simple. In order to have information, you need two things: an uncertainty that matters to us (we’re having a picnic tomorrow, will it rain?) and something that resolves it (weather report.) But some reports create the uncertainty that is later to be solved.

Suppose we learn from news reports that the National Security Agency “broke” encryption on the Internet. That’s information! It reduces uncertainty about how far the U.S. government was willing to go. (All the way.) But the same report increases uncertainty about whether there will continue to be a single Internet, setting us up for more information when that larger picture becomes clearer. So information is a measure of uncertainty reduced, but also of uncertainty created. Which is probably what we mean when we say: “well, that raises more questions than it answers.”
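Rosen’s definition reads like the standard information-theoretic one. As a rough illustration (my own, not from his response), you can put a number on “uncertainty reduced” with Shannon entropy:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: how much uncertainty a distribution holds."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The uncertainty that matters to us: will it rain on picnic day?
prior = [0.5, 0.5]        # before the weather report: rain / no rain equally likely
posterior = [0.95, 0.05]  # after the report: "95% chance of rain"

reduced = entropy(prior) - entropy(posterior)
print(f"uncertainty reduced: {reduced:.2f} bits")  # about 0.71 bits of information
```

A report that raises more questions than it answers, in these terms, simply adds uncertainty elsewhere even as it removes it here.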

3) Richard Dawkins thinks “essentialism” should be retired:

Essentialism—what I’ve called “the tyranny of the discontinuous mind”—stems from Plato, with his characteristically Greek geometer’s view of things. For Plato, a circle, or a right triangle, were ideal forms, definable mathematically but never realised in practice. A circle drawn in the sand was an imperfect approximation to the ideal Platonic circle hanging in some abstract space. That works for geometric shapes like circles, but essentialism has been applied to living things and Ernst Mayr blamed this for humanity’s late discovery of evolution—as late as the nineteenth century. If, like Aristotle, you treat all flesh-and-blood rabbits as imperfect approximations to an ideal Platonic rabbit, it won’t occur to you that rabbits might have evolved from a non-rabbit ancestor, and might evolve into a non-rabbit descendant. If you think, following the dictionary definition of essentialism, that the essence of rabbitness is “prior to” the existence of rabbits (whatever “prior to” might mean, and that’s a nonsense in itself) evolution is not an idea that will spring readily to your mind, and you may resist when somebody else suggests it.

Paleontologists will argue passionately about whether a particular fossil is, say, Australopithecus or Homo. But any evolutionist knows there must have existed individuals who were exactly intermediate. It’s essentialist folly to insist on the necessity of shoehorning your fossil into one genus or the other. There never was an Australopithecus mother who gave birth to a Homo child, for every child ever born belonged to the same species as its mother. The whole system of labelling species with discontinuous names is geared to a time slice, the present, in which ancestors have been conveniently expunged from our awareness (and “ring species” tactfully ignored). If by some miracle every ancestor were preserved as a fossil, discontinuous naming would be impossible. Creationists are misguidedly fond of citing “gaps” as embarrassing for evolutionists, but gaps are a fortuitous boon for taxonomists who, with good reason, want to give species discrete names. Quarrelling about whether a fossil is “really” Australopithecus or Homo is like quarrelling over whether George should be called “tall”. He’s five foot ten, doesn’t that tell you what you need to know?

4) Kevin Kelly, who argues that “fully random mutations” should be retired from thought (this is something that I’ve known for a while, as I have taken a number of courses in molecular biology):

What is commonly called “random mutation” does not in fact occur in a mathematically random pattern. The process of genetic mutation is extremely complex, with multiple pathways, involving more than one system. Current research suggests most spontaneous mutations occur as errors in the repair process for damaged DNA. Neither the damage nor the errors in repair have been shown to be random in where they occur, how they occur, or when they occur. Rather, the idea that mutations are random is simply a widely held assumption by non-specialists and even many teachers of biology. There is no direct evidence for it.

On the contrary, there’s much evidence that genetic mutations vary in patterns. For instance it is pretty much accepted that mutation rates increase or decrease as stress on the cells increases or decreases. These variable rates of mutation include mutations induced by stress from an organism’s predators and competition, as well as increased mutations brought on by environmental and epigenetic factors. Mutations have also been shown to have a higher chance of occurring near a place in DNA where mutations have already occurred, creating mutation hotspot clusters—a non-random pattern.

5) Ian Bogost, a professor at my alma mater, Georgia Tech, who thinks “science” should be retired:

Beyond encouraging people to see science as the only direction for human knowledge and absconding with the subject of materiality, the rhetoric of science also does a disservice to science itself. It makes science look simple, easy, and fun, when science is mostly complex, difficult, and monotonous.

A case in point: the popular Facebook page “I f*cking love science” posts quick-take variations on the “science of x” theme, mostly images and short descriptions of unfamiliar creatures like the pink fairy armadillo, or illustrated birthday wishes to famous scientists like Stephen Hawking. But as the science fiction writer John Skylar rightly insisted in a fiery takedown of the practice last year, most people don’t f*cking love science, they f*cking love photography—pretty images of fairy armadillos and renowned physicists. The pleasure derived from these pictures obviates the public’s need to understand how science actually gets done—slowly and methodically, with little acknowledgement and modest pay in unseen laboratories and research facilities.

The rhetoric of science has consequences. Things that have no particular relation to scientific practice must increasingly frame their work in scientific terms to earn any attention or support. The sociology of Internet use suddenly transformed into “web science.” Long accepted practices of statistical analysis have become “data science.” Thanks to shifting educational and research funding priorities, anything that can’t claim that it is a member of a STEM (science, technology, engineering, and math) field will be left out in the cold. Unfortunately, the rhetoric of science offers the most tactical response to such new challenges. Unless humanists reframe their work as “literary science,” they risk getting marginalized, defunded and forgotten.

When you’re selling ideas, you have to sell the ideas that will sell. But in a secular age in which the abstraction of “science” risks replacing all other abstractions, a watered-down, bland, homogeneous version of science is all that will remain if the rhetoric of science is allowed to prosper.

We need not choose between God and man, science and philosophy, interpretation and evidence. But ironically, in its quest to prove itself as the supreme form of secular knowledge, science has inadvertently elevated itself into a theology. Science is not a practice so much as it is an ideology. We don’t need to destroy science in order to bring it down to earth. But we do need to bring it down to earth again, and the first step in doing so is to abandon the rhetoric of science that has become its most popular devotional practice.

If you want to get smarter today, go here and spend a few hours reading through the contributions.

The New York Times Treatment of Bistro at Villard Michel Richard

Food critic Pete Wells at The New York Times has just come out with a scathing review of the Bistro at Villard Michel Richard, the fancy new restaurant at the newly renovated New York Palace in Midtown Manhattan. It’s worth reading in its entirety, but these two paragraphs are the best:

Think of everything that’s great about fried chicken. Now take it all away. In its place, right between dried-out strands of gray meat and a shell of fried bread crumbs, imagine a gummy white paste about a quarter-inch deep. This unidentifiable paste coats your mouth until you can’t perceive textures or flavors. It is like edible Novocain.

What Villard Michel Richard’s $28 fried chicken does to Southern cooking, its $40 veal cheek blanquette does to French. A classic blanquette is a gentle, reassuring white stew of sublimely tender veal. In this version, the veal cheeks had the dense, rubbery consistency of overcooked liver. Slithering around the meat was a terrifying sauce the color of jarred turkey gravy mixed with cigar ashes. If soldiers had killed Escoffier’s family in front of him and then forced him to make dinner, this is what he would have cooked.

Mmm, delicious.

The Secrets of Snow-Diving Foxes

This is a super interesting article by NPR’s Robert Krulwich, who summarizes research on why snow-diving foxes jump the way they do when hunting prey:

When they looked at each other’s notes, the researchers saw a pattern: For some reason, Czech foxes prefer to jump in a particular direction — toward the northeast. (To be more precise, it’s about 20 degrees off “magnetic north” — the “N” on your compass.) As the video above says, most of the time, most foxes miss their targets and emerge covered in snow and (one presumes) a little embarrassed. But when they pointed in that particular northeasterly direction, Ed writes, “they killed on 73 percent of their attacks.” If they reversed direction, and jumped exactly the opposite way, they killed 60 percent of the time. But in all other directions — east, south, west, whatever — they sucked. Only 18 percent of those jumps were successful.

Here’s a video of a hunting fox in action:

Statistical Stylometry: Quantifying Elements of Writing Style that Differentiate Successful Fiction

Can good writing be differentiated from bad writing through some kind of algorithm? Many have tried to answer this research question. The latest news in this realm comes from Stony Brook University, where a group of researchers:

…[T]ook 1000 sentences from the beginning of each book. They performed systematic analyses based on lexical and syntactic features that have been proven effective in Natural Language Processing (NLP) tasks such as authorship attribution, genre detection, gender identification, and native language detection.

“To the best of our knowledge, our work is the first that provides quantitative insights into the connection between the writing style and the success of literary works,” Choi says. “Previous work has attempted to gain insights into the ‘secret recipe’ of successful books. But most of these studies were qualitative, based on a dozen books, and focused primarily on high-level content—the personalities of protagonists and antagonists and the plots. Our work examines a considerably larger collection—800 books—over multiple genres, providing insights into lexical, syntactic, and discourse patterns that characterize the writing styles commonly shared among the successful literature.”

I had no idea there was a name for this kind of research. Statistical stylometry is the statistical analysis of variations in literary style between one writer or genre and another. This study reports, for the first time, that the discipline can be effective in distinguishing highly successful literature from its less successful counterpart, achieving accuracy rates as high as 84%.
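As a rough illustration of the general approach (my own toy sketch, not the researchers’ pipeline; the excerpts, labels, and use of scikit-learn below are all assumptions for demonstration), stylometric classification usually means turning each text into shallow lexical features and fitting an ordinary classifier:

```python
# Toy stylometry sketch: shallow lexical features + a linear classifier.
# Assumes scikit-learn is installed; the texts and labels are invented examples.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "She remembered the harbor and recognized the smell of salt.",  # "successful" style
    "He recalled the argument and said nothing for a long while.",  # "successful" style
    "He wanted to win, so he took the prize and cheered loudly.",   # "less successful"
    "She cried and promised she would never do it again.",          # "less successful"
]
labels = [1, 1, 0, 0]

# Bag-of-words counts are a crude stand-in for the richer lexical and
# syntactic features (POS tags, grammar rules, etc.) used in the study.
model = make_pipeline(
    CountVectorizer(lowercase=True),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

print(model.predict(["She remembered the rain and said it was fine."]))
```

The real study does the same kind of thing at a much larger scale and with far richer features, which is where the reported 84% accuracy comes from.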

The best book on writing that I’ve read is Stephen King’s On Writing, and his advice about description lines up with what the researchers found:

[T]he less successful books also rely on verbs that explicitly describe actions and emotions (“wanted”, “took”, “promised”, “cried”, “cheered”), while more successful books favor verbs that describe thought-processing (“recognized”, “remembered”) and verbs that simply serve the purpose of quotes (“say”).

The Reader/Author Transaction Model

A very interesting reading tip from Bret Victor, from the sidebar of this page:

Carver Mead describes a physical theory in which atoms exchange energy by resonating with each other. Before the energy transaction can happen, the two atoms must be phase-matched, oscillating in almost perfect synchrony with each other.

I sometimes think about resonant transactions as a metaphor for getting something out of a piece of writing. Before the material can resonate, before energy can be exchanged between the author and reader, the reader must already have available a mode of vibration at the author’s frequency. (This doesn’t mean that the reader is already thinking the author’s thought; it means the reader is capable of thinking it.)

People often describe written communication in terms of transmission (the author explained the concept well, or poorly) and/or absorption (the reader does or doesn’t have the background or skill to understand the concept). But I think of it more like a transaction — the author and the reader must be matched with each other. The author and reader must share a close-enough worldview, viewpoint, vocabulary, set of mental models, sense of aesthetics, and set of goals. For any particular concept in the material, if not enough of these are sufficiently matched, no resonance will occur and no energy will be exchanged.

Perhaps, as a reader, one way to get more out of more material is to collect and cultivate a diverse set of resonators, to increase the probability of a phase-match.

My brief thought: I wouldn’t be so pessimistic as to say there are times when no energy at all is exchanged between reader and author. I think there’s always a chance to learn something new.

###

(via Jason Kottke)