By JONATHAN SAFRAN FOER NY Times Published: June 8, 2013
A COUPLE of weeks ago, I saw a stranger crying in public. I was in Brooklyn’s Fort Greene neighborhood, waiting to meet a friend for breakfast. I arrived at the restaurant a few minutes early and was sitting on the bench outside, scrolling through my contact list. A girl, maybe 15 years old, was sitting on the bench opposite me, crying into her phone. I heard her say, “I know, I know, I know” over and over.
What did she know? Had she done something wrong? Was she being comforted? And then she said, “Mama, I know,” and the tears came harder.
What was her mother telling her? Never to stay out all night again? That everybody fails? Is it possible that no one was on the other end of the call, and that the girl was merely rehearsing a difficult conversation?
“Mama, I know,” she said, and hung up, placing her phone on her lap.
I was faced with a choice: I could interject myself into her life, or I could respect the boundaries between us. Intervening might make her feel worse, or be inappropriate. But then, it might ease her pain, or be helpful in some straightforward logistical way. An affluent neighborhood at the beginning of the day is not the same as a dangerous one as night is falling. And I was me, and not someone else. There was a lot of human computing to be done.
It is harder to intervene than not to, but it is vastly harder to choose to do either than to retreat into the scrolling names of one’s contact list, or whatever one’s favorite iDistraction happens to be. Technology celebrates connectedness, but encourages retreat. The phone didn’t make me avoid the human connection, but it did make ignoring her easier in that moment, and more likely, by comfortably encouraging me to forget my choice to do so. My daily use of technological communication has been shaping me into someone more likely to forget others. The flow of water carves rock, a little bit at a time. And our personhood is carved, too, by the flow of our habits.
Psychologists who study empathy and compassion are finding that unlike our almost instantaneous responses to physical pain, it takes time for the brain to comprehend the psychological and moral dimensions of a situation. The more distracted we become, and the more emphasis we place on speed at the expense of depth, the less likely and able we are to care.
Everyone wants his parent’s, or friend’s, or partner’s undivided attention — even if many of us, especially children, are getting used to far less. Simone Weil wrote, “Attention is the rarest and purest form of generosity.” By this definition, our relationships to the world, and to one another, and to ourselves, are becoming increasingly miserly.
Most of our communication technologies began as diminished substitutes for an impossible activity. We couldn’t always see one another face to face, so the telephone made it possible to keep in touch at a distance. One is not always home, so the answering machine made a kind of interaction possible without the person being near his phone. Online communication originated as a substitute for telephonic communication, which was considered, for whatever reasons, too burdensome or inconvenient. And then texting, which facilitated yet faster, and more mobile, messaging. These inventions were not created to be improvements upon face-to-face communication, but declensions of acceptable, if diminished, substitutes for it.
But then a funny thing happened: we began to prefer the diminished substitutes. It’s easier to make a phone call than to schlep to see someone in person. Leaving a message on someone’s machine is easier than having a phone conversation — you can say what you need to say without a response; hard news is easier to leave; it’s easier to check in without becoming entangled. So we began calling when we knew no one would pick up.
Shooting off an e-mail is easier, still, because one can hide behind the absence of vocal inflection, and of course there’s no chance of accidentally catching someone. And texting is even easier, as the expectation for articulateness is further reduced, and another shell is offered to hide in. Each step “forward” has made it easier, just a little, to avoid the emotional work of being present, to convey information rather than humanity.
THE problem with accepting — with preferring — diminished substitutes is that over time, we, too, become diminished substitutes. People who become used to saying little become used to feeling little.
With each generation, it becomes harder to imagine a future that resembles the present. My grandparents hoped I would have a better life than they did: free of war and hunger, comfortably situated in a place that felt like home. But what futures would I dismiss out of hand for my grandchildren? That their clothes will be fabricated every morning on 3-D printers? That they will communicate without speaking or moving?
Only those with no imagination, and no grounding in reality, would deny the possibility that they will live forever. It’s possible that many reading these words will never die. Let’s assume, though, that we all have a set number of days to indent the world with our beliefs, to find and create the beauty that only a finite existence allows for, to wrestle with the question of purpose and wrestle with our answers.
We often use technology to save time, but increasingly, it either takes the saved time along with it, or makes the saved time less present, intimate and rich. I worry that the closer the world gets to our fingertips, the further it gets from our hearts. It’s not an either/or — being “anti-technology” is perhaps the only thing more foolish than being unquestioningly “pro-technology” — but a question of balance that our lives hang upon.
Most of the time, most people are not crying in public, but everyone is always in need of something that another person can give, be it undivided attention, a kind word or deep empathy. There is no better use of a life than to be attentive to such needs. There are as many ways to do this as there are kinds of loneliness, but all of them require attentiveness, all of them require the hard work of emotional computation and corporeal compassion. All of them require the human processing of the only animal who risks “getting it wrong” and whose dreams provide shelters and vaccines and words to crying strangers.
We live in a world made up more of story than stuff. We are creatures of memory more than reminders, of love more than likes. Being attentive to the needs of others might not be the point of life, but it is the work of life. It can be messy, and painful, and almost impossibly difficult. But it is not something we give. It is what we get in exchange for having to die.
Editor’s note: …and then there are those, a few steps above on the intelligence curve, who feel so confident in their ideas that they are willing to risk everything by betting on them. And all those doubts from said intelligent people, or the opposing forces from the stupid ones, can’t keep them down. Those are the ones who actually change the world.
A familiar but profound insight by a wise man about one of life’s great tragedies. When you’re “young,” that phrase might as well be in an alien language made up of English words. And then one day it hits you… like a train.
In that way, it’s like the ultimate litmus test of one’s maturity.
You will feel guilty about things you did not expect to feel guilty about — things that have nothing to do with your business. The aging grandfather you didn’t spend enough time with, the dog that unexpectedly died as soon as you took the plunge — these are all things you may feel guilty about, because you’ve shifted your time and your focus to your business. You’re human. Walk it off and go make them proud.
A COUPLE of evolutionary psychologists recently published a book about human sexual behavior in prehistory called “Sex at Dawn.” Upon hearing of the project, one colleague, dubious that a modern scholar could hope to know anything about that period, asked them, “So what do you do, close your eyes and dream?”
Actually, it’s a little more involved. Evolutionary psychologists who study mating behavior often begin with a hypothesis about how modern humans mate: say, that men think about sex more than women do. Then they gather evidence — from studies, statistics and surveys — to support that assumption. Finally, and here’s where the leap occurs, they construct an evolutionary theory to explain why men think about sex more than women, where that gender difference came from, what adaptive purpose it served in antiquity, and why we’re stuck with the consequences today.
Lately, however, a new cohort of scientists has been challenging the very existence of the gender differences in sexual behavior that Darwinians have spent the past 40 years trying to explain and justify on evolutionary grounds.
Of course, no fossilized record can really tell us how people behaved or thought back then, much less why they behaved or thought as they did. Nonetheless, something funny happens when social scientists claim that a behavior is rooted in our evolutionary past. Assumptions about that behavior take on the immutability of a physical trait — they come to seem as biologically rooted as opposable thumbs or ejaculation.
Using evolutionary psychology to back up these assumptions about men and women is nothing new. In “The Descent of Man, and Selection in Relation to Sex,” Charles Darwin gathered evidence for the notion that, through competition for mates and sustenance, natural selection had encouraged man’s “more inventive genius” while nurturing woman’s “greater tenderness.” In this way, he suggested that the gender differences he saw around him — men sought power and made money; women stayed at home — weren’t simply the way things were in Victorian England. They were the way things had always been.
A century later, a new batch of scientists began applying Darwinian doctrine to the conduct of mating, and specifically to three assumptions that endure to this day: men are less selective about whom they’ll sleep with; men like casual sex more than women; and men have more sexual partners over a lifetime.
In 1972, Robert L. Trivers, a graduate student at Harvard, addressed that first assumption in one of evolutionary psychology’s landmark studies, “Parental Investment and Sexual Selection.” He argued that women are more selective about whom they mate with because they’re biologically obliged to invest more in offspring. Given the relative paucity of ova and plenitude of sperm, as well as the unequal feeding duties that fall to women, men invest less in children. Therefore, men should be expected to be less discriminating and more aggressive in competing for females.
It was an elegant, powerful application of evolutionary theory to the mating game. The evolutionary psychologists of the 1980s and ’90s built on Mr. Trivers’s theory to explain a wide array of stereotypical gender differences in mating.
In 1993, David M. Buss and David P. Schmitt used parental investment theory to explain why men should be expected to “devote a larger proportion of their total mating effort to short-term mating.” Because men invested less time and effort in their offspring, they evolved toward promiscuity, while women evolved away from it. Promiscuity, the researchers hypothesized, would have been more damaging to the female reputation than to the male reputation. If a man mated with a promiscuous woman, he would never be able to ensure his paternity. Men, on the other hand, could potentially enhance their status by pursuing a short-term mating strategy. (Think Kennedy, Clinton, Spitzer, Letterman and so forth. My space is limited.)
One of the earliest critics of this kind of thinking was Stephen Jay Gould. He wrote in 1997 that parental investment theory “will not explain the full panoply of supposed sexual differences so dear to pop psychology.” Mr. Gould felt that the field had become overrun with “ultra-Darwinians,” and that evolutionary psychology would be a more fruitful science if it didn’t limit itself “to the blinkered view” that evolutionary explanations accounted for every difference.
BUT if evolution didn’t determine human behavior, what did? The most common explanation is the effect of cultural norms. That, for instance, society tends to view promiscuous men as normal and promiscuous women as troubled outliers, or that our “social script” requires men to approach women while the pickier women do the selecting. Over the past decade, sociocultural explanations have gained steam.
Take the question of promiscuity. Everyone has always assumed — and early research had shown — that women desired fewer sexual partners over a lifetime than men. But in 2003, two behavioral psychologists, Michele G. Alexander and Terri D. Fisher, published the results of a study that used a “bogus pipeline” — a fake lie detector. When asked about actual sexual partners, rather than just theoretical desires, the participants who were not attached to the fake lie detector displayed typical gender differences. Men reported having had more sexual partners than women. But when participants believed that lies about their sexual history would be revealed by the fake lie detector, gender differences in reported sexual partners vanished. In fact, women reported slightly more sexual partners (a mean of 4.4) than did men (a mean of 4.0).
In 2009, another long-assumed gender difference in mating — that women are choosier than men — also came under siege. In speed dating, as in life, the social norm instructs women to sit in one place, waiting to be approached, while the men rotate tables. But in one study of speed-dating behavior, the evolutionary psychologists Eli J. Finkel and Paul W. Eastwick switched the “rotator” role. The men remained seated and the women rotated. By manipulating this component of the gender script, the researchers discovered that women became less selective — they behaved more like stereotypical men — while men were more selective and behaved more like stereotypical women. The mere act of physically approaching a potential romantic partner, they argued, engendered more favorable assessments of that person.
Recently, a third pillar appeared to fall. To back up the assumption that an enormous gap exists between men’s and women’s attitudes toward casual sex, evolutionary psychologists typically cite a classic study published in 1989. Men and women on a college campus were approached in public and propositioned with offers of casual sex by “confederates” who worked for the study. The confederate would say: “I have been noticing you around campus and I find you to be very attractive.” The confederate would then ask one of three questions: (1) “Would you go out with me tonight?” (2) “Would you come over to my apartment tonight?” or (3) “Would you go to bed with me tonight?”
Roughly equal numbers of men and women agreed to the date. But women were much less likely to agree to go to the confederate’s apartment. As for going to bed with the confederate, zero women said yes, while about 70 percent of males agreed.
Those results seemed definitive — until a few years ago, when Terri D. Conley, a psychologist at the University of Michigan, set out to re-examine what she calls “one of the largest documented sexuality gender differences,” that men have a greater interest in casual sex than women.
Ms. Conley found the methodology of the 1989 paper to be less than ideal. “No one really comes up to you in the middle of the quad and asks, ‘Will you have sex with me?’ ” she told me recently. “So there needs to be a context for it. If you ask people what they would do in a specific situation, that’s a far more accurate way of getting responses.” In her study, when men and women considered offers of casual sex from famous people, or offers from close friends whom they were told were good in bed, the gender differences in acceptance of casual-sex proposals evaporated nearly to zero.
IN light of this new research, will Darwinians consider revising their theories to reflect the possibility that our mating behavior is less hard-wired than they had believed?
Probably not. In an article responding to the new studies last year, Mr. Schmitt, a leading voice among hard-line Darwinians, ceded no ground. Addressing Ms. Conley’s finding that women were more likely to agree to casual sex with a celebrity, Mr. Schmitt argued that this resulted from “women’s (but not men’s) short-term mating psychology being specially designed to obtain good genes from physically attractive short-term partners.” He continued: “When women’s short-term-mating aim is activated (perhaps, temporarily, because of, e.g., high-fertility ovulatory status or desire for an extramarital affair, or more chronically, because of, e.g., a female-biased local sex ratio or a history of insecure parent-child attachment), they appear to express relatively focused desires for genetic traits in ‘sexy men’ that would biologically benefit women when short-term mating.”
In other words: Nothing new here, it’s all evolution.
Steven Pinker, the Harvard psychologist and popular author, also backs the Darwinians, whom he says still have the weight of evidence on their side. “A study which shows you can push some phenomenon around a bit at the margins,” he wrote to me in an e-mail, “is of dubious relevance to whether the phenomenon exists.”
But the fact that some gender differences can be manipulated, if not eliminated, by controlling for cultural norms suggests that the explanatory power of evolution can’t sustain itself when applied to mating behavior. This wouldn’t be the first time we’ve pushed these theories too far. How many stereotypical racial and ethnic differences, once declared evolutionarily determined under the banner of science, have been revealed instead as vestiges of power dynamics from earlier societies?
Citing the speed-dating study, Mr. Pinker added, “The only reason this flawed paper was published was that it challenged an evolutionary hypothesis … in particular a sex difference — as the Larry Summers incident shows, claims about sex differences are still politically inflammatory in the academy.” Here, he was referring to the much-criticized 2005 comments Mr. Summers made when he was Harvard’s president suggesting that women’s underrepresentation in science and engineering was attributable not to socialization but to “different availability of aptitude at the high end.”
Perhaps these phenomena exist. Perhaps men do, over all, pursue more short-term mating. But given new research, continued rigid reliance on evolution as an explanation seems to risk elevating a limited guide to teleological status — a way of thinking that scientists should abhor.
“Some sexual features are deeply rooted in evolutionary heritage, such as the sex response and how quickly it takes men and women to become aroused,” said Paul Eastwick, a co-author of the speed-dating study. “However, if you’re looking at features such as how men and women regulate themselves in society to achieve specific goals, I believe those features are unlikely to have evolved sex differences. I consider myself an evolutionary psychologist. But many evolutionary psychologists don’t think this way. They think these features are getting shaped and honed by natural selection all the time.”
By JOHN TIERNEY NY Times Published: January 3, 2013
When we remember our past selves, they seem quite different. We know how much our personalities and tastes have changed over the years. But when we look ahead, somehow we expect ourselves to stay the same, a team of psychologists said Thursday, describing research they conducted on people’s self-perceptions.
They called this phenomenon the “end of history illusion,” in which people tend to “underestimate how much they will change in the future.” According to their research, which involved more than 19,000 people ages 18 to 68, the illusion persists from teenage years into retirement.
“Middle-aged people — like me — often look back on our teenage selves with some mixture of amusement and chagrin,” said one of the authors, Daniel T. Gilbert, a psychologist at Harvard. “What we never seem to realize is that our future selves will look back and think the very same thing about us. At every age we think we’re having the last laugh, and at every age we’re wrong.”
Other psychologists said they were intrigued by the findings, published Thursday in the journal Science, and were impressed with the amount of supporting evidence. Participants were asked about their personality traits and preferences — their favorite foods, vacations, hobbies and bands — in years past and present, and then asked to make predictions for the future. Not surprisingly, the younger people in the study reported more change in the previous decade than did the older respondents.
But when asked to predict what their personalities and tastes would be like in 10 years, people of all ages consistently played down the potential changes ahead.
Thus, the typical 20-year-old woman’s predictions for her next decade were not nearly as radical as the typical 30-year-old woman’s recollection of how much she had changed in her 20s. This sort of discrepancy persisted among respondents all the way into their 60s.
And the discrepancy did not seem to be because of faulty memories, because the personality changes recalled by people jibed quite well with independent research charting how personality traits shift with age. People seemed to be much better at recalling their former selves than at imagining how much they would change in the future.
Why? Dr. Gilbert and his collaborators, Jordi Quoidbach of Harvard and Timothy D. Wilson of the University of Virginia, had a few theories, starting with the well-documented tendency of people to overestimate their own wonderfulness.
“Believing that we just reached the peak of our personal evolution makes us feel good,” Dr. Quoidbach said. “The ‘I wish that I knew then what I know now’ experience might give us a sense of satisfaction and meaning, whereas realizing how transient our preferences and values are might lead us to doubt every decision and generate anxiety.”
Or maybe the explanation has more to do with mental energy: predicting the future requires more work than simply recalling the past. “People may confuse the difficulty of imagining personal change with the unlikelihood of change itself,” the authors wrote in Science.
The phenomenon does have its downsides, the authors said. For instance, people make decisions in their youth — about getting a tattoo, say, or a choice of spouse — that they sometimes come to regret.
And that illusion of stability could lead to dubious financial expectations, as the researchers showed in an experiment asking people how much they would pay to see their favorite bands.
When asked about their favorite band from a decade ago, respondents were typically willing to shell out $80 to attend a concert of the band today. But when they were asked about their current favorite band and how much they would be willing to spend to see the band’s concert in 10 years, the price went up to $129. Even though they realized that favorites from a decade ago like Creed or the Dixie Chicks have lost some of their luster, they apparently expect Coldplay and Rihanna to blaze on forever.
“The end-of-history effect may represent a failure in personal imagination,” said Dan P. McAdams, a psychologist at Northwestern who has done separate research into the stories people construct about their past and future lives. He has often heard people tell complex, dynamic stories about the past but then make vague, prosaic projections of a future in which things stay pretty much the same.
Dr. McAdams was reminded of a conversation with his 4-year-old daughter during the craze for Teenage Mutant Ninja Turtles in the 1980s. When he told her they might not be her favorite thing one day, she refused to acknowledge the possibility. But later, in her 20s, she confessed to him that some part of her 4-year-old mind had realized he might be right.
“She resisted the idea of change, as it dawned on her at age 4, because she could not imagine what else she would ever substitute for the Turtles,” Dr. McAdams said. “She had a sneaking suspicion that she would change, but she couldn’t quite imagine how, so she stood with her assertion of continuity. Maybe something like this goes on with all of us.”
Three days before 20-year-old Adam Lanza killed his mother, then opened fire on a classroom full of Connecticut kindergartners, my 13-year-old son Michael (name changed) missed his bus because he was wearing the wrong color pants.
“I can wear these pants,” he said, his tone increasingly belligerent, the black-hole pupils of his eyes swallowing the blue irises.
“They are navy blue,” I told him. “Your school’s dress code says black or khaki pants only.”
“They told me I could wear these,” he insisted. “You’re a stupid bitch. I can wear whatever pants I want to. This is America. I have rights!”
“You can’t wear whatever pants you want to,” I said, my tone affable, reasonable. “And you definitely cannot call me a stupid bitch. You’re grounded from electronics for the rest of the day. Now get in the car, and I will take you to school.”
I live with a son who is mentally ill. I love my son. But he terrifies me.
A few weeks ago, Michael pulled a knife and threatened to kill me and then himself after I asked him to return his overdue library books. His 7- and 9-year-old siblings knew the safety plan—they ran to the car and locked the doors before I even asked them to. I managed to get the knife from Michael, then methodically collected all the sharp objects in the house into a single Tupperware container that now travels with me. Through it all, he continued to scream insults at me and threaten to kill or hurt me.
That conflict ended with three burly police officers and a paramedic wrestling my son onto a gurney for an expensive ambulance ride to the local emergency room. The mental hospital didn’t have any beds that day, and Michael calmed down nicely in the ER, so they sent us home with a prescription for Zyprexa and a follow-up visit with a local pediatric psychiatrist.
We still don’t know what’s wrong with Michael. Autism spectrum, ADHD, Oppositional Defiant Disorder and Intermittent Explosive Disorder have all been tossed around at various meetings with probation officers and social workers and counselors and teachers and school administrators. He’s been on a slew of antipsychotic and mood-altering pharmaceuticals, a Russian novel of behavioral plans. Nothing seems to work.
At the start of seventh grade, Michael was accepted to an accelerated program for highly gifted math and science students. His IQ is off the charts. When he’s in a good mood, he will gladly bend your ear on subjects ranging from Greek mythology to the differences between Einsteinian and Newtonian physics to Doctor Who. He’s in a good mood most of the time. But when he’s not, watch out. And it’s impossible to predict what will set him off.
Several weeks into his new junior high school, Michael began exhibiting increasingly odd and threatening behaviors at school. We decided to transfer him to the district’s most restrictive behavioral program, a contained school environment where children who can’t function in normal classrooms can access their right to free public babysitting from 7:30-1:50 Monday through Friday until they turn 18.
The morning of the pants incident, Michael continued to argue with me on the drive. He would occasionally apologize and seem remorseful. Right before we turned into his school parking lot, he said, “Look, Mom, I’m really sorry. Can I have video games back today?”
“No way,” I told him. “You cannot act the way you acted this morning and think you can get your electronic privileges back that quickly.”
His face turned cold, and his eyes were full of calculated rage. “Then I’m going to kill myself,” he said. “I’m going to jump out of this car right now and kill myself.”
That was it. After the knife incident, I told him that if he ever said those words again, I would take him straight to the mental hospital, no ifs, ands, or buts. I did not respond, except to pull the car into the opposite lane, turning left instead of right.
“Where are you taking me?” he said, suddenly worried. “Where are we going?”
“You know where we are going,” I replied.
“No! You can’t do that to me! You’re sending me to hell! You’re sending me straight to hell!”
I pulled up in front of the hospital, frantically waving for one of the clinicians who happened to be standing outside. “Call the police,” I said. “Hurry.”
Michael was in a full-blown fit by then, screaming and hitting. I hugged him close so he couldn’t escape from the car. He bit me several times and repeatedly jabbed his elbows into my rib cage. I’m still stronger than he is, but I won’t be for much longer.
The police came quickly and carried my son screaming and kicking into the bowels of the hospital. I started to shake, and tears filled my eyes as I filled out the paperwork—“Were there any difficulties with….at what age did your child….were there any problems with…has your child ever experienced…does your child have….”
At least we have health insurance now. I recently accepted a position with a local college, giving up my freelance career because when you have a kid like this, you need benefits. You’ll do anything for benefits. No individual insurance plan will cover this kind of thing.
For days, my son insisted that I was lying—that I made the whole thing up so that I could get rid of him. The first day, when I called to check up on him, he said, “I hate you. And I’m going to get my revenge as soon as I get out of here.”
By day three, he was my calm, sweet boy again, all apologies and promises to get better. I’ve heard those promises for years. I don’t believe them anymore.
On the intake form, under the question, “What are your expectations for treatment?” I wrote, “I need help.”
What I realized after I had kids was every single thing people think about what it’s like to be a parent and to have babies is not what it is. All you know is television or diaper commercials, and there’s always, like, talk about “Oh man, you’re going to have to do diapers.” And that has actually nothing to do with the emotional experience of creating life and being responsible for it. All of your priorities suddenly change. I spent my whole life, you know, worrying about a job and doing comedy and meeting someone I could share my life with, and suddenly all of that takes a back seat to something else. I didn’t anticipate any of that. Someone told me, you know, you go from being number one in your world to number four instantly. And then a lot of guys don’t talk about it, and they’re quietly having a nervous breakdown because they’re not the king of the castle anymore, they’re the last person that’s being dealt with in any situation, as it should be, but I didn’t think about any of it. And it wasn’t in any of the baby books. No one’s written a good baby book that explains the emotional change, you know, that thing that happens where you have a kid, and you’re up at three in the morning with your ear next to their mouth making sure they’re breathing. You know, that’s what it’s about.
It has always seemed strange to me… the things we admire in men, kindness and generosity, openness, honesty, understanding and feeling, are the concomitants of failure in our system. And those traits we detest, sharpness, greed, acquisitiveness, meanness, egotism and self-interest, are the traits of success. And while men admire the quality of the first they love the produce of the second.
Myth #1 – Introverts don’t like to talk. This is not true. Introverts just don’t talk unless they have something to say. They hate small talk. Get an introvert talking about something they are interested in, and they won’t shut up for days.
Myth #2 – Introverts are shy. Shyness has nothing to do with being an Introvert. Introverts are not necessarily afraid of people. What they need is a reason to interact. They don’t interact for the sake of interacting. If you want to talk to an Introvert, just start talking. Don’t worry about being polite.
Myth #3 – Introverts are rude. Introverts often don’t see a reason for beating around the bush with social pleasantries. They want everyone to just be real and honest. Unfortunately, this is not acceptable in most settings, so Introverts can feel a lot of pressure to fit in, which they find exhausting.
Myth #4 – Introverts don’t like people. On the contrary, Introverts intensely value the few friends they have. They can count their close friends on one hand. If you are lucky enough for an introvert to consider you a friend, you probably have a loyal ally for life. Once you have earned their respect as being a person of substance, you’re in.
Myth #5 – Introverts don’t like to go out in public. Nonsense. Introverts just don’t like to go out in public FOR AS LONG. They also like to avoid the complications that are involved in public activities. They take in data and experiences very quickly, and as a result, don’t need to be there for long to “get it.” They’re ready to go home, recharge, and process it all. In fact, recharging is absolutely crucial for Introverts.
Myth #6 – Introverts always want to be alone. Introverts are perfectly comfortable with their own thoughts. They think a lot. They daydream. They like to have problems to work on, puzzles to solve. But they can also get incredibly lonely if they don’t have anyone to share their discoveries with. They crave an authentic and sincere connection with ONE PERSON at a time.
Myth #7 – Introverts are weird. Introverts are often individualists. They don’t follow the crowd. They’d prefer to be valued for their novel ways of living. They think for themselves and because of that, they often challenge the norm. They don’t make most decisions based on what is popular or trendy.
Myth #8 – Introverts are aloof nerds. Introverts are people who primarily look inward, paying close attention to their thoughts and emotions. It’s not that they are incapable of paying attention to what is going on around them, it’s just that their inner world is much more stimulating and rewarding to them.
Myth #9 – Introverts don’t know how to relax and have fun. Introverts typically relax at home or in nature, not in busy public places. They are not thrill seekers or adrenaline junkies. If there is too much talking and noise going on, they shut down. Their brains are more sensitive to the neurotransmitter dopamine, and introverts and extroverts have different dominant neural pathways. Just look it up.
Drew Petersen didn’t speak until he was 3½, but his mother, Sue, never believed he was slow. When he was 18 months old, in 1994, she was reading to him and skipped a word, whereupon Drew reached over and pointed to the missing word on the page. Drew didn’t produce much sound at that stage, but he already cared about it deeply. “Church bells would elicit a big response,” Sue told me. “Birdsong would stop him in his tracks.”
Sue, who learned piano as a child, taught Drew the basics on an old upright, and he became fascinated by sheet music. “He needed to decode it,” Sue said. “So I had to recall what little I remembered, which was the treble clef.” As Drew told me, “It was like learning 13 letters of the alphabet and then trying to read books.” He figured out the bass clef on his own, and when he began formal lessons at 5, his teacher said he could skip the first six months’ worth of material. Within the year, Drew was performing Beethoven sonatas at the recital hall at Carnegie Hall. “I thought it was delightful,” Sue said, “but I also thought we shouldn’t take it too seriously. He was just a little boy.”
On his way to kindergarten one day, Drew asked his mother, “Can I just stay home so I can learn something?” Sue was at a loss. “He was reading textbooks this big, and they’re in class holding up a blowup M,” she said. Drew, who is now 18, said: “At first, it felt lonely. Then you accept that, yes, you’re different from everyone else, but people will be your friends anyway.” Drew’s parents moved him to a private school. They bought him a new piano, because he announced at 7 that their upright lacked dynamic contrast. “It cost more money than we’d ever paid for anything except a down payment on a house,” Sue said. When Drew was 14, he discovered a home-school program created by Harvard; when I met him two years ago, he was 16, studying at the Manhattan School of Music and halfway to a Harvard bachelor’s degree.
Prodigies are able to function at an advanced adult level in some domain before age 12. “Prodigy” derives from the Latin “prodigium,” a monster that violates the natural order. These children have differences so evident as to resemble a birth defect, and it was in that context that I came to investigate them. Having spent 10 years researching a book about children whose experiences differ radically from those of their parents and the world around them, I found that stigmatized differences — having Down syndrome, autism or deafness; being a dwarf or being transgender — are often clouds with silver linings. Families grappling with these apparent problems may find profound meaning, even beauty, in them. Prodigiousness, conversely, looks from a distance like silver, but it comes with banks of clouds; genius can be as bewildering and hazardous as a disability. Despite the past century’s breakthroughs in psychology and neuroscience, prodigiousness and genius are as little understood as autism. “Genius is an abnormality, and can signal other abnormalities,” says Veda Kaplinsky of Juilliard, perhaps the world’s pre-eminent teacher of young pianists. “Many gifted kids have A.D.D. or O.C.D. or Asperger’s. When the parents are confronted with two sides of a kid, they’re so quick to acknowledge the positive, the talented, the exceptional; they are often in denial over everything else.”
We live in ambitious times. You need only to go through the New York preschool application process, as I recently did for my son, to witness the hysteria attached to early achievement, the widespread presumption that a child’s destiny hinges on getting a baby foot on a tall ladder. Parental obsessiveness on this front reflects the hegemony of developmental psychiatry, with its insistence that first experience is formative. We now know that brain plasticity diminishes over time; it is easier to mold a child than to reform an adult. What are we to do with this information? I would hate for my children to feel that their worth is contingent on sustaining competitive advantage, but I’d also hate for them to fall short of their potential. Tiger mothers who browbeat their children into submission overemphasize a narrow category of achievement over psychic health. Attachment parenting, conversely, often sacrifices accomplishment to an ideal of unboundaried acceptance that can be equally pernicious. It’s tempting to propose some universal answer, but spending time with families of remarkably talented children showed me that what works for one child can be disastrous for another.
Children who are pushed toward success and succeed have a very different trajectory from that of children who are pushed toward success and fail. I once told Lang Lang, a prodigy par excellence and now perhaps the most famous pianist in the world, that by American standards, his father’s brutal methods — which included telling him to commit suicide, refusing any praise, browbeating him into abject submission — would count as child abuse. “If my father had pressured me like this and I had not done well, it would have been child abuse, and I would be traumatized, maybe destroyed,” Lang responded. “He could have been less extreme, and we probably would have made it to the same place; you don’t have to sacrifice everything to be a musician. But we had the same goal. So since all the pressure helped me become a world-famous star musician, which I love being, I would say that, for me, it was in the end a wonderful way to grow up.”
While it is true that some parents push their kids too hard and give them breakdowns, others fail to support a child’s passion for his own gift and deprive him of the only life that he would have enjoyed. You can err in either direction. Given that there is no consensus about how to raise ordinary children, it is not surprising that there is none about how to raise remarkable children. Like parents of children who are severely challenged, parents of exceptionally talented children are custodians of young people beyond their comprehension.
Spending time with the Petersens, I was struck not only by their mutual devotion but also by the easy way they avoided the snobberies that tend to cling to classical music. Sue is a school nurse; her husband, Joe, works in the engineering department of Volkswagen. They never expected the life into which Drew has led them, but they have neither been intimidated by it nor brash in pursuing it; it remains both a diligence and an art. “How do you describe a normal family?” Joe said. “The only way I can describe a normal one is a happy one. What my kids do brings a lot of joy into this household.” When I asked Sue how Drew’s talent had affected how they reared his younger brother, Erik, she said: “It’s distracting and different. It would be similar if Erik’s brother had a disability or a wooden leg.”
Prodigiousness manifests most often in athletics, mathematics, chess and music. A child may have a brain that processes chess moves or mathematical equations like some dream computer, which is its own mystery, but how can the mature emotional insight that is necessary to musicianship emerge from someone who is immature? “Young people like romance stories and war stories and good-and-evil stories and old movies because their emotional life mostly is and should be fantasy,” says Ken Noda, a great piano prodigy in his day who gave up public performance and now works at the Metropolitan Opera. “They put that fantasized emotion into their playing, and it is very convincing. I had an amazing capacity for imagining these feelings, and that’s part of what talent is. But it dries up, in everyone. That’s why so many prodigies have midlife crises in their late teens or early 20s. If our imagination is not replenished with experience, the ability to reproduce these feelings in one’s playing gradually diminishes.”
Musicians often talked to me about whether you achieve brilliance on the violin by practicing for hours every day or by reading Shakespeare, learning physics and falling in love. “Maturity, in music and in life, has to be earned by living,” the violinist Yehudi Menuhin once said. Who opens up or blocks access to such living? A musical prodigy’s development hinges on parental collaboration. Without that support, the child would never gain access to an instrument, the technical training that even the most devout genius requires or the emotional nurturance that enables a musician to achieve mature expression. As David Henry Feldman and Lynn T. Goldsmith, scholars in the field, have said, “A prodigy is a group enterprise.”
Some prodigies seem to trade on a splinter skill — an ability in music that occupies their whole consciousness, leaving them virtually incompetent in all other areas. Others have a dazzling capacity for achievement in general and select music from among multitudinous gifts. Mikhail and Natalie Paremski held comfortable positions within the Soviet system: Mikhail with the Kurchatov Institute of Atomic Energy; Natalie with the Moscow Engineering Physics Institute. Their daughter, Natasha, born in 1987, showed a precocious interest in the piano. “I was in the kitchen, and I thought, Who is playing?” Natalie recalls. “Then I saw: it’s the baby, picking out nursery songs.” By the time she was 4, Natasha had played a Chopin mazurka in a children’s concert.
After the Soviet Union collapsed, Mikhail emigrated to California; the family followed in 1995. Natasha entered fourth grade, two years younger than her classmates. Within months, she was speaking English without an accent and coming in first on every school test. The family couldn’t afford a good piano; they finally found a cheap one that “sounded like cabbage,” Natasha recalls, and she began performing Haydn concertos, Beethoven sonatas and Chopin études. “Everyone would say, ‘You must be so proud of your daughter,’ ” Natalie told me. “I used to say that it’s not for me to be proud; it’s Natasha who does this herself — but I learned that this is not the polite American way. So now I always say, ‘I am so proud of my daughter,’ and then maybe we can have a conversation.” Natasha agreed. “What did they do to make me practice?” she asked when I first interviewed her, at 16. “What did they do to make me eat or sleep?”
Natasha graduated with top honors from high school at 14 and was offered a full scholarship by Mannes College the New School for Music in New York. Her mother worried about a deficit of soul in New York. “There is no time for vision! People are just struggling to survive, like in Moscow,” Natalie said — to which her daughter replied, “Vision is how I survive.” In those early New York days, Natasha and her mother spoke by phone constantly. Nonetheless, Natalie said, “that was my present to her: I gave her her own life.”
In 2004, when Natasha was 16, I went to her Carnegie Hall debut, for which she played Rachmaninoff’s Piano Concerto No. 2. She’s a beautiful young woman, with cascades of hair and a sylphlike figure, and she wore a sleeveless, black velvet dress, so her arms would feel free, and a pair of insanely high heels that she said gave her better leverage on the pedals. Her parents were not there. “They’re too supportive to come,” Natasha told me just before the concert. Afterward, Natalie explained, “If I am there, I am so worried about every single note that I can’t even sit still. It’s not helpful to Natasha.”
Natasha later said she saw nothing strange in a musician’s ability to express emotions she has not experienced. “Had I experienced them, that wouldn’t necessarily help me to express them better in my music. I’m an actress, not a character; my job is to represent something, not to live it. Chopin wrote a mazurka, Person X in the audience wants to hear the mazurka and so I have to decipher the score and make it apprehensible to Person X, and it’s really hard to do. But it has nothing to do with my life experience.”
After the English lawyer Daines Barrington examined the 8-year-old Mozart in 1764, he wrote: “He had a thorough knowledge of the fundamental principles of composition. He was also a great master of modulation, and his transitions from one key to another were excessively natural and judicious.” Yet, Mozart was also clearly a child. “Whilst he was playing to me, a favorite cat came in, upon which he immediately left his harpsichord, nor could we bring him back for a considerable time. He would also sometimes run about the room with a stick between his legs by way of horse.”
Every prodigy is a chimera of such mastery and childishness, and the contrast between musical sophistication and personal immaturity can be striking. One prodigy I interviewed switched from the violin to the piano when she was 7. She offered to tell me why if I didn’t tell her mother. “I wanted to sit down,” she said.
Chloe Yu was born in Macao and came to the United States to study when she was 17. She married at 25, and her son, Marc, was born a year later, in Pasadena, Calif. While she was pregnant, Chloe played the piano to him. When Marc was almost 3, he picked out a few tunes on the piano with two fingers; within a few months, Chloe had found him a teacher advanced enough to respond to his emerging talent. At 5, he added the cello to his regimen. “Soon he asked for more instruments,” Chloe told me. “I said: ‘That’s it, Marc. Be realistic. Two is enough.’ ”
Chloe gave up on the master’s degree she was working on. She had divorced Marc’s father, but because she had no money, she and Marc ended up living with her ex-in-laws, in a room over the garage. Marc’s grandparents did not approve of his “excessive” devotion to the piano. “His grandmother loves him a lot,” Chloe said. “But she just wanted him to be a normal 5-year-old.” When Marc was in preschool, Chloe felt he was ready to perform, and she contacted local retirement facilities and hospitals to offer free recitals. Soon the papers were writing about this young genius. “When I began to understand how talented he is, I was so excited!” Chloe said. “And also so afraid!”
At 6, Marc won a fellowship for gifted youth that covered the down payment on a Steinway. By the time Marc was 8, he and Chloe were flying to China frequently for lessons; Chloe explained that whereas her son’s American teachers gave him broad interpretive ideas to explore freely, his Chinese teacher taught measure by measure. I asked Marc whether he found it difficult traveling so far. “Well, fortunately, I don’t have vestigial somnolence,” he said. I raised an eyebrow. “You know — jet lag,” he apologized.
Marc was being home-schooled to accommodate his performance and practice schedule. At the age of a third-grader, he was taking an SAT class. Chloe serves as his manager and reviews concert invitations with him. “In America, every kid has to be well rounded,” Chloe said. “They have 10 different activities, and they never excel at any of them. Americans want everyone to have the same life; it’s a cult of the average. This is wonderful for disabled children, who get things they would never have otherwise, but it’s a disaster for gifted children. Why should Marc spend his life learning sports he’s not interested in when he has this superb gift that gives him so much joy?”
At their home in California, I asked Marc what he thought of a normal childhood. “I already have a normal childhood,” he said. “Do you want to see my room? It’s messy, but you can come anyway.” Upstairs, he showed me a yellow remote-controlled helicopter that his father had sent from China. The bookshelves were crammed with Dr. Seuss, “Jumanji” and “The Wind in the Willows” but also “Moby-Dick”; with “Sesame Street” videos and also a series of DVDs on the music of Prague, Vienna and so on. We sat on the floor, and he showed me his favorite Gary Larson cartoons, and then we played the board game Mouse Trap.
Then we went downstairs, and Marc sat on a phone book on the piano bench so his hands would be high enough to play comfortably and launched into Chopin’s “Fantasie-Impromptu,” which he imbued with a quality of nuanced yearning that seemed almost inconceivable in someone with a shelf of Cookie Monster videos. “You see?” Chloe said to me. “He’s not a normal child. Why should he have a normal childhood?”
A parent is the progenitor of much of a child’s behavior, telling that child repeatedly who he has been, is and could be, reconciling accomplishment and naïveté. In constructing this narrative, parents often confuse the anomaly of developing fast with the objective of developing profoundly. There is no clear delineation between supporting and pressuring a child, between believing in your child and forcing your child to conform to what you imagine for him. If society’s expectations for most children with profound differences are too low, expectations for prodigies are often perilously high. “When you have a child whose gift is so overshadowing, it is possible for parents to be distracted and lose track of the child himself,” says Karen Monroe, a psychiatrist at Boston’s McLean Hospital who works with prodigious children.
If you dream of having a genius for a child, you will spot brilliance in your child, sometimes even when it isn’t there. Such children, despite being the subjects of obsessive attention, can suffer from not being seen; their sorrow is organized not so much around the rigor of practicing as around invisibility. And yet, accomplishment entails giving up the pleasures of the present moment in favor of anticipated triumphs, and that is an impulse that must be learned. Left to their own devices, children do not become world-class instrumentalists before they turn 10.
When I spoke to the mother of one musical prodigy on the telephone to set up an interview, I invited her and her daughter to dinner, but she said, “We have a family of fussy eaters, so we’ll eat before we come.” The girl and her parents, whom I’ve granted anonymity for their own protection, arrived wearing coats, and I offered to hang them up. “That won’t be necessary,” the mother said, and they sat holding them through the interview. I offered them something to drink, but the woman said, “We are so used to our schedule, and it’s not time for a drink right now.” In three hours, none of them had a sip of water. I had put out homemade cookies, and the daughter kept glancing at them; every time she did, the mother shot her a look. Whenever I asked the daughter a question, her mother jumped in to answer on her behalf; when the daughter did reply, she did so with an anxious glance at her mother, as if worried that she had delivered the wrong response.
The daughter was holding her instrument case, so I invited her to play. “I think I’ll play the Bach Chaconne,” she said. Her mother said, “How about the Rimsky-Korsakov?” She replied, “No, no, no, the Chaconne is better.” The daughter had told me that she chose her instrument for its resemblance to her voice; now it provided her only chance to be heard over her mother. She played the Chaconne. When she finished, her mother said, “Now you can play the Rimsky-Korsakov.” The daughter dutifully launched into “Flight of the Bumblebee,” the proof of every virtuoso. “Vivaldi?” her mother said, and she played “Summer” from “The Four Seasons.” She played with a clear, bright tone, although not with such brilliance as to resolve the question of why a childhood had been sacrificed for this art. I had hoped this child would light up when her bow met the strings, but instead she brought out her instrument’s searing melancholy.
Throughout much of history, prodigies were thought to be possessed; Aristotle believed that there could be no genius without madness. Paganini was accused of putting himself in the hands of the devil. The Italian criminologist Cesare Lombroso said in 1891, “Genius is a true degenerative psychosis belonging to the group of moral insanity.” Recent neuroscience demonstrates that the processes of creativity and psychosis map similarly in the brain, each contingent on a reduced number of dopamine D2 receptors in the thalamus. A continuum runs between the two conditions; there is no sharp line.
The parents of children with disabilities must be educated to see the identity within a perceived illness, but the parents of prodigies are confronted with an identity and must be educated to recognize the prospect of illness within it. Even those without a sideline diagnosis like A.D.D. or Asperger’s need to mitigate the loneliness of being peerless and of having their primary emotional relationship with an inanimate object. “If you’re spending five hours a day practicing, and the other kids are out playing baseball, you’re not doing the same things,” Karen Monroe says. “Even if you love it and can’t imagine yourself doing anything else, that doesn’t mean you don’t feel lonely.”
If Chloe Yu scorned the idea of a normal childhood, May Armstrong simply had to bow to the reality that no such thing could be achieved with her only son, Kit. Born in 1992, Kit could count at 15 months; May taught him addition and subtraction at 2, and he worked out multiplication and division for himself. While digging in the garden, he explained the principle of leverage to his mother. By 5, he explained Einstein’s theory of time dilation to her. May, an economist, was frankly bemused: “By nature, every mother wants to be protective, but he didn’t need protection. I can’t say that was easy.”
May had left Taiwan at 22 to study in the United States and spent holidays by herself. “I knew what loneliness was all about, and I thought he needed a hobby he could enjoy on his own,” she says. So she started him on piano lessons when he was 5, even though she had no interest in music. After three weeks of lessons, Kit started composing without an instrument on staff paper: the written language of music had come to him whole.
When Kit was 3, a supervisor of his play group told May that he let other children push him around. “I went in one day and saw another child snatch a toy away from him,” May said. “I told him he should stand up for himself, and he said: ‘That kid will be bored in two minutes, and then I can play with it again. Why start a fight?’ So he was mature already. What did I have to teach this kid? But he always seemed happy, and that was what I wanted most for him. He used to look in the mirror and burst out laughing.” May enrolled him in school. “His teacher told me that she wanted her other kids to grow up in kindergarten,” she said. “She wanted mine to grow down.”
By age 9, he had graduated from high school and started college in Utah. “The other students often thought it was strange that he was there,” May says, “but Kit never did.” His piano skills, meanwhile, had advanced enough so that by the time he was 10, he appeared on David Letterman. Shortly after, Kit toured the physics research facility at Los Alamos. A physicist said that, unlike the postdoctoral physicists who usually visited, Kit was so bright that no one could “find the bottom of this boy’s knowledge.” A few years later, Kit attended a summer program at M.I.T., where he helped edit papers in physics, chemistry and mathematics. “He just understands things,” May said to me, almost resigned. “Someday, I want to work with parents of disabled children, because I know their bewilderment is like mine. I had no idea how to be a mother to Kit, and there was no place to find out.”
May moved them to London to pursue Kit’s musicianship, and he soon met the revered pianist Alfred Brendel, who took Kit on and refused payment for lessons. When Brendel learned that Kit was practicing at a piano showroom, he had a Steinway delivered to their apartment.
“I have no ear to be any help to Kit,” May said. “All I can do is remind him that he is very lucky to have been born with those talents. I’d have preferred that he be a professor of mathematics. It’s an easier life.” Then she added, “But Kit has decided that mathematics is his hobby, and the piano is his work.” At 18, Kit was pursuing an M.A. in pure mathematics in Paris; he said he did it “to unwind.” I asked May if she ever worried that Kit, like many young people of remarkable ability, might have a nervous breakdown. She laughed. “If anyone’s going to have a nervous breakdown in this setup,” she said, “it’s me!”
There is no federal mandate for gifted education. But if we recognize the importance of special programs for students whose atypical brains encode less-accepted differences, we should extrapolate to create programs for those whose atypical brains encode remarkable abilities. Writing in Time magazine in 2007, the educator John Cloud faulted the “radically egalitarian” values underlying the No Child Left Behind Act, which provided little support for gifted students. Once again, it falls to parents to advocate for their children’s needs, often in the face of a hostile or indifferent educational system. Leon Botstein, president of Bard College, himself a conductor and a former wunderkind, remarked dryly, “If Beethoven were sent to nursery school today, they would medicate him, and he would be a postal clerk.”
Growing up gay in the 1970s, I encountered prejudice from the world at large that often crossed into disdain. My parents were never derisive, but they were uncomfortable with the ways I differed from them and encouraged me to try to be straight. I began researching children of difference in a quest to forgive my mother and father for pressing me to be untrue to myself. I wanted to look at the process through which parents reconcile themselves to children who throw up significant challenges. I found that many families come to celebrate children with characteristics they initially found incomprehensible — just as my parents did. Having seen how hard it was for other parents, I decided, with considerable relief, that mine had actually done a pretty good job and realized that I was ready to be a parent myself.
My research on prodigies echoed my study of children with other differences. Sue Petersen compared her experience to having a child with a wooden leg; May Armstrong saw common ground with parents of disabled children; and I realized that parenthood always entails perplexity and that the valence of that perplexity matters less than the spirit with which parents respond to it. Half the prodigies I studied seemed to be under pressure to be even more astonishing than they naturally were, and the other half, to be more ordinary than their talents. Studying their families, I gradually recognized that all parenting is guesswork, and that difference of any kind, positive or negative, makes the guessing harder. That insight has largely shaped me as a father. I don’t think I would love my children more if they could play Rachmaninoff’s Third, and I hope I wouldn’t love them less for having that consuming skill, any more than I would if they were affected with a chronic illness. But I am frankly relieved that so far, they show no such uncanny aptitude.
Nobel laureate John Steinbeck (1902-1968) might be best-known as the author of East of Eden, The Grapes of Wrath, and Of Mice and Men, but he was also a prolific letter-writer. Steinbeck: A Life in Letters constructs an alternative biography of the iconic author through some 850 of his most thoughtful, witty, honest, opinionated, vulnerable, and revealing letters to family, friends, his editor, and a circle of equally well-known and influential public figures.
Among his correspondence is this beautiful response to his eldest son Thom’s 1958 letter, in which the teenage boy confesses to have fallen desperately in love with a girl named Susan while at boarding school. Steinbeck’s words of wisdom — tender, optimistic, timeless, infinitely sagacious — should be etched onto the heart and mind of every living, breathing human being.
Here’s a simple arithmetic question: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?
The vast majority of people respond quickly and confidently, insisting the ball costs ten cents. This answer is both obvious and wrong. (The correct answer is five cents for the ball and a dollar and five cents for the bat.)
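The correct answer follows from one substitution. As a quick illustrative sketch (the variable names are mine, and I work in cents to avoid floating-point rounding), the two constraints pin down the price:

```python
# If the ball costs b cents, the bat costs b + 100 cents (a dollar more),
# and together they cost 110 cents:
#   b + (b + 100) = 110  ->  2b = 10  ->  b = 5
ball = (110 - 100) // 2   # 5 cents
bat = ball + 100          # 105 cents

assert ball + bat == 110  # total is $1.10
assert bat - ball == 100  # the bat costs exactly a dollar more
```

The intuitive answer fails the second constraint: a ten-cent ball would make the bat $1.10 and the total $1.20.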
For more than five decades, Daniel Kahneman, a Nobel Laureate and professor of psychology at Princeton, has been asking questions like this and analyzing our answers. His disarmingly simple experiments have profoundly changed the way we think about thinking. While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we’re not nearly as rational as we like to believe.
When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions. These shortcuts aren’t a faster way of doing the math; they’re a way of skipping the math altogether. Asked about the bat and the ball, we forget our arithmetic lessons and instead default to the answer that requires the least mental effort.
Although Kahneman is now widely recognized as one of the most influential psychologists of the twentieth century, his work was dismissed for years. Kahneman recounts how one eminent American philosopher, after hearing about his research, quickly turned away, saying, “I am not interested in the psychology of stupidity.”
The philosopher, it turns out, got it backward. A new study in the Journal of Personality and Social Psychology led by Richard West at James Madison University and Keith Stanovich at the University of Toronto suggests that, in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.
West and his colleagues began by giving four hundred and eighty-two undergraduates a questionnaire featuring a variety of classic bias problems. Here’s an example:
In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?
Your first response is probably to take a shortcut and simply halve the final answer. That leads you to twenty-four days. But that’s wrong. The correct solution is forty-seven days: if the patch doubles every day, it must have covered half the lake on the day before it covered all of it.
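The doubling logic is easy to verify by stepping backward from the final day. A minimal Python sketch:

```python
# Lily-pad puzzle: the patch doubles daily and covers the whole lake on day 48.
# Stepping backward, each day the patch is half the next day's size,
# so half coverage must occur exactly one day earlier.
coverage = 1.0   # fraction of the lake covered on day 48
day = 48
while coverage > 0.5:
    coverage /= 2   # one doubling earlier, the patch was half as large
    day -= 1
print(day)  # 47
```

The intuitive shortcut (halving forty-eight) treats the growth as linear, which is precisely the mental math the puzzle is designed to trip up.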
West also gave a puzzle that measured subjects’ vulnerability to something called “anchoring bias,” which Kahneman and Tversky had demonstrated in the nineteen-seventies. Subjects were first asked if the tallest redwood tree in the world was more than X feet, with X ranging from eighty-five to a thousand feet. Then the students were asked to estimate the height of the tallest redwood tree in the world. Students exposed to a small “anchor”—like eighty-five feet—guessed, on average, that the tallest tree in the world was only a hundred and eighteen feet. Given an anchor of a thousand feet, their estimates increased seven-fold.
But West and colleagues weren’t simply interested in reconfirming the known biases of the human mind. Rather, they wanted to understand how these biases correlated with human intelligence. As a result, they interspersed their tests of bias with various cognitive measurements, including the S.A.T. and the Need for Cognition Scale, which measures “the tendency for an individual to engage in and enjoy thinking.”
The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes.
Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.” This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and our inability to spot those same mistakes in ourselves. Although the bias blind spot itself isn’t a new concept, West’s latest paper demonstrates that it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.
And here’s the upsetting punch line: intelligence seems to make things worse. The scientists gave the students four measures of “cognitive sophistication.” As they report in the paper, all four of the measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.” This trend held for many of the specific biases, indicating that smarter people (at least as measured by S.A.T. scores) and those more likely to engage in deliberation were slightly more vulnerable to common mental mistakes. Education also isn’t a savior; as Kahneman and Shane Frederick first noted many years ago, more than fifty per cent of students at Harvard, Princeton, and M.I.T. gave the incorrect answer to the bat-and-ball question.
What explains this result? One provocative hypothesis is that the bias blind spot arises because of a mismatch between how we evaluate others and how we evaluate ourselves. When considering the irrational choices of a stranger, for instance, we are forced to rely on behavioral information; we see their biases from the outside, which allows us to glimpse their systematic thinking errors. However, when assessing our own bad choices, we tend to engage in elaborate introspection. We scrutinize our motivations and search for relevant reasons; we lament our mistakes to therapists and ruminate on the beliefs that led us astray.
The problem with this introspective approach is that the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.
If you live in America in the 21st century you’ve probably had to listen to a lot of people tell you how busy they are. It’s become the default response when you ask anyone how they’re doing: “Busy!” “So busy.” “Crazy busy.” It is, pretty obviously, a boast disguised as a complaint. And the stock response is a kind of congratulation: “That’s a good problem to have,” or “Better than the opposite.”
Notice it isn’t generally people pulling back-to-back shifts in the I.C.U. or commuting by bus to three minimum-wage jobs who tell you how busy they are; what those people are is not busy but tired. Exhausted. Dead on their feet. It’s almost always people whose lamented busyness is purely self-imposed: work and obligations they’ve taken on voluntarily, classes and activities they’ve “encouraged” their kids to participate in. They’re busy because of their own ambition or drive or anxiety, because they’re addicted to busyness and dread what they might have to face in its absence.
Almost everyone I know is busy. They feel anxious and guilty when they aren’t either working or doing something to promote their work. They schedule in time with friends the way students with 4.0 G.P.A.’s make sure to sign up for community service because it looks good on their college applications. I recently wrote a friend to ask if he wanted to do something this week, and he answered that he didn’t have a lot of time but if something was going on to let him know and maybe he could ditch work for a few hours. I wanted to clarify that my question had not been a preliminary heads-up to some future invitation; this was the invitation. But his busyness was like some vast churning noise through which he was shouting out at me, and I gave up trying to shout back over it.
Even children are busy now, scheduled down to the half-hour with classes and extracurricular activities. They come home at the end of the day as tired as grown-ups. I was a member of the latchkey generation and had three hours of totally unstructured, largely unsupervised time every afternoon, time I used to do everything from surfing the World Book Encyclopedia to making animated films to getting together with friends in the woods to chuck dirt clods directly into one another’s eyes, all of which provided me with important skills and insights that remain valuable to this day. Those free hours became the model for how I wanted to live the rest of my life.
The present hysteria is not a necessary or inevitable condition of life; it’s something we’ve chosen, if only by our acquiescence to it. Not long ago I Skyped with a friend who was driven out of the city by high rent and now has an artist’s residency in a small town in the south of France. She described herself as happy and relaxed for the first time in years. She still gets her work done, but it doesn’t consume her entire day and brain. She says it feels like college: she has a big circle of friends who all go out to the cafe together every night. She has a boyfriend again. (She once ruefully summarized dating in New York: “Everyone’s too busy and everyone thinks they can do better.”) What she had mistakenly assumed was her personality (driven, cranky, anxious and sad) turned out to be a deformative effect of her environment. It’s not as if any of us wants to live like this, any more than any one person wants to be part of a traffic jam or stadium trampling or the hierarchy of cruelty in high school; it’s something we collectively force one another to do.
Busyness serves as a kind of existential reassurance, a hedge against emptiness; obviously your life cannot possibly be silly or trivial or meaningless if you are so busy, completely booked, in demand every hour of the day. I once knew a woman who interned at a magazine where she wasn’t allowed to take lunch hours out, lest she be urgently needed for some reason. This was an entertainment magazine whose raison d’être was obviated when “menu” buttons appeared on remotes, so it’s hard to see this pretense of indispensability as anything other than a form of institutional self-delusion. More and more people in this country no longer make or do anything tangible; if your job wasn’t performed by a cat or a boa constrictor in a Richard Scarry book I’m not sure I believe it’s necessary. I can’t help but wonder whether all this histrionic exhaustion isn’t a way of covering up the fact that most of what we do doesn’t matter.
I am not busy. I am the laziest ambitious person I know. Like most writers, I feel like a reprobate who does not deserve to live on any day that I do not write, but I also feel that four or five hours is enough to earn my stay on the planet for one more day. On the best ordinary days of my life, I write in the morning, go for a long bike ride and run errands in the afternoon, and in the evening I see friends, read or watch a movie. This, it seems to me, is a sane and pleasant pace for a day. And if you call me up and ask whether I won’t maybe blow off work and check out the new American Wing at the Met or ogle girls in Central Park or just drink chilled pink minty cocktails all day long, I will say, what time?
But just in the last few months, I’ve insidiously started, because of professional obligations, to become busy. For the first time I was able to tell people, with a straight face, that I was “too busy” to do this or that thing they wanted me to do. I could see why people enjoy this complaint; it makes you feel important, sought-after and put-upon. Except that I hate actually being busy. Every morning my in-box was full of e-mails asking me to do things I did not want to do or presenting me with problems that I now had to solve. It got more and more intolerable until finally I fled town to the Undisclosed Location from which I’m writing this.
Here I am largely unmolested by obligations. There is no TV. To check e-mail I have to drive to the library. I go a week at a time without seeing anyone I know. I’ve remembered about buttercups, stink bugs and the stars. I read. And I’m finally getting some real writing done for the first time in months. It’s hard to find anything to say about life without immersing yourself in the world, but it’s also just about impossible to figure out what it might be, or how best to say it, without getting the hell out of it again.
Idleness is not just a vacation, an indulgence or a vice; it is as indispensable to the brain as vitamin D is to the body, and deprived of it we suffer a mental affliction as disfiguring as rickets. The space and quiet that idleness provides is a necessary condition for standing back from life and seeing it whole, for making unexpected connections and waiting for the wild summer lightning strikes of inspiration; it is, paradoxically, necessary to getting any work done. “Idle dreaming is often of the essence of what we do,” wrote Thomas Pynchon in his essay on sloth. Archimedes’ “Eureka” in the bath, Newton’s apple, Jekyll & Hyde and the benzene ring: history is full of stories of inspirations that come in idle moments and dreams. It almost makes you wonder whether loafers, goldbricks and no-accounts aren’t responsible for more of the world’s great ideas, inventions and masterpieces than the hardworking.
“The goal of the future is full unemployment, so we can play. That’s why we have to destroy the present politico-economic system.” This may sound like the pronouncement of some bong-smoking anarchist, but it was actually Arthur C. Clarke, who found time between scuba diving and pinball games to write “Childhood’s End” and think up communications satellites. My old colleague Ted Rall recently wrote a column proposing that we divorce income from work and give each citizen a guaranteed paycheck, which sounds like the kind of lunatic notion that’ll be considered a basic human right in about a century, like abolition, universal suffrage and eight-hour workdays. The Puritans turned work into a virtue, evidently forgetting that God invented it as a punishment.
Perhaps the world would soon slide to ruin if everyone behaved as I do. But I would suggest that an ideal human life lies somewhere between my own defiant indolence and the rest of the world’s endless frenetic hustle. My role is just to be a bad influence, the kid standing outside the classroom window making faces at you at your desk, urging you to just this once make some excuse and get out of there, come outside and play. My own resolute idleness has mostly been a luxury rather than a virtue, but I did make a conscious decision, a long time ago, to choose time over money, since I’ve always understood that the best investment of my limited time on earth was to spend it with people I love. I suppose it’s possible I’ll lie on my deathbed regretting that I didn’t work harder and say everything I had to say, but I think what I’ll really wish is that I could have one more beer with Chris, another long talk with Megan, one last good hard laugh with Boyd. Life is too short to be busy.