By JONATHAN SAFRAN FOER, NY Times. Published: June 8, 2013
A COUPLE of weeks ago, I saw a stranger crying in public. I was in Brooklyn’s Fort Greene neighborhood, waiting to meet a friend for breakfast. I arrived at the restaurant a few minutes early and was sitting on the bench outside, scrolling through my contact list. A girl, maybe 15 years old, was sitting on the bench opposite me, crying into her phone. I heard her say, “I know, I know, I know” over and over.
What did she know? Had she done something wrong? Was she being comforted? And then she said, “Mama, I know,” and the tears came harder.
What was her mother telling her? Never to stay out all night again? That everybody fails? Is it possible that no one was on the other end of the call, and that the girl was merely rehearsing a difficult conversation?
“Mama, I know,” she said, and hung up, placing her phone on her lap.
I was faced with a choice: I could interject myself into her life, or I could respect the boundaries between us. Intervening might make her feel worse, or be inappropriate. But then, it might ease her pain, or be helpful in some straightforward logistical way. An affluent neighborhood at the beginning of the day is not the same as a dangerous one as night is falling. And I was me, and not someone else. There was a lot of human computing to be done.
It is harder to intervene than not to, but it is vastly harder to choose to do either than to retreat into the scrolling names of one’s contact list, or whatever one’s favorite iDistraction happens to be. Technology celebrates connectedness, but encourages retreat. The phone didn’t make me avoid the human connection, but it did make ignoring her easier in that moment, and more likely, by comfortably encouraging me to forget my choice to do so. My daily use of technological communication has been shaping me into someone more likely to forget others. The flow of water carves rock, a little bit at a time. And our personhood is carved, too, by the flow of our habits.
Psychologists who study empathy and compassion are finding that unlike our almost instantaneous responses to physical pain, it takes time for the brain to comprehend the psychological and moral dimensions of a situation. The more distracted we become, and the more emphasis we place on speed at the expense of depth, the less likely and able we are to care.
Everyone wants his parent’s, or friend’s, or partner’s undivided attention — even if many of us, especially children, are getting used to far less. Simone Weil wrote, “Attention is the rarest and purest form of generosity.” By this definition, our relationships to the world, and to one another, and to ourselves, are becoming increasingly miserly.
Most of our communication technologies began as diminished substitutes for an impossible activity. We couldn’t always see one another face to face, so the telephone made it possible to keep in touch at a distance. One is not always home, so the answering machine made a kind of interaction possible without the person being near his phone. Online communication originated as a substitute for telephonic communication, which was considered, for whatever reasons, too burdensome or inconvenient. And then texting, which facilitated yet faster, and more mobile, messaging. These inventions were not created to be improvements upon face-to-face communication, but a declension of acceptable, if diminished, substitutes for it.
But then a funny thing happened: we began to prefer the diminished substitutes. It’s easier to make a phone call than to schlep to see someone in person. Leaving a message on someone’s machine is easier than having a phone conversation — you can say what you need to say without a response; hard news is easier to leave; it’s easier to check in without becoming entangled. So we began calling when we knew no one would pick up.
Shooting off an e-mail is easier, still, because one can hide behind the absence of vocal inflection, and of course there’s no chance of accidentally catching someone. And texting is even easier, as the expectation for articulateness is further reduced, and another shell is offered to hide in. Each step “forward” has made it easier, just a little, to avoid the emotional work of being present, to convey information rather than humanity.
THE problem with accepting — with preferring — diminished substitutes is that over time, we, too, become diminished substitutes. People who become used to saying little become used to feeling little.
With each generation, it becomes harder to imagine a future that resembles the present. My grandparents hoped I would have a better life than they did: free of war and hunger, comfortably situated in a place that felt like home. But what futures would I dismiss out of hand for my grandchildren? That their clothes will be fabricated every morning on 3-D printers? That they will communicate without speaking or moving?
Only those with no imagination, and no grounding in reality, would deny the possibility that they will live forever. It’s possible that many reading these words will never die. Let’s assume, though, that we all have a set number of days to indent the world with our beliefs, to find and create the beauty that only a finite existence allows for, to wrestle with the question of purpose and wrestle with our answers.
We often use technology to save time, but increasingly, it either takes the saved time along with it, or makes the saved time less present, intimate and rich. I worry that the closer the world gets to our fingertips, the further it gets from our hearts. It’s not an either/or — being “anti-technology” is perhaps the only thing more foolish than being unquestioningly “pro-technology” — but a question of balance that our lives hang upon.
Most of the time, most people are not crying in public, but everyone is always in need of something that another person can give, be it undivided attention, a kind word or deep empathy. There is no better use of a life than to be attentive to such needs. There are as many ways to do this as there are kinds of loneliness, but all of them require attentiveness, all of them require the hard work of emotional computation and corporeal compassion. All of them require the human processing of the only animal who risks “getting it wrong” and whose dreams provide shelters and vaccines and words to crying strangers.
We live in a world made up more of story than stuff. We are creatures of memory more than reminders, of love more than likes. Being attentive to the needs of others might not be the point of life, but it is the work of life. It can be messy, and painful, and almost impossibly difficult. But it is not something we give. It is what we get in exchange for having to die.
Inside a plain, warehouselike office building filled with rows of cubicles, Melissa Stark stared at the image of an envelope on a computer screen. The handwriting was barely legible and appeared to be addressed to someone in the “cty of Jesey.”
“Is that a 7 or a 9 in the address?” Ms. Stark said to no one in particular. Then she typed in a few numbers, and a list of possible addresses popped up on her screen. “Looks like a 9,” she said before selecting an address, apparently in Jersey City. The letter disappeared and another one appeared on the screen.
“That means I got it right,” Ms. Stark said.
Ms. Stark is one of the Postal Service’s data conversion operators, a techie title for someone who deciphers unreadable addresses, and she is one of the last of a breed. In September, the post office will close one of its two remaining centers where workers try to read the scribble on envelopes and address labels that machines cannot. At one time, there were 55 plants around the country where addresses rejected by machines were guessed at by workers aided by special software to get the mail where it was intended.
But improved scanning technology now allows machines to “read” virtually all of the 160 billion pieces of mail that moved through the system last year. As machines have improved, workers have been let go, and after September, the facility here will be the post office’s only center for reading illegible mail.
“We understand that these remote encoding centers were planned as a temporary fix,” said Barbara Batin, the center’s operations manager, using the facilities’ formal name. “They were created and deployed with the knowledge that new technology would eventually put us out of work.”
But for now, this center operates 365 days a year, 24 hours a day. More than 700 workers stare at images of letters, packages, change-of-address cards and other mail, trying to figure out where they are supposed to go. It is not easy work. With software, a knowledge of geography and more than a little intuition, an operator has exactly 90 seconds to move each piece of mail.
When mail-sorting machines around the country encounter addresses they cannot read, an electronic image of the bad handwriting or faded address is transmitted to operators here who view them and try to fill in the missing information by typing in a letter or a number. Once corrected, the information is returned to the processing plant where the mail is sent on to a local post office, ultimately ending up where it is supposed to go.
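For readers who think in code, the round trip described above amounts to a simple fallback loop: machine first, human second, physical inspection last. The sketch below is a toy illustration only, not the Postal Service’s software; the function names, the confidence threshold and the stand-in behaviors are all invented for clarity.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ScanResult:
        address: Optional[str]   # the machine's best guess, if any
        confidence: float        # 0.0 (unreadable) to 1.0 (certain)

    def machine_read(image: bytes) -> ScanResult:
        # Stand-in for the sorting machine's optical character recognition.
        return ScanResult(address=None, confidence=0.2)

    def operator_key_in(image: bytes) -> Optional[str]:
        # Stand-in for a remote operator deciphering the image;
        # returning None models the reject button.
        return None

    def route_mailpiece(image: bytes) -> str:
        scan = machine_read(image)
        if scan.address is not None and scan.confidence >= 0.98:
            return scan.address            # machines now handle nearly all mail
        keyed = operator_key_in(image)     # image sent to an encoding center
        if keyed is not None:
            return keyed                   # correction returned to the plant
        return "REJECT"                    # examined physically back at the plant

    print(route_mailpiece(b"<faded envelope image>"))   # prints: REJECT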
“We get the worst of the worst,” Ms. Batin said. “It used to be that we’d get letters that were somewhat legible but the machines weren’t good enough to read them. Now we get letters and packages with the most awful handwriting you can imagine. Still, it’s our job to make sure it gets to where it’s supposed to go.”
Over the years, the Postal Service has become the world leader in optical character recognition — software capable of reading computer-generated lettering and handwriting — sinking millions of dollars into equipment that can read nearly 98 percent of all hand-addressed mail and 99.5 percent of machine-addressed pieces.
That was not always the case. In the beginning, people sorted mail. As the volume and variety increased, the post office turned to automation. But the machines could read only about 35 percent of the mail at first and had trouble with handwritten addresses. So the Postal Service set up the centers, using people to supplement the scanners. At the height of the program, in 1997, the centers processed 19 billion images annually, about 10 percent of all mail at the time, the post office said.
In the last year, this center, and the one in Wichita, Kan., that will close in September, deciphered just 2.4 billion images, or a mere 1.5 percent of the mail, the post office said.
Speed is important. Each worker in this nearly football-field-length room is expected to process about 1,200 images an hour, an average of three seconds an image.
“Not everyone can process all the types of mail that we get,” said Ruth Burns, a group leader who sits in the middle of the sprawling room watching a bank of computer screens. “Some people are better at reading handwriting. Some are better at reading faded addresses. It varies.”
Rita Archuletta, who has worked at the center for 16 years, said she worked only on addresses involving letters, magazines and items listed as “undeliverable as addressed.” She does not do large envelopes, for example.
“My supervisor said my speed was too slow on those,” she said.
Ms. Archuletta said that over the years she had seen her share of impossible letters, like the one addressed to the house “down the street from the drugstore on the corner” or one intended for “the place next to the red barn.” Still, she said bad handwriting was the worst. “And most of the bad scribble seems to be coming from people back East,” she said with a smile. “They really can’t write.”
Natalie Jenkins, who started at the facility a year after it opened in 1994, said that while bad penmanship was a problem, addresses in different languages gave her the most trouble.
“We get a lot of mail from San Juan, and it’s in Spanish,” she said. “The machines can’t read it, so we have to. It does get easier after you’ve been doing it for a while. You start to recognize certain things.”
The saddest letter Ms. Jenkins has seen was addressed to God, apparently written by a little girl whose father had just died. “It broke my heart,” she said.
The best letters, Ms. Jenkins said, are those addressed to Santa Claus. They come in without an address and are sent to a processing center in Alaska, where volunteers answer them.
Back at Ms. Stark’s workstation, the image of an extremely faded letter with no discernible address appeared on the screen.
She zoomed in. “Is that a ZIP code in the corner?” she asked, staring at the image for a few seconds.
Finally, she hit the reject button. The letter would be placed in a bin back at the mail processing plant, where someone else would try to figure out the address by physically examining it.
“There are some things even we can’t read,” Ms. Stark said as another image popped up.
GREAT design, the management expert Gary Hamel once said, is like Justice Potter Stewart’s famous definition of pornography — you know it when you see it. You want it, too: brain scan studies reveal that the sight of an attractive product can trigger the part of the motor cerebellum that governs hand movement. Instinctively, we reach out for attractive things; beauty literally moves us.
Yet, while we are drawn to good design, as Mr. Hamel points out, we’re not quite sure why.
This is starting to change. A revolution in the science of design is already under way, and most people, including designers, aren’t even aware of it.
Take color. Last year, German researchers found that just glancing at shades of green can boost creativity and motivation. It’s not hard to guess why: we associate verdant colors with food-bearing vegetation — hues that promise nourishment.
This could partly explain why window views of landscapes, research shows, can speed patient recovery in hospitals, aid learning in classrooms and spur productivity in the workplace. In studies of call centers, for example, workers who could see the outdoors completed tasks 6 to 7 percent more efficiently than those who couldn’t, generating an annual savings of nearly $3,000 per employee.
In some cases the same effect can happen with a photographic or even painted mural, whether or not it looks like an actual view of the outdoors. Corporations invest heavily to understand what incentivizes employees, and it turns out that a little color and a mural could do the trick.
Simple geometry is leading to similar revelations. For more than 2,000 years, philosophers, mathematicians and artists have marveled at the unique properties of the “golden rectangle”: subtract a square from a golden rectangle, and what remains is another golden rectangle, and so on and so on — an infinite spiral. These so-called magical proportions (about 5 by 8) are common in the shapes of books, television sets and credit cards, and they provide the underlying structure for some of the most beloved designs in history: the facades of the Parthenon and Notre Dame, the face of the “Mona Lisa,” the Stradivarius violin and the original iPod.
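For readers who want the arithmetic behind that claim (a short derivation supplied here for context, not part of the original article): call the rectangle’s long-to-short ratio x, so its sides are x and 1. Cutting away a 1-by-1 square leaves sides 1 and x - 1, and self-similarity requires the leftover rectangle to have the same ratio:

    \[ \frac{x}{1} = \frac{1}{x - 1} \quad\Longrightarrow\quad x^2 - x - 1 = 0 \quad\Longrightarrow\quad x = \frac{1 + \sqrt{5}}{2} \approx 1.618 \]

The familiar 5-by-8 shape gives 8/5 = 1.6, within about 1 percent of the golden ratio, which is why “about 5 by 8” passes for golden.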
Experiments going back to the 19th century repeatedly show that people invariably prefer images in these proportions, but no one has known why.
Then, in 2009, a Duke University professor demonstrated that our eyes can scan an image fastest when its shape is a golden rectangle. For instance, it’s the ideal layout of a paragraph of text, the one most conducive to reading and retention. This simple shape speeds up our ability to perceive the world, and without realizing it, we employ it wherever we can.
Certain patterns also have universal appeal. Natural fractals — irregular, self-similar geometry — occur virtually everywhere in nature: in coastlines and riverways, in snowflakes and leaf veins, even in our own lungs. In recent years, physicists have found that people invariably prefer a certain mathematical density of fractals — not too thick, not too sparse. The theory is that this particular pattern echoes the shapes of trees, specifically the acacia, on the African savanna, the place stored in our genetic memory from the cradle of the human race. To paraphrase one biologist, beauty is in the genes of the beholder — home is where the genome is.
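The “density” being measured here is what mathematicians call fractal dimension, D, usually estimated by box counting; the definition below is standard textbook material, added for context rather than drawn from the article. Cover the pattern with boxes of side s, count the number N(s) that contain any part of it, and take the scaling exponent:

    \[ D = \lim_{s \to 0} \frac{\log N(s)}{\log (1/s)} \]

A smooth line has D = 1 and a solidly inked plane has D = 2; the Koch curve, built by forever replacing each segment with four segments one-third as long, falls in between at D = log 4 / log 3 ≈ 1.26. The Pollock figure quoted below, about 1.3, sits near that sparse end of the scale.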
LIFE magazine named Jackson Pollock “the greatest living painter in the United States” in 1949, when he was creating canvases now known to conform to the optimal fractal density (about 1.3 on a scale of 1 to 2 from void to solid). Could Pollock’s late paintings result from his lifelong effort to excavate an image buried in all of our brains?
We respond so dramatically to this pattern that it can reduce stress levels by as much as 60 percent — just by being in our field of vision. One researcher has calculated that since Americans spend $300 billion a year dealing with stress-related illness, the economic benefits of these shapes, widely applied, could be in the billions.
It should come as no surprise that good design, often in very subtle ways, can have such dramatic effects. After all, bad design works the other way: poorly designed computers can injure your wrists, awkward chairs can strain your back and over-bright lighting and computer screens can fatigue your eyes.
We think of great design as art, not science, a mysterious gift from the gods, not something that results just from diligent and informed study. But if every designer understood more about the mathematics of attraction, the mechanics of affection, all design — from houses to cellphones to offices and cars — could both look good and be good for you.
A COUPLE of evolutionary psychologists recently published a book about human sexual behavior in prehistory called “Sex at Dawn.” Upon hearing of the project, one colleague, dubious that a modern scholar could hope to know anything about that period, asked them, “So what do you do, close your eyes and dream?”
Actually, it’s a little more involved. Evolutionary psychologists who study mating behavior often begin with a hypothesis about how modern humans mate: say, that men think about sex more than women do. Then they gather evidence — from studies, statistics and surveys — to support that assumption. Finally, and here’s where the leap occurs, they construct an evolutionary theory to explain why men think about sex more than women, where that gender difference came from, what adaptive purpose it served in antiquity, and why we’re stuck with the consequences today.
Lately, however, a new cohort of scientists has been challenging the very existence of the gender differences in sexual behavior that Darwinians have spent the past 40 years trying to explain and justify on evolutionary grounds.
Of course, no fossilized record can really tell us how people behaved or thought back then, much less why they behaved or thought as they did. Nonetheless, something funny happens when social scientists claim that a behavior is rooted in our evolutionary past. Assumptions about that behavior take on the immutability of a physical trait — they come to seem as biologically rooted as opposable thumbs or ejaculation.
Using evolutionary psychology to back up these assumptions about men and women is nothing new. In “The Descent of Man, and Selection in Relation to Sex,” Charles Darwin gathered evidence for the notion that, through competition for mates and sustenance, natural selection had encouraged man’s “more inventive genius” while nurturing woman’s “greater tenderness.” In this way, he suggested that the gender differences he saw around him — men sought power and made money; women stayed at home — weren’t simply the way things were in Victorian England. They were the way things had always been.
A century later, a new batch of scientists began applying Darwinian doctrine to the conduct of mating, and specifically to three assumptions that endure to this day: men are less selective about whom they’ll sleep with; men like casual sex more than women; and men have more sexual partners over a lifetime.
In 1972, Robert L. Trivers, a graduate student at Harvard, addressed that first assumption in one of evolutionary psychology’s landmark studies, “Parental Investment and Sexual Selection.” He argued that women are more selective about whom they mate with because they’re biologically obliged to invest more in offspring. Given the relative paucity of ova and plenitude of sperm, as well as the unequal feeding duties that fall to women, men invest less in children. Therefore, men should be expected to be less discriminating and more aggressive in competing for females.
It was an elegant, powerful application of evolutionary theory to the mating game. The evolutionary psychologists of the 1980s and ’90s built on Mr. Trivers’s theory to explain a wide array of stereotypical gender differences in mating.
In 1993, David M. Buss and David P. Schmitt used parental investment theory to explain why men should be expected to “devote a larger proportion of their total mating effort to short-term mating.” Because men invested less time and effort in their offspring, they evolved toward promiscuity, while women evolved away from it. Promiscuity, the researchers hypothesized, would have been more damaging to the female reputation than to the male reputation. If a man mated with a promiscuous woman, he would never be able to ensure his paternity. Men, on the other hand, could potentially enhance their status by pursuing a short-term mating strategy. (Think Kennedy, Clinton, Spitzer, Letterman and so forth. My space is limited.)
One of the earliest critics of this kind of thinking was Stephen Jay Gould. He wrote in 1997 that parental investment theory “will not explain the full panoply of supposed sexual differences so dear to pop psychology.” Mr. Gould felt that the field had become overrun with “ultra-Darwinians,” and that evolutionary psychology would be a more fruitful science if it didn’t limit itself “to the blinkered view” that evolutionary explanations accounted for every difference.
BUT if evolution didn’t determine human behavior, what did? The most common explanation is the effect of cultural norms. That, for instance, society tends to view promiscuous men as normal and promiscuous women as troubled outliers, or that our “social script” requires men to approach women while the pickier women do the selecting. Over the past decade, sociocultural explanations have gained steam.
Take the question of promiscuity. Everyone has always assumed — and early research had shown — that women desired fewer sexual partners over a lifetime than men. But in 2003, two behavioral psychologists, Michele G. Alexander and Terri D. Fisher, published the results of a study that used a “bogus pipeline” — a fake lie detector. When asked about actual sexual partners, rather than just theoretical desires, the participants who were not attached to the fake lie detector displayed typical gender differences. Men reported having had more sexual partners than women. But when participants believed that lies about their sexual history would be revealed by the fake lie detector, gender differences in reported sexual partners vanished. In fact, women reported slightly more sexual partners (a mean of 4.4) than did men (a mean of 4.0).
In 2009, another long-assumed gender difference in mating — that women are choosier than men — also came under siege. In speed dating, as in life, the social norm instructs women to sit in one place, waiting to be approached, while the men rotate tables. But in one study of speed-dating behavior, the evolutionary psychologists Eli J. Finkel and Paul W. Eastwick switched the “rotator” role. The men remained seated and the women rotated. By manipulating this component of the gender script, the researchers discovered that women became less selective — they behaved more like stereotypical men — while men were more selective and behaved more like stereotypical women. The mere act of physically approaching a potential romantic partner, they argued, engendered more favorable assessments of that person.
Recently, a third pillar appeared to fall. To back up the assumption that an enormous gap exists between men’s and women’s attitudes toward casual sex, evolutionary psychologists typically cite a classic study published in 1989. Men and women on a college campus were approached in public and propositioned with offers of casual sex by “confederates” who worked for the study. The confederate would say: “I have been noticing you around campus and I find you to be very attractive.” The confederate would then ask one of three questions: (1) “Would you go out with me tonight?” (2) “Would you come over to my apartment tonight?” or (3) “Would you go to bed with me tonight?”
Roughly equal numbers of men and women agreed to the date. But women were much less likely to agree to go to the confederate’s apartment. As for going to bed with the confederate, zero women said yes, while about 70 percent of males agreed.
Those results seemed definitive — until a few years ago, when Terri D. Conley, a psychologist at the University of Michigan, set out to re-examine what she calls “one of the largest documented sexuality gender differences,” that men have a greater interest in casual sex than women.
Ms. Conley found the methodology of the 1989 paper to be less than ideal. “No one really comes up to you in the middle of the quad and asks, ‘Will you have sex with me?’ ” she told me recently. “So there needs to be a context for it. If you ask people what they would do in a specific situation, that’s a far more accurate way of getting responses.” In her study, when men and women considered offers of casual sex from famous people, or offers from close friends whom they were told were good in bed, the gender differences in acceptance of casual-sex proposals evaporated nearly to zero.
IN light of this new research, will Darwinians consider revising their theories to reflect the possibility that our mating behavior is less hard-wired than they had believed?
Probably not. In an article responding to the new studies last year, Mr. Schmitt, a leading voice among hard-line Darwinians, ceded no ground. Addressing Ms. Conley’s finding that women were more likely to agree to casual sex with a celebrity, Mr. Schmitt argued that this resulted from “women’s (but not men’s) short-term mating psychology being specially designed to obtain good genes from physically attractive short-term partners.” He continued: “When women’s short-term-mating aim is activated (perhaps, temporarily, because of, e.g., high-fertility ovulatory status or desire for an extramarital affair, or more chronically, because of, e.g., a female-biased local sex ratio or a history of insecure parent-child attachment), they appear to express relatively focused desires for genetic traits in ‘sexy men’ that would biologically benefit women when short-term mating.”
In other words: Nothing new here, it’s all evolution.
Steven Pinker, the Harvard psychologist and popular author, also backs the Darwinians, whom he says still have the weight of evidence on their side. “A study which shows you can push some phenomenon around a bit at the margins,” he wrote to me in an e-mail, “is of dubious relevance to whether the phenomenon exists.”
But the fact that some gender differences can be manipulated, if not eliminated, by controlling for cultural norms suggests that the explanatory power of evolution can’t sustain itself when applied to mating behavior. This wouldn’t be the first time we’ve pushed these theories too far. How many stereotypical racial and ethnic differences, once declared evolutionarily determined under the banner of science, have been revealed instead as vestiges of power dynamics from earlier societies?
Citing the speed-dating study, Mr. Pinker added, “The only reason this flawed paper was published was that it challenged an evolutionary hypothesis … in particular a sex difference — as the Larry Summers incident shows, claims about sex differences are still politically inflammatory in the academy.” Here, he was referring to the much criticized 2005 comments Mr. Summers made when he was Harvard’s president suggesting that women’s underrepresentation in science and engineering was attributable not to socialization but to “different availability of aptitude at the high end.”
Perhaps these phenomena exist. Perhaps men do, over all, pursue more short-term mating. But given new research, continued rigid reliance on evolution as an explanation seems to risk elevating a limited guide to teleological status — a way of thinking that scientists should abhor.
“Some sexual features are deeply rooted in evolutionary heritage, such as the sex response and how quickly it takes men and women to become aroused,” said Paul Eastwick, a co-author of the speed-dating study. “However, if you’re looking at features such as how men and women regulate themselves in society to achieve specific goals, I believe those features are unlikely to have evolved sex differences. I consider myself an evolutionary psychologist. But many evolutionary psychologists don’t think this way. They think these features are getting shaped and honed by natural selection all the time.” How far does Darwin go in explaining human behavior?
WHEN The New York Times called, inquiring if I might pen a few words “from the horse’s mouth” about hypochondria, I confess I was taken aback. What light could I possibly shed on this type of crackpot behavior since, contrary to popular belief, I am not a hypochondriac but a totally different genus of crackpot?
What I am is an alarmist, which is in the same ballpark as the hypochondriac or, should I say, the same emergency room. Still, there is a fundamental difference. I don’t experience imaginary maladies — my maladies are real.
What distinguishes my hysteria is that at the appearance of the mildest symptom, let’s say chapped lips, I instantly leap to the conclusion that the chapped lips indicate a brain tumor. Or maybe lung cancer. In one instance I thought it was Mad Cow.
The point is, I am always certain I’ve come down with something life threatening. It matters little that few people are ever found dead of chapped lips. Every minor ache or pain sends me to a doctor’s office in need of reassurance that my latest allergy will not require a heart transplant, or that I have misdiagnosed my hives and it’s not possible for a human being to contract elm blight.
Unfortunately, my wife bears the brunt of these pathological dramas. Like the time I awoke at 3 a.m. with a spot on my neck that to me clearly had the earmarks of a melanoma. That it turned out to be a hickey was confirmed only later at the hospital after much wailing and gnashing of teeth. Sitting at an ungodly hour in the emergency room where my wife tried to talk me down, I was making my way through the five stages of grief and was up to either “denial” or “bargaining” when a young resident fixed me with a rather supercilious eye and said sarcastically, “Your hickey is benign.”
But why should I live in such constant terror? I take great care of myself. I have a personal trainer who has me up to 50 push-ups a month, and combined with my knee bends and situps, I can now press the 100-pound barbell over my head with only minimal tearing of my stomach wall. I never smoke and I watch what I eat, carefully avoiding any foods that give pleasure. (Basically, I adhere to the Mediterranean diet of olive oil, nuts, figs and goat cheese, and except for the occasional impulse to become a rug salesman, it works.) In addition to yearly physicals I get all available vaccines and inoculations, making me immune to everything from Whipple’s disease to the Andromeda strain.
As far as vitamins go, if I take a few with each meal, over time I can usually get in quite a lot before the latest study confirms they’re worthless. Regarding medications, I’m flexible but prudent because while it’s true antibiotics kill bad bacteria, I’m always afraid they’ll kill my good bacteria, not to mention my pheromones, and then I won’t give off any sexual vibes in a crowded elevator.
It’s also true that when I leave the house to go for a stroll in Central Park or to Starbucks for a latte I might just pick up a quick cardiogram or CT scan prophylactically. My wife calls this nonsense and says that in the end it’s all genetic. My parents both lived to ripe old ages but absolutely refused to pass their genes to me as they believed an inheritance often spoils the child.
Even when the results of my yearly checkup show perfect health, how can I relax knowing that the minute I leave the doctor’s office something may start growing in me and, by the time a full year rolls around, my chest X-ray will look like a Jackson Pollock? Incidentally, this relentless preoccupation with health has made me quite the amateur medical expert. Not that I don’t make an occasional mistake — but what doctor doesn’t? For example, I once convinced a woman who experienced a mild ringing in her ears that she had the flesh-eating bacteria, and another time I pronounced a man dead who had simply dozed off in a chair.
But what’s this obsession with personal vulnerability? When I panic over symptoms that require no more than an aspirin or a little calamine lotion, what is it I’m really frightened of? My best guess is dying. I have always had an animal fear of death, a fate I rank second only to having to sit through a rock concert. My wife tries to be consoling about mortality and assures me that death is a natural part of life, and that we all die sooner or later. Oddly this news, whispered into my ear at 3 a.m., causes me to leap screaming from the bed, snap on every light in the house and play my recording of “The Stars and Stripes Forever” at top volume till the sun comes up.
I sometimes imagine that death might be more tolerable if I passed away in my sleep, although the reality is, no form of dying is acceptable to me with the possible exception of being kicked to death by a pair of scantily clad cocktail waitresses.
Perhaps if I were a religious person, which I am not, although I sometimes do have the intimation that we all may be part of something larger — like a Ponzi scheme. A great Spanish philosopher wrote that all humans long for “the eternal persistence of consciousness.” Not an easy state to maintain, especially when you’re dining with people who keep talking about their children.
And yet, there are worse things than death. Many of them playing at a theater near you. For instance, I would not like to survive a stroke and for the rest of my life talk out of the side of my mouth like a racetrack tout. I would also not like to go into a coma, to lie in a hospital bed where I’m not dead but can’t even blink my eyes and signal the nurse to switch the channel from Fox News. And incidentally, who’s to say the nurse isn’t one of those angel of death crazies who hates to see people suffer and fills my intravenous glucose bag with Exxon regular.
Worse than death, too, is to be on life support listening to my loved ones in a heated debate over whether to terminate me and hear my wife say, “I think we can pull the plug, it’s been 15 minutes and we’ll be late for our dinner reservation.”
What worries me most is winding up a vegetable — any vegetable, and that includes corn, which under happier circumstances I rather like. And yet is it really so great to live forever? Sometimes in the news I see features about certain tall people who reside in snow-capped regions where a whole village population lives to 140 or so. Of course all they ever eat is yogurt, and when they finally do die they are not embalmed but pasteurized. And don’t forget these healthy people walk everyplace because try getting a cab in the Himalayas. I mean do I really want to pass my days in some remote place where the main entertainment is seeing which guy in town can lift the ox highest with his bare hands?
Summing up, there are two distinct groups, hypochondriacs and alarmists. Both suffer in their own ways, and traits of one group may overlap the other, but whether you’re a hypochondriac or an alarmist, at this point in time, either is probably better than being a Republican.
By JOHN TIERNEY, NY Times. Published: January 3, 2013
When we remember our past selves, they seem quite different. We know how much our personalities and tastes have changed over the years. But when we look ahead, somehow we expect ourselves to stay the same, a team of psychologists said Thursday, describing research they conducted on people’s self-perceptions.
They called this phenomenon the “end of history illusion,” in which people tend to “underestimate how much they will change in the future.” According to their research, which involved more than 19,000 people ages 18 to 68, the illusion persists from teenage years into retirement.
“Middle-aged people — like me — often look back on our teenage selves with some mixture of amusement and chagrin,” said one of the authors, Daniel T. Gilbert, a psychologist at Harvard. “What we never seem to realize is that our future selves will look back and think the very same thing about us. At every age we think we’re having the last laugh, and at every age we’re wrong.”
Other psychologists said they were intrigued by the findings, published Thursday in the journal Science, and were impressed with the amount of supporting evidence. Participants were asked about their personality traits and preferences — their favorite foods, vacations, hobbies and bands — in years past and present, and then asked to make predictions for the future. Not surprisingly, the younger people in the study reported more change in the previous decade than did the older respondents.
But when asked to predict what their personalities and tastes would be like in 10 years, people of all ages consistently played down the potential changes ahead.
Thus, the typical 20-year-old woman’s predictions for her next decade were not nearly as radical as the typical 30-year-old woman’s recollection of how much she had changed in her 20s. This sort of discrepancy persisted among respondents all the way into their 60s.
And the discrepancy did not seem to be because of faulty memories, because the personality changes recalled by people jibed quite well with independent research charting how personality traits shift with age. People seemed to be much better at recalling their former selves than at imagining how much they would change in the future.
Why? Dr. Gilbert and his collaborators, Jordi Quoidbach of Harvard and Timothy D. Wilson of the University of Virginia, had a few theories, starting with the well-documented tendency of people to overestimate their own wonderfulness.
“Believing that we just reached the peak of our personal evolution makes us feel good,” Dr. Quoidbach said. “The ‘I wish that I knew then what I know now’ experience might give us a sense of satisfaction and meaning, whereas realizing how transient our preferences and values are might lead us to doubt every decision and generate anxiety.”
Or maybe the explanation has more to do with mental energy: predicting the future requires more work than simply recalling the past. “People may confuse the difficulty of imagining personal change with the unlikelihood of change itself,” the authors wrote in Science.
The phenomenon does have its downsides, the authors said. For instance, people make decisions in their youth — about getting a tattoo, say, or a choice of spouse — that they sometimes come to regret.
And that illusion of stability could lead to dubious financial expectations, as the researchers showed in an experiment asking people how much they would pay to see their favorite bands.
When asked about their favorite band from a decade ago, respondents were typically willing to shell out $80 to attend a concert of the band today. But when they were asked about their current favorite band and how much they would be willing to spend to see the band’s concert in 10 years, the price went up to $129. Even though they realized that favorites from a decade ago like Creed or the Dixie Chicks have lost some of their luster, they apparently expect Coldplay and Rihanna to blaze on forever.
“The end-of-history effect may represent a failure in personal imagination,” said Dan P. McAdams, a psychologist at Northwestern who has done separate research into the stories people construct about their past and future lives. He has often heard people tell complex, dynamic stories about the past but then make vague, prosaic projections of a future in which things stay pretty much the same.
Dr. McAdams was reminded of a conversation with his 4-year-old daughter during the craze for Teenage Mutant Ninja Turtles in the 1980s. When he told her they might not be her favorite thing one day, she refused to acknowledge the possibility. But later, in her 20s, she confessed to him that some part of her 4-year-old mind had realized he might be right.
“She resisted the idea of change, as it dawned on her at age 4, because she could not imagine what else she would ever substitute for the Turtles,” Dr. McAdams said. “She had a sneaking suspicion that she would change, but she couldn’t quite imagine how, so she stood with her assertion of continuity. Maybe something like this goes on with all of us.”
Three days before 20-year-old Adam Lanza killed his mother, then opened fire on a classroom full of Connecticut kindergartners, my 13-year-old son Michael (name changed) missed his bus because he was wearing the wrong color pants.
“I can wear these pants,” he said, his tone increasingly belligerent, the black-hole pupils of his eyes swallowing the blue irises.
“They are navy blue,” I told him. “Your school’s dress code says black or khaki pants only.”
“They told me I could wear these,” he insisted. “You’re a stupid bitch. I can wear whatever pants I want to. This is America. I have rights!”
“You can’t wear whatever pants you want to,” I said, my tone affable, reasonable. “And you definitely cannot call me a stupid bitch. You’re grounded from electronics for the rest of the day. Now get in the car, and I will take you to school.”
I live with a son who is mentally ill. I love my son. But he terrifies me.
A few weeks ago, Michael pulled a knife and threatened to kill me and then himself after I asked him to return his overdue library books. His 7- and 9-year-old siblings knew the safety plan — they ran to the car and locked the doors before I even asked them to. I managed to get the knife from Michael, then methodically collected all the sharp objects in the house into a single Tupperware container that now travels with me. Through it all, he continued to scream insults at me and threaten to kill or hurt me.
That conflict ended with three burly police officers and a paramedic wrestling my son onto a gurney for an expensive ambulance ride to the local emergency room. The mental hospital didn’t have any beds that day, and Michael calmed down nicely in the ER, so they sent us home with a prescription for Zyprexa and a follow-up visit with a local pediatric psychiatrist.
We still don’t know what’s wrong with Michael. Autism spectrum, ADHD, Oppositional Defiant or Intermittent Explosive Disorder have all been tossed around at various meetings with probation officers and social workers and counselors and teachers and school administrators. He’s been on a slew of antipsychotic and mood-altering pharmaceuticals, a Russian novel of behavioral plans. Nothing seems to work.
At the start of seventh grade, Michael was accepted to an accelerated program for highly gifted math and science students. His IQ is off the charts. When he’s in a good mood, he will gladly bend your ear on subjects ranging from Greek mythology to the differences between Einsteinian and Newtonian physics to Doctor Who. He’s in a good mood most of the time. But when he’s not, watch out. And it’s impossible to predict what will set him off.
Several weeks into his new junior high school, Michael began exhibiting increasingly odd and threatening behaviors at school. We decided to transfer him to the district’s most restrictive behavioral program, a contained school environment where children who can’t function in normal classrooms can access their right to free public babysitting from 7:30 to 1:50, Monday through Friday, until they turn 18.
The morning of the pants incident, Michael continued to argue with me on the drive. He would occasionally apologize and seem remorseful. Right before we turned into his school parking lot, he said, “Look, Mom, I’m really sorry. Can I have video games back today?”
“No way,” I told him. “You cannot act the way you acted this morning and think you can get your electronic privileges back that quickly.”
His face turned cold, and his eyes were full of calculated rage. “Then I’m going to kill myself,” he said. “I’m going to jump out of this car right now and kill myself.”
That was it. After the knife incident, I told him that if he ever said those words again, I would take him straight to the mental hospital, no ifs, ands, or buts. I did not respond, except to pull the car into the opposite lane, turning left instead of right.
“Where are you taking me?” he said, suddenly worried. “Where are we going?”
“You know where we are going,” I replied.
“No! You can’t do that to me! You’re sending me to hell! You’re sending me straight to hell!”
I pulled up in front of the hospital, frantically waving for one of the clinicians who happened to be standing outside. “Call the police,” I said. “Hurry.”
Michael was in a full-blown fit by then, screaming and hitting. I hugged him close so he couldn’t escape from the car. He bit me several times and repeatedly jabbed his elbows into my rib cage. I’m still stronger than he is, but I won’t be for much longer.
The police came quickly and carried my son screaming and kicking into the bowels of the hospital. I started to shake, and tears filled my eyes as I filled out the paperwork — “Were there any difficulties with … at what age did your child … were there any problems with … has your child ever experienced … does your child have …”
At least we have health insurance now. I recently accepted a position with a local college, giving up my freelance career because when you have a kid like this, you need benefits. You’ll do anything for benefits. No individual insurance plan will cover this kind of thing.
For days, my son insisted that I was lying — that I made the whole thing up so that I could get rid of him. The first day, when I called to check up on him, he said, “I hate you. And I’m going to get my revenge as soon as I get out of here.”
By day three, he was my calm, sweet boy again, all apologies and promises to get better. I’ve heard those promises for years. I don’t believe them anymore.
On the intake form, under the question, “What are your expectations for treatment?” I wrote, “I need help.”
Drew Petersen didn’t speak until he was 3½, but his mother, Sue, never believed he was slow. When he was 18 months old, in 1994, she was reading to him and skipped a word, whereupon Drew reached over and pointed to the missing word on the page. Drew didn’t produce much sound at that stage, but he already cared about it deeply. “Church bells would elicit a big response,” Sue told me. “Birdsong would stop him in his tracks.”
Sue, who learned piano as a child, taught Drew the basics on an old upright, and he became fascinated by sheet music. “He needed to decode it,” Sue said. “So I had to recall what little I remembered, which was the treble clef.” As Drew told me, “It was like learning 13 letters of the alphabet and then trying to read books.” He figured out the bass clef on his own, and when he began formal lessons at 5, his teacher said he could skip the first six months’ worth of material. Within the year, Drew was performing Beethoven sonatas at the recital hall at Carnegie Hall. “I thought it was delightful,” Sue said, “but I also thought we shouldn’t take it too seriously. He was just a little boy.”
On his way to kindergarten one day, Drew asked his mother, “Can I just stay home so I can learn something?” Sue was at a loss. “He was reading textbooks this big, and they’re in class holding up a blowup M,” she said. Drew, who is now 18, said: “At first, it felt lonely. Then you accept that, yes, you’re different from everyone else, but people will be your friends anyway.” Drew’s parents moved him to a private school. They bought him a new piano, because he announced at 7 that their upright lacked dynamic contrast. “It cost more money than we’d ever paid for anything except a down payment on a house,” Sue said. When Drew was 14, he discovered a home-school program created by Harvard; when I met him two years ago, he was 16, studying at the Manhattan School of Music and halfway to a Harvard bachelor’s degree.
Prodigies are able to function at an advanced adult level in some domain before age 12. “Prodigy” derives from the Latin “prodigium,” a monster that violates the natural order. These children have differences so evident as to resemble a birth defect, and it was in that context that I came to investigate them. Having spent 10 years researching a book about children whose experiences differ radically from those of their parents and the world around them, I found that stigmatized differences — having Down syndrome, autism or deafness; being a dwarf or being transgender — are often clouds with silver linings. Families grappling with these apparent problems may find profound meaning, even beauty, in them. Prodigiousness, conversely, looks from a distance like silver, but it comes with banks of clouds; genius can be as bewildering and hazardous as a disability. Despite the past century’s breakthroughs in psychology and neuroscience, prodigiousness and genius are as little understood as autism. “Genius is an abnormality, and can signal other abnormalities,” says Veda Kaplinsky of Juilliard, perhaps the world’s pre-eminent teacher of young pianists. “Many gifted kids have A.D.D. or O.C.D. or Asperger’s. When the parents are confronted with two sides of a kid, they’re so quick to acknowledge the positive, the talented, the exceptional; they are often in denial over everything else.”
We live in ambitious times. You need only to go through the New York preschool application process, as I recently did for my son, to witness the hysteria attached to early achievement, the widespread presumption that a child’s destiny hinges on getting a baby foot on a tall ladder. Parental obsessiveness on this front reflects the hegemony of developmental psychiatry, with its insistence that first experience is formative. We now know that brain plasticity diminishes over time; it is easier to mold a child than to reform an adult. What are we to do with this information? I would hate for my children to feel that their worth is contingent on sustaining competitive advantage, but I’d also hate for them to fall short of their potential. Tiger mothers who browbeat their children into submission overemphasize a narrow category of achievement over psychic health. Attachment parenting, conversely, often sacrifices accomplishment to an ideal of unboundaried acceptance that can be equally pernicious. It’s tempting to propose some universal answer, but spending time with families of remarkably talented children showed me that what works for one child can be disastrous for another.
Children who are pushed toward success and succeed have a very different trajectory from that of children who are pushed toward success and fail. I once told Lang Lang, a prodigy par excellence and now perhaps the most famous pianist in the world, that by American standards, his father’s brutal methods — which included telling him to commit suicide, refusing any praise, browbeating him into abject submission — would count as child abuse. “If my father had pressured me like this and I had not done well, it would have been child abuse, and I would be traumatized, maybe destroyed,” Lang responded. “He could have been less extreme, and we probably would have made it to the same place; you don’t have to sacrifice everything to be a musician. But we had the same goal. So since all the pressure helped me become a world-famous star musician, which I love being, I would say that, for me, it was in the end a wonderful way to grow up.”
While it is true that some parents push their kids too hard and give them breakdowns, others fail to support a child’s passion for his own gift and deprive him of the only life that he would have enjoyed. You can err in either direction. Given that there is no consensus about how to raise ordinary children, it is not surprising that there is none about how to raise remarkable children. Like parents of children who are severely challenged, parents of exceptionally talented children are custodians of young people beyond their comprehension.
Spending time with the Petersens, I was struck not only by their mutual devotion but also by the easy way they avoided the snobberies that tend to cling to classical music. Sue is a school nurse; her husband, Joe, works in the engineering department of Volkswagen. They never expected the life into which Drew has led them, but they have neither been intimidated by it nor brash in pursuing it; it remains both a diligence and an art. “How do you describe a normal family?” Joe said. “The only way I can describe a normal one is a happy one. What my kids do brings a lot of joy into this household.” When I asked Sue how Drew’s talent had affected how they reared his younger brother, Erik, she said: “It’s distracting and different. It would be similar if Erik’s brother had a disability or a wooden leg.”
Prodigiousness manifests most often in athletics, mathematics, chess and music. A child may have a brain that processes chess moves or mathematical equations like some dream computer, which is its own mystery, but how can the mature emotional insight that is necessary to musicianship emerge from someone who is immature? “Young people like romance stories and war stories and good-and-evil stories and old movies because their emotional life mostly is and should be fantasy,” says Ken Noda, a great piano prodigy in his day who gave up public performance and now works at the Metropolitan Opera. “They put that fantasized emotion into their playing, and it is very convincing. I had an amazing capacity for imagining these feelings, and that’s part of what talent is. But it dries up, in everyone. That’s why so many prodigies have midlife crises in their late teens or early 20s. If our imagination is not replenished with experience, the ability to reproduce these feelings in one’s playing gradually diminishes.”
Musicians often talked to me about whether you achieve brilliance on the violin by practicing for hours every day or by reading Shakespeare, learning physics and falling in love. “Maturity, in music and in life, has to be earned by living,” the violinist Yehudi Menuhin once said. Who opens up or blocks access to such living? A musical prodigy’s development hinges on parental collaboration. Without that support, the child would never gain access to an instrument, the technical training that even the most devout genius requires or the emotional nurturance that enables a musician to achieve mature expression. As David Henry Feldman and Lynn T. Goldsmith, scholars in the field, have said, “A prodigy is a group enterprise.”
Some prodigies seem to trade on a splinter skill — an ability in music that occupies their whole consciousness, leaving them virtually incompetent in all other areas. Others have a dazzling capacity for achievement in general and select music from among multitudinous gifts. Mikhail and Natalie Paremski held comfortable positions within the Soviet system: Mikhail with the Kurchatov Institute of Atomic Energy; Natalie with the Moscow Engineering Physics Institute. Their daughter, Natasha, born in 1987, showed a precocious interest in the piano. “I was in the kitchen, and I thought, Who is playing?” Natalie recalls. “Then I saw: it’s the baby, picking out nursery songs.” By the time she was 4, Natasha had played a Chopin mazurka in a children’s concert.
After the Soviet Union collapsed, Mikhail emigrated to California; the family followed in 1995. Natasha entered fourth grade, two years younger than her classmates. Within months, she was speaking English without an accent and coming in first on every school test. The family couldn’t afford a good piano; they finally found a cheap one that “sounded like cabbage,” Natasha recalls, and she began performing Haydn concertos, Beethoven sonatas and Chopin études. “Everyone would say, ‘You must be so proud of your daughter,’ ” Natalie told me. “I used to say that it’s not for me to be proud; it’s Natasha who does this herself — but I learned that this is not the polite American way. So now I always say, ‘I am so proud of my daughter,’ and then maybe we can have a conversation.” Natasha agreed. “What did they do to make me practice?” she asked when I first interviewed her, at 16. “What did they do to make me eat or sleep?”
Natasha graduated with top honors from high school at 14 and was offered a full scholarship by Mannes College the New School for Music in New York. Her mother worried about a deficit of soul in New York. “There is no time for vision! People are just struggling to survive, like in Moscow,” Natalie said — to which her daughter replied, “Vision is how I survive.” In those early New York days, Natasha and her mother spoke by phone constantly. Nonetheless, Natalie said, “that was my present to her: I gave her her own life.”
In 2004, when Natasha was 16, I went to her Carnegie Hall debut, for which she played Rachmaninoff’s Piano Concerto No. 2. She’s a beautiful young woman, with cascades of hair and a sylphlike figure, and she wore a sleeveless, black velvet dress, so her arms would feel free, and a pair of insanely high heels that she said gave her better leverage on the pedals. Her parents were not there. “They’re too supportive to come,” Natasha told me just before the concert. Afterward, Natalie explained, “If I am there, I am so worried about every single note that I can’t even sit still. It’s not helpful to Natasha.”
Natasha later said she saw nothing strange in a musician’s ability to express emotions she has not experienced. “Had I experienced them, that wouldn’t necessarily help me to express them better in my music. I’m an actress, not a character; my job is to represent something, not to live it. Chopin wrote a mazurka, Person X in the audience wants to hear the mazurka and so I have to decipher the score and make it apprehensible to Person X, and it’s really hard to do. But it has nothing to do with my life experience.”
After the English lawyer Daines Barrington examined the 8-year-old Mozart in 1764, he wrote: “He had a thorough knowledge of the fundamental principles of composition. He was also a great master of modulation, and his transitions from one key to another were excessively natural and judicious.” Yet, Mozart was also clearly a child. “Whilst he was playing to me, a favorite cat came in, upon which he immediately left his harpsichord, nor could we bring him back for a considerable time. He would also sometimes run about the room with a stick between his legs by way of horse.”
Every prodigy is a chimera of such mastery and childishness, and the contrast between musical sophistication and personal immaturity can be striking. One prodigy I interviewed switched from the violin to the piano when she was 7. She offered to tell me why if I didn’t tell her mother. “I wanted to sit down,” she said.
Chloe Yu was born in Macao and came to the United States to study when she was 17. She married at 25, and her son, Marc, was born a year later, in Pasadena, Calif. While she was pregnant, Chloe played the piano to him. When Marc was almost 3, he picked out a few tunes on the piano with two fingers; within a few months, Chloe had found him a teacher advanced enough to respond to his emerging talent. At 5, he added the cello to his regimen. “Soon he asked for more instruments,” Chloe told me. “I said: ‘That’s it, Marc. Be realistic. Two is enough.’ ”
Chloe gave up on the master’s degree she was working on. She had divorced Marc’s father, but because she had no money, she and Marc ended up living with her ex-in-laws, in a room over the garage. Marc’s grandparents did not approve of his “excessive” devotion to the piano. “His grandmother loves him a lot,” Chloe said. “But she just wanted him to be a normal 5-year-old.” When Marc was in preschool, Chloe felt he was ready to perform, and she contacted local retirement facilities and hospitals to offer free recitals. Soon the papers were writing about this young genius. “When I began to understand how talented he is, I was so excited!” Chloe said. “And also so afraid!”
At 6, Marc won a fellowship for gifted youth that covered the down payment on a Steinway. By the time Marc was 8, he and Chloe were flying to China frequently for lessons; Chloe explained that whereas her son’s American teachers gave him broad interpretive ideas to explore freely, his Chinese teacher taught measure by measure. I asked Marc whether he found it difficult traveling so far. “Well, fortunately, I don’t have vestigial somnolence,” he said. I raised an eyebrow. “You know — jet lag,” he apologized.
Marc was being home-schooled to accommodate his performance and practice schedule. At an age when most children are in third grade, he was taking an SAT class. Chloe serves as his manager and reviews concert invitations with him. “In America, every kid has to be well rounded,” Chloe said. “They have 10 different activities, and they never excel at any of them. Americans want everyone to have the same life; it’s a cult of the average. This is wonderful for disabled children, who get things they would never have otherwise, but it’s a disaster for gifted children. Why should Marc spend his life learning sports he’s not interested in when he has this superb gift that gives him so much joy?”
At their home in California, I asked Marc what he thought of a normal childhood. “I already have a normal childhood,” he said. “Do you want to see my room? It’s messy, but you can come anyway.” Upstairs, he showed me a yellow remote-controlled helicopter that his father had sent from China. The bookshelves were crammed with Dr. Seuss, “Jumanji” and “The Wind in the Willows” but also “Moby-Dick”; with “Sesame Street” videos and also a series of DVDs on the music of Prague, Vienna and so on. We sat on the floor, and he showed me his favorite Gary Larson cartoons, and then we played the board game Mouse Trap.
Then we went downstairs, and Marc sat on a phone book on the piano bench so his hands would be high enough to play comfortably and launched into Chopin’s “Fantaisie-Impromptu,” which he imbued with a quality of nuanced yearning that seemed almost inconceivable in someone with a shelf of Cookie Monster videos. “You see?” Chloe said to me. “He’s not a normal child. Why should he have a normal childhood?”
A parent is the progenitor of much of a child’s behavior, telling that child repeatedly who he has been, is and could be, reconciling accomplishment and naïveté. In constructing this narrative, parents often confuse the anomaly of developing fast with the objective of developing profoundly. There is no clear delineation between supporting and pressuring a child, between believing in your child and forcing your child to conform to what you imagine for him. If society’s expectations for most children with profound differences are too low, expectations for prodigies are often perilously high. “When you have a child whose gift is so overshadowing, it is possible for parents to be distracted and lose track of the child himself,” says Karen Monroe, a psychiatrist at Boston’s McLean Hospital who works with prodigious children.
If you dream of having a genius for a child, you will spot brilliance in your child, sometimes even when it isn’t there. Such children, despite being the subjects of obsessive attention, can suffer from not being seen; their sorrow is organized not so much around the rigor of practicing as around invisibility. And yet, accomplishment entails giving up the pleasures of the present moment in favor of anticipated triumphs, and that is an impulse that must be learned. Left to their own devices, children do not become world-class instrumentalists before they turn 10.
When I spoke to the mother of one musical prodigy on the telephone to set up an interview, I invited her and her daughter to dinner, but she said, “We have a family of fussy eaters, so we’ll eat before we come.” The girl and her parents, whom I’ve granted anonymity for their own protection, arrived wearing coats, and I offered to hang them up. “That won’t be necessary,” the mother said, and they sat holding them through the interview. I offered them something to drink, but the woman said, “We are so used to our schedule, and it’s not time for a drink right now.” In three hours, none of them had a sip of water. I had put out homemade cookies, and the daughter kept glancing at them; every time she did, the mother shot her a look. Whenever I asked the daughter a question, her mother jumped in to answer on her behalf; when the daughter did reply, she did so with an anxious glance at her mother, as if worried that she had delivered the wrong response.
The daughter was holding her instrument case, so I invited her to play. “I think I’ll play the Bach Chaconne,” she said. Her mother said, “How about the Rimsky-Korsakov?” She replied, “No, no, no, the Chaconne is better.” The daughter had told me that she chose her instrument for its resemblance to her voice; now it provided her only chance to be heard over her mother. She played the Chaconne. When she finished, her mother said, “Now you can play the Rimsky-Korsakov.” The daughter dutifully launched into “Flight of the Bumblebee,” the proof of every virtuoso. “Vivaldi?” her mother said, and she played “Summer” from “The Four Seasons.” She played with a clear, bright tone, although not with such brilliance as to resolve the question of why a childhood had been sacrificed for this art. I had hoped this child would light up when her bow met the strings, but instead she brought out her instrument’s searing melancholy.
Throughout much of history, prodigies were thought to be possessed; Aristotle believed that there could be no genius without madness. Paganini was accused of putting himself in the hands of the devil. The Italian criminologist Cesare Lombroso said in 1891, “Genius is a true degenerative psychosis belonging to the group of moral insanity.” Recent neuroscience demonstrates that the processes of creativity and psychosis map similarly in the brain, each contingent on a reduced number of dopamine D2 receptors in the thalamus. A continuum runs between the two conditions; there is no sharp line.
The parents of children with disabilities must be educated to see the identity within a perceived illness, but the parents of prodigies are confronted with an identity and must be educated to recognize the prospect of illness within it. Even those without a sideline diagnosis like A.D.D. or Asperger’s need to mitigate the loneliness of being peerless and of having their primary emotional relationship with an inanimate object. “If you’re spending five hours a day practicing, and the other kids are out playing baseball, you’re not doing the same things,” Karen Monroe says. “Even if you love it and can’t imagine yourself doing anything else, that doesn’t mean you don’t feel lonely.”
If Chloe Yu scorned the idea of a normal childhood, May Armstrong simply had to bow to the reality that no such thing could be achieved with her only son, Kit. Born in 1992, Kit could count at 15 months; May taught him addition and subtraction at 2, and he worked out multiplication and division for himself. While digging in the garden, he explained the principle of leverage to his mother. By 5, he was explaining Einstein’s theory of time dilation to her. May, an economist, was frankly bemused: “By nature, every mother wants to be protective, but he didn’t need protection. I can’t say that was easy.”
May had left Taiwan at 22 to study in the United States and spent holidays by herself. “I knew what loneliness was all about, and I thought he needed a hobby he could enjoy on his own,” she says. So she started him on piano lessons when he was 5, even though she had no interest in music. After three weeks of lessons, Kit started composing on staff paper, without an instrument: the written language of music had come to him whole.
When Kit was 3, a supervisor of his play group told May that he let other children push him around. “I went in one day and saw another child snatch a toy away from him,” May said. “I told him he should stand up for himself, and he said: ‘That kid will be bored in two minutes, and then I can play with it again. Why start a fight?’ So he was mature already. What did I have to teach this kid? But he always seemed happy, and that was what I wanted most for him. He used to look in the mirror and burst out laughing.” May enrolled him in school. “His teacher told me that she wanted her other kids to grow up in kindergarten,” she said. “She wanted mine to grow down.”
By age 9, he had graduated from high school and started college in Utah. “The other students often thought it was strange that he was there,” May says, “but Kit never did.” His piano skills, meanwhile, had advanced so far that, by the time he was 10, he had appeared on David Letterman’s show. Shortly after, Kit toured the physics research facility at Los Alamos. A physicist said that, unlike the postdoctoral physicists who usually visited, Kit was so bright that no one could “find the bottom of this boy’s knowledge.” A few years later, Kit attended a summer program at M.I.T., where he helped edit papers in physics, chemistry and mathematics. “He just understands things,” May said to me, almost resigned. “Someday, I want to work with parents of disabled children, because I know their bewilderment is like mine. I had no idea how to be a mother to Kit, and there was no place to find out.”
May moved them to London to pursue Kit’s musicianship, and he soon met the revered pianist Alfred Brendel, who took Kit on as a student and refused payment for lessons. When Brendel learned that Kit was practicing at a piano showroom, he had a Steinway delivered to their apartment.
“I have no ear to be any help to Kit,” May said. “All I can do is remind him that he is very lucky to have been born with those talents. I’d have preferred that he be a professor of mathematics. It’s an easier life.” Then she added, “But Kit has decided that mathematics is his hobby, and the piano is his work.” At 18, Kit was pursuing an M.A. in pure mathematics in Paris; he said he did it “to unwind.” I asked May if she ever worried that Kit, like many young people of remarkable ability, might have a nervous breakdown. She laughed. “If anyone’s going to have a nervous breakdown in this setup,” she said, “it’s me!”
There is no federal mandate for gifted education. But if we recognize the importance of special programs for students whose atypical brains encode less-accepted differences, we should extrapolate to create programs for those whose atypical brains encode remarkable abilities. Writing in Time magazine in 2007, the journalist John Cloud faulted the “radically egalitarian” values underlying the No Child Left Behind Act, which provided little support for gifted students. Once again, it falls to parents to advocate for their children’s needs, often in the face of a hostile or indifferent educational system. Leon Botstein, president of Bard College, himself a conductor and a former wunderkind, remarked dryly, “If Beethoven were sent to nursery school today, they would medicate him, and he would be a postal clerk.”
Growing up gay in the 1970s, I encountered prejudice from the world at large that often crossed into disdain. My parents were never derisive, but they were uncomfortable with the ways I differed from them and encouraged me to try to be straight. I began researching children of difference in a quest to forgive my mother and father for pressing me to be untrue to myself. I wanted to look at the process through which parents reconcile themselves to children who pose significant challenges. I found that many families come to celebrate children with characteristics they initially found incomprehensible — just as my parents did. Having seen how hard it was for other parents, I decided, with considerable relief, that mine had actually done a pretty good job and realized that I was ready to be a parent myself.
My research on prodigies echoed my study of children with other differences. Sue Petersen compared her experience to having a child with a wooden leg; May Armstrong saw common ground with parents of disabled children; and I realized that parenthood always entails perplexity and that the valence of that perplexity matters less than the spirit with which parents respond to it. Half the prodigies I studied seemed to be under pressure to be even more astonishing than they naturally were, and the other half, to be more ordinary than their talents. Studying their families, I gradually recognized that all parenting is guesswork, and that difference of any kind, positive or negative, makes the guessing harder. That insight has largely shaped me as a father. I don’t think I would love my children more if they could play Rachmaninoff’s Third, and I hope I wouldn’t love them less for having that consuming skill, any more than I would if they were afflicted with a chronic illness. But I am frankly relieved that so far, they show no such uncanny aptitude.
Software engineer Randy Adams initially turned down Steve Jobs’ offer to work at NeXT, the computer company started by Jobs after his ouster from Apple. It was 1985. Adams wasn’t ready to go back to work after selling his pioneering desktop publishing software company. Within a few days Jobs was on Adams’ answering machine. “You’re blowing it, Randy. This is the opportunity of a lifetime, and you’re blowing it.” Adams reconsidered.
Adams, using some of the cash he’d earned from the sale of his company, bought a Porsche 911 at the same time Jobs did. To avoid car-door dings, they parked near each other–taking up three parking spaces between them. One day Jobs rushed over to Adams’ cubicle and told him they had to move the cars.
“I said, ‘Why?’ and he said, ‘Randy, we have to hide the Porsches. Ross Perot is coming by and thinking of investing in the company, and we don’t want him to think we have a lot of money.’” They moved the cars around to the back of NeXT’s offices in Palo Alto, Calif., and Perot invested $20 million in the company in 1987 and took a seat on the board.
Adams also recalls the time Bill Gates showed up at NeXT for a meeting. It was the fall of 1986. The receptionist in the downstairs lobby called Jobs, whose cube was upstairs, to let him know that Gates was in the lobby. “I could see him sitting in his cube, not really busy. But he didn’t get up or call Gates up. In fact, he left him waiting in the lobby for an hour. That speaks to their rivalry.”
NeXT engineers, Adams said, took the opportunity to go downstairs and ply Gates with questions. “We enjoyed it and spent an hour talking to him until Steve finally called him in.”
Adams said he left NeXT after disagreeing with Jobs about the use of the optical drive in the NeXT workstation, which he felt would be too slow. Some time later Jobs convinced Adams to start a software business around NeXT, which he did with a $2 million investment from Sequoia Capital. But just as the business was getting under way, Jobs called Adams again to let him know that NeXT was going to give up its workstation business and focus instead on software.
“He told me that the cost of hardware is coming down and we think it’s a commodity. I said, ‘Then why don’t you sell PCs?’ Jobs told me, ‘I’d rather sell dog s— than PCs.’”
Adams says he has many memories of Jobs from those days at NeXT – how Jobs, a vegan, would pass by engineers enjoying their Subway sandwiches and comment, “Oh, the smell of burnt animal flesh. How delightful.” In 1986, Jobs dressed up as Santa Claus and handed out $100 bills to employees. Adams also said Jobs was constantly telling employees who had screwed up or done something he didn’t like to “fire yourself.” Was Jobs serious? “Well, if you didn’t get a termination notice then you knew he was only kidding.”
A year after Jobs’ death, Adams, who went on after NeXT to help lead development of Adobe Acrobat and PDF and is a co-founder of the FunnyorDie.com site, says the tech industry is still feeling his loss. “His charisma was like electricity – he was giving off this incredible force. It was inspirational. He lifted you. I used to believe when I was with Steve, you could do anything. You could change the world. When he died, a little bit of that feeling left me. There’s no one like him.”
Scuff Marks in the Mini-Store
In his first public appearance after revealing he had surgery to remove a tumor from his pancreas in 2004, Jobs met with a handful of reporters (including me) at the Stanford Shopping Center in Palo Alto, Calif. to unveil a new 750-square-foot “mini” store design. Half the size of the typical Apple retail stores of the time, the mini design featured an all-white ceiling, lit from behind; Japanese-made stainless-steel walls, with holes around the top for ventilation that mimicked the design of the PowerMac G5; and a shiny, seamless white floor made with “material used in aircraft hangars,” Jobs said at the time.
Before the gigantic curtain draped across the storefront came down, though, Jobs was having a meltdown, refusing in the minutes before the unveiling to step outside and greet reporters. Why? Because the store design that looked so great on paper didn’t stand up to real-world use. The walls showed off every handprint and the floors were marred by black scuff marks from the handful of people readying the store for the big reveal.
Jobs was ultimately convinced to step outside, and the curtain was drawn before the small gathering of reporters. When I saw the floor, I immediately turned to Jobs, standing next to me, and asked if he had been involved in every aspect of the design. He said yes. “It was obvious that whoever designed the store had never cleaned a floor in their life,” I told him. He narrowed his eyes at me and stepped inside.
A few months later an Apple executive told me that Jobs had all of the designers return to the store after it opened on Saturday, and spend the night on their hands and knees cleaning the white surface. After that, Apple switched the floors to the stone tiles now prevalent in its designs. –C.G.
They’ll Get Used To It
Marc Andreessen, Internet browser pioneer turned venture capitalist, recalls a double-date he had with Jobs a few months before the iPhone was unveiled. “In the fall of 2006, my wife, Laura, and I went out to dinner with Steve and his brilliant and lovely wife, Laurene. Sitting outside of the restaurant on California Avenue in Palo Alto waiting for a table to open up, on a balmy Silicon Valley evening, Steve pulled his personal prototype iPhone out of his jeans pocket and said, ‘Here, let me show you something.’ He took me on a tour through all of the features and capabilities of the new device.
“After an appropriate amount of oohing and aahing, I ventured a comment. BlackBerry aficionado as I was, I said, ‘Boy, Steve, don’t you think it’s going to be a problem not having a physical keyboard? Are people really going to be okay typing directly on the screen?’ He looked me right in the eye with that piercing gaze and said, ‘They’ll get used to it.’”
Apple has sold more than 250 million iPhones since 2007, and the device remains one of the top-selling smartphones in the world.
Blunt, But With Taste
Guy Kawasaki, Apple’s chief evangelist and liaison to the Mac developer community, was working in his cubicle after the Mac was introduced in 1984 when Jobs showed up one day with another guy in tow. Jobs asked Kawasaki for his opinion about a program from a Mac developer called Knoware, a name short for the “knowledge software” the company made.
“I tell him what I think, which is extremely negative. When I’m done, he turns to the guy. Then he looks back at me. And then he says, ‘Guy, this is the CEO of Knoware.’”
At first, Kawasaki says the story illustrates Jobs’ “lack of hesitancy to hang his employees out to dry.” Then he says it’s “indicative of Steve in general. If you’re a Steve fan, you say, ‘See, he knew how to cut through all the bullshit.’ If you’re not a Steve fan, he lacked social graces.”
“Even though he treated people like this, the reason he got such great people to work there, unlike most bosses, is that he appreciated great work. There are two components to giving employees great feedback. It takes someone who has the taste to know when you did great or lousy, and it takes someone who’s blunt enough to tell you. There are plenty of people who don’t have taste but are blunt.
“If you wanted to do great work, you can do it at Apple. But there’s a cost–public humiliation. Something like this could never have happened at HP. It’s contrary to the HP way. On the other hand, you couldn’t do your best work at HP because there is no one there to appreciate it. Where would you rather work–Apple or HP?”
A Christmas Story
Regis McKenna, Apple’s original marketing guru, met the 22-year-old Jobs when Jobs drove up to his house on a motorcycle and talked about how he wanted to build Apple into a global brand. McKenna sat in on Apple executive meetings from 1983 to 1987, and the two men remained close throughout the years.
“In 1998 my wife and I bought five iMacs as Christmas gifts for our grandchildren. We watched them open their presents, and when 5-year-old Molly opened her iMac, she said, ‘Life is good.’ Unfortunately, Molly’s iMac developed a problem. After using it a few hours, the disc drive door would not open. The dealer told me he was not authorized to exchange the computer for another one due to an Apple policy. Repair would take several weeks, he told me. I sent an e-mail to Steve and asked him about Apple’s return/exchange policy on a new product. Within five minutes my phone rang. It was Steve. He asked me what the problem was and the name of the dealer. ‘I’ll call you back,’ he said. A few minutes later the phone rang and it was a very apologetic dealer. ‘I have a new iMac here for your granddaughter,’ he said. I e-mailed Steve, thanking him and assuring him that he had made my granddaughter’s Christmas a happy one. Steve immediately replied with a simple ‘Ho, ho, ho.’”
McKenna also recalls another story. In 1985, after being fired from Apple by CEO John Sculley and the board, Jobs talked with McKenna about his next steps the week after his ouster. “Steve said that Apple might even benefit from his leaving,” McKenna says. “That he may be able to help Apple with his new venture. He said that his new company might possibly develop technology that Apple could use and that he could in that way benefit the company. ‘Maybe we can develop a new successful product line that would enhance the Apple product line and they will buy us,’ he told me. At the time, Steve did not know how prescient that statement would be.”
In 1996, Apple bought Jobs’ NeXT Inc. for $429 million, a move that facilitated Jobs’ return to the company – and the launch of the most successful turnaround in U.S. corporate history.
A Friend In Need
Heidi Roizen, now a venture capitalist, was the head of software company T/Maker, which distributed software for the Mac in the 1980s. She had many experiences with Jobs she would call “character-building,” but one was more personal.
“On Mar. 1, 1989, Steve called to talk to me about a negotiation, and as it was Steve I took the call, even though I had just learned the night before that my father had died suddenly while on a business trip in Paris. When I told Steve what had happened, he said, ‘Then why are you working? You need to go home. I’ll be right over.’”
Jobs came to her house and sat on the floor beside her while she sobbed for two hours. “Yes, I had sofas, but Steve didn’t like to sit on sofas. He asked me to talk about my father, what was important about him, what I loved best about him. Steve’s mother had passed away a few months earlier, so I think he was particularly attuned to how I felt and what I needed to talk about. I will always remember and appreciate what an incredible thing he did for me in helping me grieve.”
He Notices Everything
Emily Brower Auchard, who worked on the P.R. team for Steve Jobs at NeXT, says that Jobs was a “noticer” who picked up on the smallest detail. “One of my tasks was to sit in press interviews with Steve and take notes. Once before an interview, I realized that I was wearing two different shoes. I had dressed quickly that morning and had grabbed what I thought were a pair of black pumps. They weren’t. I called my boss for advice. She said I absolutely needed to fix the situation because Steve would definitely notice. So I drove like a maniac to the Stanford Mall and bought myself a pair of replacement shoes at Nordstrom and then sped back to NeXT’s offices. It was the fastest shopping decision I ever made.”
Disarm, Rather Than Charm
In 1989, NeXT, struggling to win over buyers, got a meeting with IBM to discuss licensing the NeXTStep software for use on IBM’s OS/2 computers, recalls a former NeXT executive, who asked not to be named. NeXT really wanted the deal (IBM did end up licensing the software for $65 million at the end of that year). Executives from both companies gathered in a conference room at NeXT’s headquarters on Deer Creek Road in Palo Alto, Calif., waiting for Jobs to arrive. He finally came in, turned to the senior IBM executive and said “Your user interface sucks.” There were gasps from executives at both companies.
“This is kind of how he got to be a good negotiator. He would totally disarm people by dropping F-bombs,” the NeXT executive recalls. “He would say, ‘We’re doing this deal, but your products are s—.’ He was outrageous. But he always ended up getting exactly what he wanted.”
Okay to Lie
“Steve was really, really, really good one-on-one to interact with. You could have a conversation with him, he wasn’t putting on a show. If there were more than two people in the room, the marketing in Steve came out. He then put on the show,” says a CEO who worked with Jobs after his return to Apple in the late 1990s. “He was always polite and had to be nice because he needed something from me. There was a period of time when he came back to Apple, where he needed people to work with him, Apple was a mess, NeXT was a failure. So he was somebody you could work with. As he became more and more successful with the iPod and then the iPhone, he became more and more arrogant. The old Steve.
“There were times when what he said he was going to do and what he did were not necessarily consistent, and I challenged him on that. He was, ‘Yeah, yeah, I know, but I needed to change my mind.’ It was okay in his mind to lie.”
A Little Hand In the Screen
Nolan Bushnell, founder of Atari, who hired Jobs in 1974, says what he remembers most about Jobs was his intensity. “Steve was the first guy I found who would be regularly curled up under his desk in the morning after an all-nighter. A lot of people think that success is luck and being in the right place at the right time. But I think if you’re willing to work harder than anybody else, you can create an awful lot of your own luck.
“We tended to have this philosophical relationship. He liked to talk about big ideas and where big ideas come from. He was always interested in talking about creating products and how you know when a product is ready for market.”
In the early 1980s Bushnell bought a 15,000-square-foot house in Paris and invited all his Silicon Valley friends to a housewarming party. There was a band, lots of food and drink, lavishly attired guests–and Jobs, who had left Atari to start Apple in 1976. While everyone else was dressed up for the party, Jobs showed up in his Levi’s.
Bushnell remembers “sitting on the Left Bank [the day after the party], me sipping coffee and Steve always drinking tea, sort of watching Paris walk by. We had a delightful conversation about the importance of creativity. He was at a phase where he knew that the Apple II was nearing the end of its life. He was not happy with the Apple III. He was just starting to kick around the ideas for the Lisa and what was going to be the Macintosh. We were talking about trackballs and joysticks and mice, and the whole idea of having a little hand in the screen, which is essentially what the mouse was.
“I last saw him a year before his death. He was very, very thin, but he didn’t look frail. He had a strength about him. He said, ‘I think I’m going to beat this thing.’”
I’ve been saying for a very long time that college is obsolete. I dropped out in 2000, because even back then I could see that it was a really poor value proposition. I didn’t predict this because I’m some crazy genius, but because I’m willing to discard emotional attachment and stare plainly at the facts.
School is outrageously expensive, leaving graduates with a debt (or net expenditure) of tens of thousands of dollars — sometimes even one or two hundred thousand. There are some things that are worth that amount of money, but for many people school isn’t one of them. In fact, apart from very specific cases, I think that school is a bad thing, not worth doing even if it were free.
That’s not to say that school has no benefits whatsoever. It does, and although I left with zero additional skills after my three semesters there, I had a good time and benefited from the social aspect. The problem is that you can’t just compare college to doing nothing at all. You have to compare it to what you COULD have done.
Let’s say that when you turn eighteen, it’s a good idea to take four years to develop yourself. College is one way to do that. If we were to construct an alternative way to do that, what could it look like? One of the biggest weaknesses of school is how inflexible it is, so one of the greatest benefits of designing your own curriculum is that you could come up with one that uniquely suits you. That said, here’s a plan that I think would benefit many people MORE than school would. Let’s call it the Hustler’s MBA.
1. Learn poker. To an outsider, poker seems like a form of degenerate gambling. It can be, but that’s not its nature. One of the most valuable skills I’ve learned in life is how to assess hundreds of factors, choose the important ones, evaluate them to make a decision quickly, and then execute that decision. Poker teaches this extremely well. So does pickup, incidentally. Poker develops your logic like nothing else I’ve experienced, and it develops your math skills to a lesser degree. It also teaches a skill I can’t quite define, but would best describe as learning how hard you can push. I’ve found all of these skills to be very useful in life.
Poker will cost you money at first. Let’s say $5000 in the first year. After that you’ll be able to make between $45 and $60 per hour for the rest of your life. That’s about $85,000 per year, a figure that effectively adjusts for inflation: as money inflates, the stakes needed to keep the game interesting rise with it. You will also receive “raises” because you’ll always improve as a player and be able to play better stakes. If you’re dedicated to poker, getting this good is virtually guaranteed. I’ve been through the process and it’s not particularly hard. Can school guarantee you a job that pays this well?
Besides being able to make $85k a year playing full time, you could also play for just six months and make $40k. Ultimate flexibility. (A rough sketch of this arithmetic follows below.) I don’t think that poker is the best career in the world, because it doesn’t give back to society, but I do think that it’s an excellent backup plan. Knowing that I can always support myself playing poker gives me the freedom to work on big projects without fear.
2. Travel a lot. For the first year, learn a foreign language that interests you. Start with three months of Pimsleur tapes, then get a local tutor. That should cost about $1000 for the first year, and will yield results FAR greater than a class in school. After the first year your self-education will be paid for by poker, so start traveling for three months every year. That should cost around $8k at the most, probably more like $5-6k. When traveling, education comes to you in the form of perspective. You understand other cultures and other people, and will get to practice your foreign language in its native setting. I would also combine travel with watching documentaries about the history of that place. I learned a lot about Rome after visiting, and now I’m kicking myself for not educating myself first.
3. Read every single day for at least an hour. Books get lumped in with other reading like magazines and blogs, but they’re actually far more valuable. The amount of value an author compresses into a book is often astounding. There are books I’ve paid $10 for that have completely changed my life. If you read for 1-2 hours a day on average, at a typical five or six hours per book you’ll read around a hundred books per year. I do this now and find it to be one of the most valuable uses of my time. Read at least 50% non-fiction, but fiction is good, too. In school you would probably read 12 books a year at most.
4. Write every single day. Write blog posts, work on a book, write how you’re feeling, or write short stories. I don’t think it really matters. Writing every day helps you develop and refine your thoughts, as well as learn to communicate with others. Almost any field you’ll go into will require communication, so you may as well get good at it. After you write, record a video of yourself explaining what you wrote. This will help with public speaking and conversation. After the first year at the very latest, start publicly posting your work. This teaches you to ship and to integrate feedback.
5. Learn to program, even if you don’t want to be a programmer. Programming develops logic and efficiency, amongst other things. Even an intermediate understanding of programming will allow you to be a creator. Programming languages are the languages of the future, so even if you aren’t a programmer yourself, there’s a good chance you’ll be working with them. Speaking someone’s language is nice when you’re working with them, right?
7. Do something social. College is really excellent for making people social, and it’s the one aspect in which I don’t expect my plan to exceed school. If you’re a guy, consider getting into pickup. If you’re a human, take group art classes, yoga, dance, or go to meetup groups. Social skills are some of the most important skills you can learn, and they can only truly be developed through social interaction. This interaction has to be in person, too… online chatting can be beneficial, but it’s not enough. Traveling will help you be social as well, especially if you stay in hostels.
8. Eat healthy. When you eat healthy, your brain functions better and you’re safeguarding its longevity. Developing yourself is at least as much about good habits as it is about learning skills. And like all habits, the earlier you start, the better. I’d say that the minimum to shoot for here is cutting out all sweeteners and refined grains. Besides the obvious health benefits, eating healthy will help you build discipline, which is an absolutely essential life skill.
8. Follow curiosity and spend money on it when necessary. These things that I’ve included so far are the baseline — the new liberal arts education. They leave you plenty of time in your day to follow whatever you’re interested in. Don’t force it and try to learn investment banking because you think it would make a good career. If you’re interested in butterflies, learn about butterflies. The rest of the curriculum is enough to make sure that you’ll always be able to provide for yourself and will be a well-rounded person, so consider this section your speculative learning. Maybe you’ll find something you’re passionate about, which will become your career, or maybe you’ll just become a really interesting person who knows a lot about a lot of things. Either way is fine. Don’t be afraid to spend money on tutors, classes, equipment, seminars, or travel.
9. Start a business after two years. With a full two years of self-education under your belt, you should have something useful to contribute to society. School makes you go from sheltered learning mode straight into real-world career mode. I think a better way is to have a transition, and to couple productivity with learning. Having that habit will ensure that you continue to perfect your craft as you get older. Your business can be anything — a tech startup, publishing books you’ve written, giving speeches, making clothing and selling it online, whatever you’re into. Read some business books before starting it and try to make money. One of the most common complaints I hear from graduates of traditional school is that nothing they learned was actually applicable to real life. Everything you learn from starting a business IS.
This is a modern curriculum that, on average, will produce people better prepared for real life than college. Obviously, it won’t work if you want to be something that requires certification like a doctor or lawyer. The beauty of it is that it has a negative cost (you will make money due to poker, and hopefully your business), and can be funded initially with $5000 for poker. A few months into the second year, you will have paid off the poker debt and begun to self-fund your life.
Will this work for you? There’s no guarantee, but I see people work pretty hard at school, and if that same effort were put towards the Hustler’s MBA, I think the chance of being self-sufficient and prepared for “real life” is about 90%. I’d estimate that non-lawyer/doctor college is somewhere around 50-70%. So, like anything, this plan is not totally foolproof, but I think it’s a lot better and cheaper than the alternative.
There’s a big taboo around telling people not to go to college. I find myself adhering to it, not ever suggesting that younger members of my family should drop out or skip school entirely. But maybe the time has come for us to look at college objectively, really quantify what goes in and what comes out, and evaluate it on its merits alone, rather than its historical value or its societal aura.
By the time you finish reading this, both of these consumer products will have been recycled at a local e-waste facility.
Both the iPhone and the analog Canon served useful lives and have been replaced with their newest counterparts. The iPhone is being retired after 3 years because its touchscreen has stopped working. In contrast, the point-and-shoot camera works like new after 7 full years, but is being retired because it’s not digital. It’s funny how sometimes a low-tech product can outlive a high-tech one because there are fewer components to fail. So here we have two well-worn objects in product purgatory. Before they are sent away to be dismantled and melted away, I’d like to take a moment to examine their materiality and how it has aged with time. At a glance both products are a satin-finish silver color, but upon closer investigation their battle scars reveal the stuff they’re actually made of.
After 3+ years of having been carried in the same pocket as a ring of keys, the iPhone has acquired a polished patina over its aluminum shell. Abrasion of its hard-anodized surface has revealed the raw aluminum within. The camera’s shell has been worn in a very similar way but instead reveals black plastic concealed by silver paint. Slightly less flattering. The camera’s emulated metallic finish is only surface-deep, and its wear tends to emphasize awkward artifacts of the injection molding process used to create it. At this point the Canon camera’s shell looks like garbage while the iPhone’s is starting to resemble something more like an heirloom pocket watch.
Interestingly, both of these products spent the greater part of their working lives sporting these wear patterns. The truth is that consumer products are ‘new’ for a very brief moment when they are first removed from the packaging, but spend the great majority of their useful lives as ‘used’ products in the process of decay. Many welcome the breaking-in of products like a leather wallet or a pair of jeans, as this wear can be aesthetically pleasing. The Japanese have a term for this: wabi-sabi, which describes the aesthetically pleasing wear of an object as it decays over time. It’s a notion that embraces the transience of objects and celebrates the purity of the imperfect. Aging with dignity is a criterion designers should recognize in their efforts. I’m thinking of a future when products are designed not for the brief moment when they are new, but for when they have been aged to perfection.
Here he is. Matches in one hand, petrol bottle in the other. He removes the bottle cap, drops it to the ground and douses himself in liquid. He does everything slowly, methodically, as if it were part of a routine he has practiced for years. Then he stops, looks around, and strikes a match.
At this moment nothing in the world can bridge the gap that separates the self-immolator from the others. His total defiance of the survival and self-preservation instincts, his determination to trample on what everybody else finds precious, the ease with which he seems to dispose of his own life, all these place him not only beyond our capacity of understanding, but also outside of human society. He now inhabits a place that most of us find uninhabitable. Yet, from there he does not cease to dominate us.
“As he burned he never moved a muscle, never uttered a sound, his outward composure in sharp contrast to the wailing people around him.”
Journalist David Halberstam describes the death of Thích Quảng Đức, the Vietnamese Buddhist monk who set himself on fire in Saigon in 1963. The quieter the self-immolator, the more agitated those around him. The former may slip into nothingness, but his performance changes the latter’s lives forever. They experience repulsion and attraction, terror and boundless reverence, awe and fear, all at once. Over them he now has the uncanniest form of power.
The experience is so powerful because it is so deeply seated in the human psyche. Faced with self-immolation, even the most secularized of us glimpse a primordial experience of the sacred.
SOMETIME in the dark stretch of the night it happens. Perhaps it’s the chime of an incoming text message. Or your iPhone screen lights up to alert you to a new e-mail. Or you find yourself staring at the ceiling, replaying the day in your head. Next thing you know, you’re out of bed and engaged with the world, once again ignoring the often quoted fact that eight straight hours of sleep is essential.
Sound familiar? You’re not alone. Thanks in part to technology and its constant pinging and chiming, roughly 41 million people in the United States — nearly a third of all working adults — get six hours or fewer of sleep a night, according to a recent report from the Centers for Disease Control and Prevention. And sleep deprivation is an affliction that crosses economic lines. About 42 percent of workers in the mining industry are sleep-deprived, while about 27 percent of financial or insurance industry workers share the same complaint.
Typically, mention of our ever increasing sleeplessness is followed by calls for earlier bedtimes and a longer night’s sleep. But this directive may be part of the problem. Rather than helping us to get more rest, the tyranny of the eight-hour block reinforces a narrow conception of sleep and how we should approach it. Some of the time we spend tossing and turning may even result from misconceptions about sleep and our bodily needs: in fact neither our bodies nor our brains are built for the roughly one-third of our lives that we spend in bed.
The idea that we should sleep in eight-hour chunks is relatively recent. The world’s population sleeps in various and surprising ways. Millions of Chinese workers continue to put their heads on their desks for a nap of an hour or so after lunch, for example, and daytime napping is common from India to Spain.
One of the first signs that the emphasis on a straight eight-hour sleep had outlived its usefulness arose in the early 1990s, thanks to a history professor at Virginia Tech named A. Roger Ekirch, who spent hours investigating the history of the night and began to notice strange references to sleep. A character in the “Canterbury Tales,” for instance, decides to go back to bed after her “firste sleep.” A doctor in England wrote that the time between the “first sleep” and the “second sleep” was the best time for study and reflection. And one 16th-century French physician concluded that laborers were able to conceive more children because they waited until after their “first sleep” to make love.

Professor Ekirch soon learned that he wasn’t the only one who was on to the historical existence of alternate sleep cycles. In a fluke of history, Thomas A. Wehr, a psychiatrist then working at the National Institute of Mental Health in Bethesda, Md., was conducting an experiment in which subjects were deprived of artificial light. Without the illumination and distraction from light bulbs, televisions or computers, the subjects slept through the night, at least at first. But, after a while, Dr. Wehr noticed that subjects began to wake up a little after midnight, lie awake for a couple of hours, and then drift back to sleep again, in the same pattern of segmented sleep that Professor Ekirch saw referenced in historical records and early works of literature.
It seemed that, given a chance to be free of modern life, the body would naturally settle into a split sleep schedule. Subjects grew to like experiencing nighttime in a new way. Once they broke their conception of what form sleep should come in, they looked forward to the time in the middle of the night as a chance for deep thinking of all kinds, whether in the form of self-reflection, getting a jump on the next day or amorous activity. Most of us, however, do not treat middle-of-the-night awakenings as a sign of a normal, functioning brain.
Doctors who peddle sleep aid products and call for more sleep may unintentionally reinforce the idea that there is something wrong or off-kilter about interrupted sleep cycles. Sleep anxiety is a common result: we know we should be getting a good night’s rest but imagine we are doing something wrong if we awaken in the middle of the night. Related worries turn many of us into insomniacs and incite many to reach for sleeping pills or sleep aids, which reinforces a cycle that the Harvard psychologist Daniel M. Wegner has called “the ironic processes of mental control.”
As we lie in our beds thinking about the sleep we’re not getting, we diminish the chances of enjoying a peaceful night’s rest.
This, despite the fact that a number of recent studies suggest that any deep sleep — whether in an eight-hour block or a 30-minute nap — primes our brains to function at a higher level, letting us come up with better ideas, find solutions to puzzles more quickly, identify patterns faster and recall information more accurately. In a NASA-financed study, for example, a team of researchers led by David F. Dinges, a professor at the University of Pennsylvania, found that letting subjects nap for as little as 24 minutes improved their cognitive performance.
In another study conducted by Simon Durrant, a professor at the University of Lincoln, in England, the amount of time a subject spent in deep sleep during a nap predicted his or her later performance at recalling a short burst of melodic tones. And researchers at the City University of New York found that short naps helped subjects identify more literal and figurative connections between objects than those who simply stayed awake.
Robert Stickgold, a professor of psychiatry at Harvard Medical School, proposes that sleep — including short naps that include deep sleep — offers our brains the chance to decide what new information to keep and what to toss. That could be one reason our dreams are laden with strange plots and characters, a result of the brain’s trying to find connections between what it’s recently learned and what is stored in our long-term memory. Rapid eye movement sleep — so named because researchers who discovered this sleep stage were astonished to see the fluttering eyelids of sleeping subjects — is the only phase of sleep during which the brain is as active as it is when we are fully conscious, and seems to offer our brains the best chance to come up with new ideas and hone recently acquired skills. When we awaken, our minds are often better able to make connections that were hidden in the jumble of information.
Gradual acceptance of the notion that sequential sleep hours are not essential for high-level job performance has led to increased workplace tolerance for napping and other alternate daily schedules.
Employees at Google, for instance, are offered the chance to nap at work because the company believes it may increase productivity. Thomas Balkin, the head of the department of behavioral biology at the Walter Reed Army Institute of Research, imagines a near future in which military commanders can know how much total sleep an individual soldier has had over a 24-hour time frame thanks to wristwatch-size sleep monitors. After consulting computer models that predict how decision-making abilities decline with fatigue, a soldier could then be ordered to take a nap to prepare for an approaching mission. The cognitive benefit of a nap could last anywhere from one to three hours, depending on what stage of sleep a person reaches before awakening.
Most of us are not fortunate enough to work in office environments that permit, much less smile upon, on-the-job napping. But there are increasing suggestions that greater tolerance for altered sleep schedules might be in our collective interest. Researchers have observed, for example, that long-haul pilots who sleep during flights perform better when maneuvering aircraft through the critical stages of descent and landing.
Several Major League Baseball teams have adapted to the demands of a long season by changing their sleep patterns. Fernando Montes, the former strength and conditioning coach for the Texas Rangers, counseled his players to fall asleep with the curtains in their hotel rooms open so that they would naturally wake up at sunrise no matter what time zone they were in — even if it meant cutting into an eight-hour sleeping block. Once they arrived at the ballpark, Montes would set up a quiet area where they could sleep before the game. Players said that, thanks to this schedule, they felt great both physically and mentally over the long haul.
Strategic napping in the Rangers style could benefit us all. No one argues that sleep is not essential. But freeing ourselves from needlessly rigid and quite possibly outdated ideas about what constitutes a good night’s sleep might help put many of us to rest, in a healthy and productive, if not eight-hour long, block.
We have all held leaves, driven miles to see their fall colors, eaten them, raked them, sought their shade. Since they are everywhere, it’s easy to take them for granted.
But even when we do, they continue in their one occupation: turning light into life. When rays of sunlight strike green leaves, wavelengths in the green spectrum bounce back toward our eyes. The rest—the reds, blues, indigos, and violets—are trapped. A leaf is filled with chambers illuminated by gathered light. In these glowing rooms photons bump around, and the leaf captures their energy, turning it into the sugar from which plants, animals, and civilizations are built.
Chloroplasts, fed by sun, water, carbon dioxide, and nutrients, do the leaf’s work. They evolved about 1.6 billion years ago when one cell, incapable of using the sun’s energy, engulfed another cell—a cyanobacterium—that could. That cyanobacterium became the ancestor of every living chloroplast. Without their chloroplasts plants would be left like the rest of us, to eat what they find. Instead they hold out their green palms and catch light. If there is magic in the world, surely this is it: the descendants of tiny creatures in leaves, capable of ingesting the sun.
If you gather a bouquet of leaves to consider their magic, it is hard to overlook their diversity and, if you are the curious sort, to wonder why there exists such a preponderance of forms. Some leaves don’t seem to be leaves at all, having become flower petals, thorns, or the spines on a cactus. But even an ordinary oak leaf, dandelion leaf, and grass blade differ in size, thickness, shape, hue, texture, taste, and nearly every other feature.
Leaves are large, small, thick, thin, compound, simple, curved, or lobed. And these terms just begin to describe the differences botanists have tried to catalog in their rich poetry of obscure adjectives—pinnate, ciliate, barbellate, bearded, canescent, glabrous, glandular, viscid, scurfy, floccose, arachnoid, and my favorite, tomentose (covered with woolly hairs). But putting the variety of structures aside, most leaves do essentially the same thing: They exist in the main to hold chloroplasts aloft. How can so many different geometries all perfectly capture the sun?
The work of natural selection offers a key to the puzzle. Desert leaves tend to be small, thick-skinned, waxy, or spiny‚ just like leaves in salty regions or other harsh lands—clear examples of the relatively few ways evolution can deal with a lack of water. Rain forest plants often have narrow leaves, with long, thin “drip tips,” to drain away excess water. In cold places one finds leaves with teeth—like birches and cherries—though why this particular pattern exists is the subject of debate.
Some of the most extreme examples of the way natural selection shapes leaves can be found at high elevations in the tropics, where nights are consistently cold and damp and the days hot and dry. Scramble high enough above the tree line in the mountains of Africa, Asia, Hawaii, and the Americas, and you will see thick towers of plants crowned by mops of living and dead leaves.
In a poetic moment botanists named these lovely circular leaf arrangements “giant rosettes.” The thick living leaves of these rosettes shelter new buds. They’re hairy too, which adds insulation. The dead leaves help the plants withstand freezing at night and, simultaneously, save the night’s cold dew for the dry day. Remove those decaying leaves from rosettes at high elevations and the plants can freeze to death, naked without their dead-leaf fur.
In many environments natural selection tends to favor a limited number of similar forms again and again, given the genes it has to work with. Sometimes there really does seem to be just one or a few best ways to deal with a particular set of conditions. If rosettes are not convincing, consider the meat-eaters. In nutrient-poor bogs, plants have repeatedly turned to animals to supplement what the soil alone cannot provide. They have evolved rolled leaves, sticky hairs, mucous pools, or snap traps, all for capturing live prey. A bog is a terrifying place to be a fly.
But if climate and nutrient availability were the only explanations for leaf diversity, all of the leaves in a particular environment—a desert, a mountaintop, your backyard—would tend to be the same. Of course they are not. Many of the qualities of the leaves in your yard or salad are due to the limits of genes and time. Not all plants have the genetic variation it takes to become, under the natural selection imposed by desert conditions, a cactus. Conditions change. Species move. Every leaf is a work in progress. One suspects, for example, that leaves are evolving now to deal with the conditions in cities—pollution, drought, intense heat, and animal waste—but it may require more generations for natural selection to stumble, death by death, upon the more successful forms.
Other specific traits may have to do with the battles that have gone on among plants each day for more than 400 million years. Plants fight for nutrients and water in the soil, and they fight for sunlight in the canopy. Competition is why trees grow tall, stems become trunks, and forests grow dense. Trees have evolved in the struggle of plant against plant many times, in vastly different lineages. The highest leaves win, and so trees tend to evolve to be as tall as possible, given the limits of physics and precipitation. Without competition, every forest would be a thin film of green life.
The battles among plants have changed their stems and their veins. Leaves with more veins can carry more water to the chloroplasts, allowing the chloroplasts to make more sugar and the plants to grow faster. Such plants in turn can hold their leaves aloft to occupy more space in the sky and consume more sunlight before others get to it. Through time the plants that were able to produce more and more veins in their leaves won many battles and some wars.
Leaves with densely branched patterns of veins are also able to grow more quickly. The veins of a maple leaf, for instance, are like the roads of a city; they go everywhere and often intersect. They traffic in nutrients and water. The maple leaf can quickly get what it needs to continue to feed from the sun. Other leaves are not so lucky. Amid the seething competition for space in tropical forests, pity the single-veined leaf.
Plants have more to cope with than competition from other plants. The evidence of animals eating leaves is almost as ancient as the evidence for leaves themselves. In fossil dinosaur poop one finds evidence of ancient leaves. In fossil leaves one finds the holes made by ancient mouths. Nothing on life’s menu is more popular. Moths, butterflies, beetles, fungi, monkeys, sloths, and great loping monsters like cows, bison, and giraffes eat the hard-earned greenery of plants, which, for all of their ingenuity, have never figured out how to run away.
So leaves resort to self-defense. Some plant leaves have become specialists in deadly tricks. Grass blades evolved the ability to accumulate silica from the soil, turning themselves into tiny glass slivers that ruin the teeth of browsers like cows, one bite at a time. Other plants use chemicals to make themselves unpalatable or even poisonous. Sometimes the weapons are visible: latex oozing out of a vein or stinging hairs projecting from leaf blades. Other times they lurk unseen, waiting for the unsuspecting victim, be it the larva of a moth or an undiscriminating sheep.
Climate, competition, defense—these evolutionary saws and scissors can explain much of the diversity of leaves. Yet if you pick up two leaves in your backyard, most of what differs between them—the details naturalists have spent thousands of years naming—remains unaccounted for. Evolution can whittle similar forms again and again when confronted with similar circumstances. But through innovation and chance, evolution can also work in the abstract: Jackson Pollock dashing paint on the canvas of life. We should not expect to understand every tomentose blade or arachnoid lobe. Sometimes it is enough to step back and know a masterwork when we see one, whether it hangs in a museum or from its petiole on the branch of a park tree. Not that leaves care whether you notice; the blessing they convey comes each day with the rise of the edible sun.
A few years ago, Peter Frew came to New York with an important professional skill. He was one of maybe a few dozen people in the U.S. who could construct a true bespoke suit. Frew, who apprenticed with a Savile Row tailor, can — all by himself, and almost all by hand — create a pattern, cut fabric and expertly construct a suit that, for about $4,000, perfectly molds to its owner’s body. In a city filled with very rich people, he quickly had all the orders he could handle.
When I learned about Frew, I assumed he was some rich designer in an atelier on Madison Avenue. That’s what Frew hopes to be one day, but for now the 33-year-old Jamaican immigrant works out of his ground-floor apartment near Flatbush Avenue, in Brooklyn, and makes around $50,000 a year. His former living room consists of one large table piled with bolts of cloth and a form with a half-made suit. As Frew sewed a jacket, he explained how he customizes every aspect of its design — the width of the lapel, the number and size of the pockets — for each client. What makes a bespoke suit unique, he said, is that it’s the result of skills that only a trained hand can perform. Modern technology cannot create anything comparable.
As I watched Frew work, it became glaringly obvious why he is not rich. Like a 17th-century craftsman, he has no economy of scale. It takes Frew about 75 hours to make a suit — he averages about two per month — and he has no employees. A large part of his revenue goes to pay for materials, and because his work leaves him no time to find clients, he relies on an outside salesman, who takes a commission. (Frew can’t even afford to make a suit for himself. When we met, he was wearing shorts and a T-shirt.) While he hopes one day to hire full-time assistant tailors and rent a Manhattan showroom, he knows it will be a huge challenge to get there.
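A rough back-of-the-envelope makes the squeeze plain. The gross figure follows from the numbers above; the roughly even split between take-home pay and costs is an inference from his reported income, not a figure Frew gave:

    2 suits a month × 12 months × $4,000 a suit ≈ $96,000 in gross revenue
    less materials and the salesman’s commission (roughly half, by implication)
    ≈ $50,000 a year, with no way to earn more short of hiring more hands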
A poignant lesson in capitalism: quality + demand ≠ success story.