How I snuffed that Tartar air! — how I spurned that turnpike earth! —

. . . all betokening that new cruises were on the start; that one most perilous and long voyage ended, only begins a second; and a second ended, only begins a third, and so on, for ever and for aye. Such is the endlessness, yea, the intolerableness of all earthly effort.

Gaining the more open water, the bracing breeze waxed fresh; the little Moss tossed the quick foam from her bows, as a young colt his snortings. How I snuffed that Tartar air! — how I spurned that turnpike earth! — that common highway all dented over with the marks of slavish heels and hoofs; and turned me to admire the magnanimity of the sea which will permit no records.

— Herman Melville, Moby-Dick, “Wheelbarrow”

The things which will save you

A sufferer of depression should not feel shame or believe they are worthless. . . . Look deep into yourself for the qualities you need to survive. Your talents, hopes, dreams, and desires. Because these are the things which will save you.

— Darryl Cunningham, Psychiatric Tales

A knowing irony of the intellect and a lassitude of the heart

With arguments that have long been at the cutting edge of political philosophy, Cohen develops a distinctive conception of democracy as based upon reason-giving among equals, and he does so in a way that never ignores political reality. . . . All too often, Cohen observes, we respond to political ideals with ‘a knowing irony of the intellect and a lassitude of the heart.’ Realistic without cynicism, idealistic without naivete, Cohen’s book responds to the real world with rigorous intellectual aspiration and undaunted practical hope.

— Martha C. Nussbaum, reviewing Joshua Cohen’s Philosophy, Politics, Democracy: Selected Essays

Handing Out Knives to Madmen

By Josiah Ober
wsj.com

A review of:

The Hemlock Cup: Socrates, Athens and the Search for the Good Life
By Bettany Hughes

Two and a half millennia after an Athenian philosopher drank a poisoned cup of hemlock as punishment for crimes against the state, the ancient Greek world continues to captivate us. And rightly so: New scholarship continues to reveal just how remarkable it was. Most premodern states, like too many countries today, were dominated by a small elite of ultra-privileged insiders who monopolized public goods, skimming off whatever surplus was produced by populations living near bare subsistence and thus seizing supersize shares of stagnant economies. By contrast, the Greek world in the 500 years from Homer to Aristotle saw sustained economic growth and historically low levels of economic inequality. Recent studies suggest that, from 800 B.C. to 300 B.C., the population of the Greek city-states increased by a factor of 10, while per capita incomes roughly doubled. That growth rate may be sluggish compared to leading 21st-century economies, but it is amazing by the standards of premodernity. What can explain such economic growth?

Free Greeks (we must never forget that this was a slave society) invested their efforts in industry, commerce and politics because they did not fear that the fruits of their effort would be expropriated by the powerful. Further, though the roughly 1,000 city-states of classical Greece competed fiercely in war, they actively exchanged goods and ideas. Among the productive innovations that spread rapidly across the Greek world were coinage, codes of law, and deliberative councils. There was no imperial center but instead a historically distinctive approach to politics.

The typical city-state was republican rather than autocratic, and a substantial part of the adult male population enjoyed participation rights. In Athens, the epoch-making “People’s Revolution” in 508 B.C. resulted in democracy—the strongest form of republicanism the world had ever known—which meant universal native adult male franchise and a commitment to the principles of equal votes and freedom of speech and association. The Greek word demokratia asserted a fact and an aspiration: The people (demos) have the capacity (kratos) to make history.

Aristotle (384-322 B.C.) inhabited a world of dense, urban populations and vast trade in food and labor. By his time, perhaps half of all Greek city-states were democracies, and the philosopher opined that, “now that city-states are even larger, it is difficult for any non-democratic regime to arise.” Wealth was not concentrated at the top: Laborers (citizens, foreigners and slaves alike) commanded wages far above subsistence. Archaeological excavations show that even the houses of those in the lowest quartile of income were spacious and well built. In Athens, citizens promoted ever more open access through new forms of constitutional and commercial law. They supported centers of higher education that laid the foundations of Western thought: first Plato’s Academy and Isocrates’ school of rhetoric; then Aristotle’s Lyceum, Zeno’s Stoa and Epicurus’ Garden.

Socrates, the central figure of Bettany Hughes’s delightful, if occasionally exasperating, book, was born in 469 B.C.—a long generation after Athens’s democratic revolution—and died in 399 B.C., two generations before the Greek world reached its apex of population, economic success and democratization. Yet his Athens was already experiencing what can be considered a golden age of advancements in thought, art and politics. During Socrates’ lifetime, the city also built and lost an Aegean empire that, at its height, encompassed more than 150 communities: a quarter-million Athenian residents and perhaps three times as many of their fellow Greeks.

Ms. Hughes presents a high-octane account of Socrates and his age, based on ancient literary sources, current archaeology and her own fertile imagination. She offers vivid slices of imagined Athenian life, taking us behind the public realm dominated by citizen men and exploring the lives of women, slaves and foreign residents in Socrates’ city. Her biographical sketches of key figures—Pericles, the courtesan Aspasia, the military commander Alcibiades—are rich with lively detail. Here is Ms. Hughes on a religious sanctuary frequented by Athenian girls: “This sacred zone would have resembled an outpost of the rag-trade: when women died in childbirth their clothes were dedicated here; draped, hung and stored around the sanctuary; a limp gift to pitiless Artemis, to whom, probably just a few years from now, the girls would be calling out during the dreadful pangs of labor.” Ms. Hughes weaves a morality tale about the danger of mixing politics with empire, wealth and religion, which ends with the trial of Socrates and the eclipse (she claims) of Athens’s democratic golden age.

Though Ms. Hughes celebrates the Socratic ideal—the idea that a constantly examined life, devoted to seeking the good and true, is the only life worth living—she does not pretend to present new insights into Socratic thought. The payoff comes, instead, through re-situating the origins of moral philosophy in the context of a vibrant cultural and historical milieu. She invites the reader to travel as her companion beyond the familiar byways of Greek history, offering a full menu of “you were there” sounds, smells and textures: noisy Eleusinian cults, stinking excrement, “the dark, the whispers, the unseen skin pricks connecting flesh to flesh” at a symposium. All of this is great fun.

Yet readers must proceed with caution, for her conclusions are questionable. Athenian greatness, for Ms. Hughes, is coterminous with Socrates’ life. “Socrates’ lifespan marked the beginning and an end of an idea—the idealistic vision of an autonomous, tolerant, democratic Athenian city-state,” she writes. Aristotle, who was born 15 years after Socrates’ death and lived in a wealthy, densely populated Greek world in which democracy was increasingly prevalent, would be puzzled by Ms. Hughes’s conclusions. But precipitous rises and tragic falls tend to be attractive to popular historians.

Even more problematic is that Ms. Hughes fails to provide a convincing answer to the central question of her tale: Why was Socrates tried and condemned? Socrates was a public philosopher who could reliably be found conducting dialogues near the bankers’ tables in the Agora. But he was also known to contemporaries as a teacher of aristocrats—including Alcibiades and Critias, who became enemies of the Athenian democracy, the latter in a blood-drenched postwar coup d’état in 404-03 B.C.

Protected by an amnesty, Socrates could not be prosecuted for having taught the tyrant Critias to despise democracy. But the philosopher’s behavior following the democratic restoration was not similarly protected—and both before and after the coup his behavior included engaging others in public dialogues. Athenians had always held citizens legally responsible for the effects of their public speech. Socrates’ dialogues were intended to make his listeners behave differently, and he denied that good ideas ever produced bad behavior. But the monstrous acts of Critias (among others) seemed evidence to the contrary.

And so Socrates’ accusers were free to claim that, whether he realized it or not, he was a public danger, especially to impressionable youths. Since his public speech included sharp criticism of democracy, they could imply that Socrates, in effect, handed out knives to madmen, that when he denied dialogue could ever be dangerous Socrates was disingenuous or deluded, that his irresponsible speech was likely to have bad effects in the future—as it seemingly had in the past.

The charge brought against Socrates, by a team of voluntary prosecutors, was impiety—they accused him of impropriety in respect to the gods and corruption of the youth. I would suggest that Socrates was convicted not because the jurors were religious fanatics, not because they had lost their democratic tolerance, but because he seemed to them unreasonably unwilling to take responsibility for what he said in public. Today we allow pundits to say what they please, even if their speech has pernicious or even fatal effects. I think we are right to do so, and I think that 280 (out of 501) Athenian jurymen were wrong when they voted to condemn Socrates. But I think they were right to believe that when prominent public figures refuse to take personal responsibility for the consequences of their speech, democracy is in grave danger.

Beyond her failure to convincingly interpret Socrates’ trial, Ms. Hughes gets some facts wrong: The Athenian population did not increase five-fold during Socrates’ life; nor did Athens lose four-fifths of its population during the Peloponnesian War. Papyrus does not rot within weeks in the Greek climate. Socrates’ main weapon as a soldier in the Athenian army was not a broad-sword. Pericles was never elected “chief democrat.” The walls of Athens did not separate citizen-haves from non-citizen have-nots. And far from despising the disabled, Athens paid handicapped citizens a daily wage.

Ms. Hughes is also too uncritical of her sources: She offers late Roman accounts of the Athenian persecution of intellectuals, long ago rejected as tendentious fictions by serious scholars, as fact and even inflates them into a systematic regime of censorship, book-burning and execution. She imagines that “scores—perhaps hundreds” of intellectuals shared Socrates’ fate. If this were true, Socrates’ trial would be unsurprising, but Ms. Hughes’s claim is not supported by any credible evidence.

All the same, and despite these glitches, do read this book, both because of its marvelous storytelling and because it will stimulate a desire to learn more about the ancient world. Ms. Hughes’s work joins a growing shelf of books about antiquity that exemplify the honorable goal of responsible popularization. Readers attracted to the splendid range of classical texts excerpted by Ms. Hughes will want to read the full versions; almost all are available in modern translations. (Start with Plato, in the complete edition from Hackett, edited by John Cooper—superb translations, with helpful introductions by one of the world’s leading experts.) Socrates would be pleased if his story awakened a few modern Americans from moral slumber.


I took off the mask and looked in the mirror

I took off the mask and looked in the mirror.
I was the same child I was a year ago.
I hadn’t changed at all . . .

That’s the advantage of knowing how to remove your mask.
You’re still the child,
The past that lives on,
The child.

I took off my mask, and I put it back on.
It’s better this way.
This way I’m the mask.

And I return to normality as to a streetcar terminus.

11 August 1934

— Álvaro de Campos (Fernando Pessoa)

Bother Me, I’m Thinking

By Jonah Lehrer
wsj.com

We live in a time that worships attention. When we need to work, we force ourselves to focus, to stare straight ahead at the computer screen. There’s a Starbucks on seemingly every corner—caffeine makes it easier to concentrate—and when coffee isn’t enough, we chug Red Bull.

In fact, the ability to pay attention is considered such an essential life skill that the lack of it has become a widespread medical problem. Nearly 10% of American children are now diagnosed with attention-deficit hyperactivity disorder (ADHD).

In recent years, however, scientists have begun to outline the surprising benefits of not paying attention. Sometimes, too much focus can backfire; all that caffeine gets in the way. For instance, researchers have found a surprising link between daydreaming and creativity—people who daydream more are also better at generating new ideas. Other studies have found that employees are more productive when they’re allowed to engage in “Internet leisure browsing” and that people unable to concentrate due to severe brain damage actually score above average on various problem-solving tasks.

A new study led by researchers at the University of Memphis and the University of Michigan extends this theme. The scientists measured the success of 60 undergraduates in various fields, from the visual arts to science. They asked the students if they’d ever won a prize at a juried art show or been honored at a science fair. In every domain, students who had been diagnosed with attention-deficit disorder achieved more: Their inability to focus turned out to be a creative advantage.

And this lesson doesn’t just apply to people with a full-fledged disorder. A few years ago, scientists at the University of Toronto and Harvard gave a short mental test to 86 Harvard undergraduates. The test was designed to measure their ability to ignore irrelevant stimuli, such as the air-conditioner humming in the background or the conversation taking place nearby. This skill is typically seen as an essential component of productivity, since it keeps people from getting distracted by extraneous information.

Here’s where the data get interesting: Those undergrads who had a tougher time ignoring unrelated stuff were also seven times more likely to be rated as “eminent creative achievers” based on their previous accomplishments. (The association was particularly strong among distractible students with high IQs.)

According to the scientists, the inability to focus helps ensure a richer mixture of thoughts in consciousness. Because these people struggled to filter the world, they ended up letting everything in. They couldn’t help but be open-minded.

Such lapses in attention turn out to be a crucial creative skill. When we’re faced with a difficult problem, the most obvious solution—that first idea we focus on—is probably wrong. At such moments, it often helps to consider far-fetched possibilities, to approach the task from an unconventional perspective. And this is why distraction is helpful: People unable to focus are more likely to consider information that might seem irrelevant but will later inspire the breakthrough. When we don’t know where to look, we need to look everywhere.

This doesn’t mean, of course, that attention isn’t an important mental skill, or that attention-deficit disorders aren’t a serious problem. There’s clearly nothing advantageous about struggling in the classroom, or not being able to follow instructions. (It’s also worth pointing out that these studies all involve college students, which doesn’t tell us anything about those kids with ADHD who fail to graduate from high school. Distraction might be a cognitive luxury that not everyone can afford.)

Nevertheless, this new research demonstrates that, for a certain segment of the population, distractibility can actually be a net positive. Although we think that more attention can solve everything—that the best strategy is always a strict focus fueled by triple espressos—that’s not the case. Sometimes, the most productive thing we can do is surf the Web and eavesdrop on that conversation next door.

Where Have the Good Men Gone?

By Kay Hymowitz
wsj.com

Not so long ago, the average American man in his 20s had achieved most of the milestones of adulthood: a high-school diploma, financial independence, marriage and children. Today, most men in their 20s hang out in a novel sort of limbo, a hybrid state of semi-hormonal adolescence and responsible self-reliance. This “pre-adulthood” has much to recommend it, especially for the college-educated. But it’s time to state what has become obvious to legions of frustrated young women: It doesn’t bring out the best in men.

“We are sick of hooking up with guys,” writes the comedian Julie Klausner, author of a touchingly funny 2010 book, “I Don’t Care About Your Band: What I Learned from Indie Rockers, Trust Funders, Pornographers, Felons, Faux-Sensitive Hipsters and Other Guys I’ve Dated.” What Ms. Klausner means by “guys” is males who are not boys or men but something in between. “Guys talk about ‘Star Wars’ like it’s not a movie made for people half their age; a guy’s idea of a perfect night is a hang around the PlayStation with his bandmates, or a trip to Vegas with his college friends…. They are more like the kids we babysat than the dads who drove us home.” One female reviewer of Ms. Klausner’s book wrote, “I had to stop several times while reading and think: Wait, did I date this same guy?”

For most of us, the cultural habitat of pre-adulthood no longer seems noteworthy. After all, popular culture has been crowded with pre-adults for almost two decades. Hollywood started the affair in the early 1990s with movies like “Singles,” “Reality Bites,” “Single White Female” and “Swingers.” Television soon deepened the relationship, giving us the agreeable company of Monica, Joey, Rachel and Ross; Jerry, Elaine, George and Kramer; Carrie, Miranda, et al.

But for all its familiarity, pre-adulthood represents a momentous sociological development. It’s no exaggeration to say that having large numbers of single young men and women living independently, while also having enough disposable income to avoid ever messing up their kitchens, is something entirely new in human experience. Yes, at other points in Western history young people have waited well into their 20s to marry, and yes, office girls and bachelor lawyers have been working and finding amusement in cities for more than a century. But their numbers and their money supply were always relatively small. Today’s pre-adults are a different matter. They are a major demographic event.

What also makes pre-adulthood something new is its radical reversal of the sexual hierarchy. Among pre-adults, women are the first sex. They graduate from college in greater numbers (among Americans ages 25 to 34, 34% of women now have a bachelor’s degree but just 27% of men), and they have higher GPAs. As most professors tell it, they also have more confidence and drive. These strengths carry women through their 20s, when they are more likely than men to be in grad school and making strides in the workplace. In a number of cities, they are even out-earning their brothers and boyfriends.

Still, for these women, one key question won’t go away: Where have the good men gone? Their male peers often come across as aging frat boys, maladroit geeks or grubby slackers—a gender gap neatly crystallized by the director Judd Apatow in his hit 2007 movie “Knocked Up.” The story’s hero is 23-year-old Ben Stone (Seth Rogen), who has a drunken fling with Allison Scott (Katherine Heigl) and gets her pregnant. Ben lives in a Los Angeles crash pad with a group of grubby friends who spend their days playing videogames, smoking pot and unsuccessfully planning to launch a porn website. Allison, by contrast, is on her way up as a television reporter and lives in a neatly kept apartment with what appear to be clean sheets and towels. Once she decides to have the baby, she figures out what needs to be done and does it. Ben can only stumble his way toward being a responsible grownup.

So where did these pre-adults come from? You might assume that their appearance is a result of spoiled 24-year-olds trying to prolong the campus drinking and hook-up scene while exploiting the largesse of mom and dad. But the causes run deeper than that. Beginning in the 1980s, the economic advantage of higher education—the “college premium”—began to increase dramatically. Between 1960 and 2000, the percentage of younger adults enrolled in college or graduate school more than doubled. In the “knowledge economy,” good jobs go to those with degrees. And degrees take years.

Another factor in the lengthening of the road to adulthood is our increasingly labyrinthine labor market. The past decades’ economic expansion and the digital revolution have transformed the high-end labor market into a fierce competition for the most stimulating, creative and glamorous jobs. Fields that attract ambitious young men and women often require years of moving between school and internships, between internships and jobs, laterally and horizontally between jobs, and between cities in the U.S. and abroad. The knowledge economy gives the educated young an unprecedented opportunity to think about work in personal terms. They are looking not just for jobs but for “careers,” work in which they can exercise their talents and express their deepest passions. They expect their careers to give shape to their identity. For today’s pre-adults, “what you do” is almost synonymous with “who you are,” and starting a family is seldom part of the picture.

Pre-adulthood can be compared to adolescence, an idea invented in the mid-20th century as American teenagers were herded away from the fields and the workplace and into that new institution, the high school. For a long time, the poor and recent immigrants were not part of adolescent life; they went straight to work, since their families couldn’t afford the lost labor and income. But the country had grown rich enough to carve out space and time to create a more highly educated citizenry and work force. Teenagers quickly became a marketing and cultural phenomenon. They also earned their own psychological profile. One of the most influential of the psychologists of adolescence was Erik Erikson, who described the stage as a “moratorium,” a limbo between childhood and adulthood characterized by role confusion, emotional turmoil and identity conflict.

Like adolescents in the 20th century, today’s pre-adults have been wait-listed for adulthood. Marketers and culture creators help to promote pre-adulthood as a lifestyle. And like adolescence, pre-adulthood is a class-based social phenomenon, reserved for the relatively well-to-do. Those who don’t get a four-year college degree are not in a position to compete for the more satisfying jobs of the knowledge economy.

But pre-adults differ in one major respect from adolescents. They write their own biographies, and they do it from scratch. Sociologists use the term “life script” to describe a particular society’s ordering of life’s large events and stages. Though such scripts vary across cultures, the archetypal plot is deeply rooted in our biological nature. The invention of adolescence did not change the large Roman numerals of the American script. Adults continued to be those who took over the primary tasks of the economy and culture. For women, the central task usually involved the day-to-day rearing of the next generation; for men, it involved protecting and providing for their wives and children. If you followed the script, you became an adult, a temporary custodian of the social order until your own old age and demise.

Unlike adolescents, however, pre-adults don’t know what is supposed to come next. For them, marriage and parenthood come in many forms, or can be skipped altogether. In 1970, just 16% of Americans ages 25 to 29 had never been married; today that’s true of an astonishing 55% of the age group. In the U.S., the mean age at first marriage has been climbing toward 30 (a point past which it has already gone in much of Europe). It is no wonder that so many young Americans suffer through a “quarter-life crisis,” a period of depression and worry over their future.

Given the rigors of contemporary career-building, pre-adults who do marry and start families do so later than ever before in human history. Husbands, wives and children are a drag on the footloose life required for the early career track and identity search. Pre-adulthood has also confounded the primordial search for a mate. It has delayed a stable sense of identity, dramatically expanded the pool of possible spouses, mystified courtship routines and helped to throw into doubt the very meaning of marriage. In 1970, to cite just one of many numbers proving the point, nearly seven in 10 25-year-olds were married; by 2000, only one-third had reached that milestone.

American men have been struggling with finding an acceptable adult identity since at least the mid-19th century. We often hear about the miseries of women confined to the domestic sphere once men began to work in offices and factories away from home. But it seems that men didn’t much like the arrangement either. They balked at the stuffy propriety of the bourgeois parlor, as they did later at the banal activities of the suburban living room. They turned to hobbies and adventures, like hunting and fishing. At midcentury, fathers who at first had refused to put down the money to buy those newfangled televisions changed their minds when the networks began broadcasting boxing matches and baseball games. The arrival of Playboy in the 1950s seemed like the ultimate protest against male domestication; think of the refusal implied by the magazine’s title alone.

In his disregard for domestic life, the playboy was prologue for today’s pre-adult male. Unlike the playboy with his jazz and art-filled pad, however, our boy rebel is a creature of the animal house. In the 1990s, Maxim, the rude, lewd and hugely popular “lad” magazine, arrived from England. Its philosophy and tone were so juvenile, so entirely undomesticated, that it made Playboy look like Camus.

At the same time, young men were tuning in to cable channels like Comedy Central, the Cartoon Network and Spike, whose shows reflected the adolescent preferences of their targeted male audiences. They watched movies with overgrown boy actors like Steve Carell, Luke and Owen Wilson, Jim Carrey, Adam Sandler, Will Ferrell and Seth Rogen, cheering their awesome car crashes, fart jokes, breast and crotch shots, beer pong competitions and other frat-boy pranks. Americans had always struck foreigners as youthful, even childlike, in their energy and optimism. But this was too much.

What explains this puerile shallowness? I see it as an expression of our cultural uncertainty about the social role of men. It’s been an almost universal rule of civilization that girls became women simply by reaching physical maturity, but boys had to pass a test. They needed to demonstrate courage, physical prowess or mastery of the necessary skills. The goal was to prove their competence as protectors and providers. Today, however, with women moving ahead in our advanced economy, husbands and fathers are now optional, and the qualities of character men once needed to play their roles—fortitude, stoicism, courage, fidelity—are obsolete, even a little embarrassing.

Today’s pre-adult male is like an actor in a drama in which he only knows what he shouldn’t say. He has to compete in a fierce job market, but he can’t act too bossy or self-confident. He should be sensitive but not paternalistic, smart but not cocky. To deepen his predicament, because he is single, his advisers and confidants are generally undomesticated guys just like him.

Single men have never been civilization’s most responsible actors; they continue to be more troubled and less successful than men who deliberately choose to become husbands and fathers. So we can be disgusted if some of them continue to live in rooms decorated with “Star Wars” posters and crushed beer cans and to treat women like disposable estrogen toys, but we shouldn’t be surprised.

Relatively affluent, free of family responsibilities, and entertained by an array of media devoted to his every pleasure, the single young man can live in pig heaven—and often does. Women put up with him for a while, but then in fear and disgust either give up on any idea of a husband and kids or just go to a sperm bank and get the DNA without the troublesome man. But these rational choices on the part of women only serve to legitimize men’s attachment to the sandbox. Why should they grow up? No one needs them anyway. There’s nothing they have to do.

They might as well just have another beer.

Virtually You / Reality Is Broken

By WILLIAM SALETAN
Published: February 11, 2011
nyt.com

Humanity is migrating to cyberspace. In the past five years, Americans have doubled the hours they spend online, exceeding their television time and more than tripling the time they spend reading newspapers or magazines. Most now play computer or video games regularly, about 13 hours a week on average. By age 21, the average young American has spent at least three times as many hours playing virtual games as reading. It took humankind eight years to spend 100 million hours building Wikipedia. We now spend at least 200 million hours a week playing World of Warcraft.

Elias Aboujaoude, a Silicon Valley psychiatrist, finds this alarming. In “Virtually You,” he argues that the Internet is unleashing our worst instincts. It connects you to whatever you want: gambling, overspending, sex with strangers. It speeds transactions, facilitating impulse purchases and luring you away from the difficulties of real life. It lets you customize your fantasies and select a date from millions of profiles, sapping your patience for imperfect partners. It lets you pick congenial news sources and avoid contrary views and information. It conceals your identity, freeing you to be vitriolic or dishonest. It shields you from detection and disapproval, emboldening you to download test answers and term papers. It hides the pain of others, liberating your cruelty in games and forums. It rewards self-promotion on blogs and Facebook. It teaches you how to induce bulimic vomiting or kill yourself.

In short, everything you thought was good about the Internet — information, access, personalization — is bad. Aboujaoude isn’t shy in his indictment. He links the Internet to consumer debt, the housing crash, eating disorders, sexually transmitted infections, psychopathy, racism, terrorism, child sexual abuse, suicide and murder. Everything online worries him: ads, hyperlinks, even emoticons. The Internet makes us too quarrelsome. It makes us too like-minded. It makes us work too little. It makes us work too much.

In part, this grim view stems from Aboujaoude’s work. He sees patients with online compulsions. He believes in the Freudian id — a shadowy swirl of infantile impulses — and perceives its modern incarnation in what he calls the “e-personality,” a parallel identity that hijacks your mind online. In the physical world, your superego restrains your id. But in the virtual world, where you can instantly fulfill your whims, the narcissism and grandiosity of the e-personality run wild.

To Aboujaoude, the Internet is a mechanical alien, “a new type of machine . . . that can efficiently prey on our basic instincts.” It converts children into bullies “almost automatically.” It turned Philip Markoff, the accused “Craigslist killer,” who committed suicide in jail, into a serial assailant. Lori Drew, the woman whose online impersonation of a teenage boy supposedly drove a girl to suicide, seemed normal until “the Internet made her fleeting dark wish . . . take on a life of its own.” Again and again, computers get the blame.

Jane McGonigal, the author of “Reality Is Broken,” sees the Internet differently. She’s a game designer. To her, the virtual world isn’t a foreign contraption. It’s our own evolving creation. She agrees that bad online games can addict people, make them belligerent, distract them from reality and leave them empty. But this is our fault, not the Internet’s. When virtual life brings out the worst in us, redesign it.

If Aboujaoude is the Internet’s Hobbes, McGonigal is its Rousseau. In the rise of multiplayer games, she sees a happier picture of human nature — a thirst for community, a craving for hard work and a love of rules. This, she argues, is the essence of games: rules, a challenge and a shared objective. The trick is to design games that reward good behavior. The Internet’s unprecedented power, its ability to envelop and interact with us, is a blessing, not a threat. We can build worlds in which nice guys finish first.

The point isn’t just to enhance virtual reality. It’s to fix the real world, too. McGonigal offers several examples, some of which she helped create. Chore Wars, an alternate-reality game, builds positive attitudes toward housework by rewarding virtual housework. Cruel 2 B Kind invites players to “kill” competitors with smiles or compliments. The Extraordinaries hands out missions like one in which the player must GPS-tag a defibrillator so its location can be registered for later use. Groundcrew assigns players to help people with transportation, shopping or housekeeping.

The premise is that since games motivate us more effectively than real life, making them altruistic and bringing them into the physical world will promote altruistic behavior. But is this motivating power transferable? What draws us to virtual worlds, McGonigal notes, is their “carefully designed pleasures” and “thrilling challenges” customized to our strengths. They’re never boring. They let us choose our missions and control our work flow. They make us feel powerful. They offer “a guarantee of productivity” in every quest. And when we fail, they make our failure entertaining.

Reality doesn’t work this way. Floors need scrubbing. Garbage needs hauling. Invalids need their bedpans washed. This work isn’t designed for your pleasure or stimulation. It just needs to be done.

McGonigal points to studies suggesting that games that reward socially constructive behavior promote such behavior in real life. But the only outputs measured by these studies are self-reported values, self-reported behavior in the real world, and objectively measured behavior in games. Where’s the reliable evidence that this data translates to people’s doing more real work? Projects like Groundcrew, McGonigal concedes, have produced “modest if any results so far.” Hundreds of thousands of people play Free Rice, a game designed to feed the hungry, but the rice comes from advertisers, not players. Thousands sign up every day for Folding@home, a game to cure diseases, but all these players contribute is processing power on game consoles.

If reality is inherently less attractive than games, then the virtual world won’t save the physical world. It will empty it. Millions of gamers, in McGonigal’s words, are “opting out” of the bummer of real life. And they aren’t coming back. Halo 3, for example, has become a complete virtual world, with its own history documented in an online museum and Ken Burns-style videos. McGonigal calls this war game a model for inspiring mass cooperation. Two years ago, its 15 million players reached a long-sought objective: They killed their 10 billionth alien. “Fresh off one collective achievement, Halo players were ready to tackle an even more monumental goal,” McGonigal writes. And what goal did they choose? Feeding the hungry? Clothing the poor? No. The new goal was to kill 100 billion aliens.

Game designers can’t be counted on to arrest this trend. McGonigal says the game industry wants to help users avoid addiction so that they’ll remain functional and keep buying its products. But we’ve heard that argument before from the tobacco industry. Addiction, as a business model, is too addictive to give up. She says Foursquare, a game that rewards you for going out with friends and “checking in” at restaurants, promotes sociability. That would be nice, but the game’s Web site devotes a whole section (“Foursquare for Business”) to commercial exploitation.

The Internet isn’t heaven. It isn’t hell, either. It’s just another new world. Like other worlds, it can be civilized. It will need rules, monitoring and benevolent designers who understand the flaws of its inhabitants. If Aboujaoude is right about our weakness for virtual vice, we’ll need all the McGonigals we can get.

Céline, absolute bastard

Perhaps only in France could the novelist who was also author, in Vichy France, of antisemitic rants, be considered a candidate for state-sponsored celebration.

Every year, the French government publishes a list of cultural events and personalities to be commemorated over the next 12 months. Compiling it is a lengthy and carefully considered process. A High Committee of National Celebrations draws up a provisional list, which is then submitted to the Culture Ministry and, once approved, published in book form. Some 10,000 copies of the Recueil des Célébrations nationales 2011 were printed last autumn ahead of last week’s launch. Frédéric Mitterrand – the culture minister lui-même – had even penned a foreword, proving beyond a shadow of doubt that the project had received his imprimatur. However, when word got out that Louis-Ferdinand Céline was to feature alongside the likes of Blaise Cendrars, Théophile Gautier, Franz Liszt and Georges Pompidou, all hell broke loose.

Serge Klarsfeld, the country’s most famous Nazi hunter and Holocaust memorialist, expressed his indignation in the name of the Association of Sons and Daughters of Jews Deported from France. The Republic, he argued, shouldn’t celebrate “the most antisemitic” Frenchman of his day – a time, lest we forget, when antisemitism was so rife that it led to state-sanctioned Jewish persecution under the Vichy regime. Mitterrand’s decision, two days later, to remove the novelist from the list was logical in light of this backlash, but also somewhat surprising, since he must have known from the start that Céline’s inclusion would prove controversial (was he protecting Sarkozy, whose favourite author happens to be Céline?).

Far more surprising, however, was the reaction of the French intelligentsia, who were almost unanimous in their defence of the author of Journey to the End of the Night. Literary heavyweight Philippe Sollers accused the Culture Ministry of “censorship”. Frédéric Vitoux, a member of the prestigious Académie française who wrote a biography of Céline, likened this decision to the airbrushing of history under Stalin. Pop philosopher Alain Finkielkraut feared that some people would draw the conclusion that a “Jewish lobby” was dictating policy to the French government. Bernard-Henri Lévy, another celebrity philosopher, claimed that the commemoration of Céline’s death should have been an opportunity to try to understand how a “truly great author” can also be an “absolute bastard”. Even more surprising, perhaps, was the fact that Serge Klarsfeld himself felt the need to declare that he rated Céline as a “great writer” before going on to describe him as a “despicable human being”.

France is a place where authors and artists are granted a special status – a kind of poetic licence or artistic immunity. In fact, the country continues to view itself, and sometimes to be regarded as, the natural second home of all artists. It is this very liberal attitude which attracted many members of the Lost and Beat generations after the second world war, and that still attracts outsider writers such as Dennis Cooper. Some of the greatest works of contemporary fiction in English – Joyce’s Ulysses, Nabokov’s Lolita or Burroughs’s Naked Lunch – were available in France when they were banned or considered unpublishable in Britain or the US. A telling culture shock occurred on live television, in 1990, when a journalist from Quebec told Gabriel Matzneff that only in Paris would he be feted for writing – however exquisitely – about his amorous liaisons with underage partners (of both genders). Anywhere else, she argued, he would probably end up in prison. The journalist was subsequently depicted as a philistine, unable to appreciate the subtlety of Matzneff’s feelings or the beauty of his style. Baudelaire once wrote that “literature and the arts pursue an aim independent of morality” and, for better or worse, this has clearly become France’s official artistic credo.

Trying to account for this “exception française” is no mean task, but I suspect it has something to do with the elevation of art to the status of surrogate religion during the second half of the 19th century. A similar phenomenon was taking place all over Europe, of course, but it probably had more resonance against the backdrop of the ongoing struggle between Republicanism and Catholicism. Both Flaubert and Baudelaire were prosecuted for public obscenity, but when French MPs called for the banning of Jean Genet’s The Screens in 1966 (for political reasons, this time), the culture minister (and novelist) André Malraux immediately stepped in to defend the inviolability of artistic freedom. By then, artistic creation was largely considered a value in itself, beyond morals and politics; even beyond good and evil.

The dominant French view on literature was probably best expressed by Oscar Wilde, who ended his life in exile in Paris: “There is no such thing as a moral or an immoral book. Books are well written, or badly written. That is all.” Not quite, though, in this case. No one is denying Céline’s talent as one of the greatest French writers of the 20th century – probably the greatest, with Proust. Is it possible, however, to distinguish the author of antisemitic tracts from the genius novelist; the man from the artist?

— From guardian.co.uk

Spattered with Words

She looked at him helplessly. He seemed serious. But he had seemed serious when he spoke of putting on his gems and lemon, etc. She felt, as she felt so often with Murphy, spattered with words that went dead as soon as they sounded; each word obliterated, before it had time to make sense, by the word that came next; so that in the end she did not know what had been said. It was like difficult music heard for the first time.

— Samuel Beckett, Murphy

The horse leech’s daughter is a closed system

“I greatly fear,” said Wylie, “that the syndrome known as life is too diffuse to admit of palliation. For every symptom that is eased, another is made worse. The horse leech’s daughter is a closed system. Her quantum of wantum cannot vary.”

“Very prettily put,” said Neary.

— Samuel Beckett, Murphy

Ventilate a virility

“And your whiskers?” said Wylie.

“Suppressed without pity,” said Neary, “in discharge of a vow, never again to ventilate a virility denied discharge into its predestined channel.”

“These are dark sayings,” said Wylie.

— Samuel Beckett, Murphy

Cross-eyed, nose-picking turpitude

I had spent so much of my childhood sunk into a cross-eyed, nose-picking turpitude of shame and self-loathing, scrunched up in the corner of a sweating leather chair on a hot summer day, the heat having silenced the birds . . .

— Edmund White, A Boy’s Own Story

Crapulence (…excellent…)

Smithers had thwarted my earlier attempt to take candy from a baby, but with him out of the picture, I was free to wallow in my own crapulence.

— Mr. Burns