What should young people do with their lives today?

What should young people do with their lives today? Many things, obviously. But the most daring thing is to create stable communities in which the terrible disease of loneliness can be cured.

— Kurt Vonnegut Jr. (via Sleepz and Thinkz)

If something could happen right now

If something could happen right now that is not likely to or impossible but that would really cheer you up if it did, just light you up like a child again, what would it be?

— Padgett Powell, The Interrogative Mood

Sunset

The cabin; by the stern windows; Ahab sitting alone, and gazing out.

I leave a white and turbid wake; pale waters, paler cheeks, where’er I sail. The envious billows sidelong swell to whelm my track; let them; but first I pass.

Yonder, by the ever-brimming goblet’s rim, the warm waves blush like wine. The gold brow plumbs the blue. The diver sun — slow dived from noon, — goes down; my soul mounts up! she wearies with her endless hill. Is, then, the crown too heavy that I wear? this Iron Crown of Lombardy! Yet is it bright with many a gem; I, the wearer, see not its far flashings; but darkly feel that I wear that, that dazzlingly confounds. ‘Tis iron — that I know — not gold. ‘Tis split, too — that I feel; the jagged edge galls me so, my brain seems to beat against the solid metal; aye, steel skull, mine; the sort that needs no helmet in the most brain-battering fight!

Dry heat upon my brow? Oh! time was, when as the sunrise nobly spurred me, so the sunset soothed. No more. This lovely light, it lights not me; all loveliness is anguish to me, since I can ne’er enjoy. Gifted with the high perception, I lack the low, enjoying power; damned, most subtly and most malignantly! damned in the midst of Paradise! Good night — good night! (waving his hand, he moves from the window.)

‘Twas not so hard a task. I thought to find one stubborn, at the least; but my one cogged circle fits into all their various wheels, and they revolve. Or, if you will, like so many ant-hills of powder, they all stand before me; and I their match. Oh, hard! that to fire others, the match itself must needs be wasting! What I’ve dared, I’ve willed; and what I’ve willed, I’ll do! They think me mad — Starbuck does; but I’m demoniac, I am madness maddened! That wild madness that’s only calm to comprehend itself! The prophecy was that I should be dismembered; and — Aye! I lost this leg. I now prophesy that I will dismember my dismemberer. Now, then, be the prophet and the fulfiller one. That’s more than ye, ye great gods, ever were. I laugh and hoot at ye, ye cricket-players, ye pugilists, ye deaf Burkes and blinded Bendigoes! I will not say as schoolboys do to bullies, — Take some one of your own size; don’t pommel me! No, ye’ve knocked me down, and I am up again; but ye have run and hidden. Come forth from behind your cotton bags! I have no long gun to reach ye. Come, Ahab’s compliments to ye; come and see if ye can swerve me. Swerve me? ye cannot swerve me, else ye swerve yourselves! man has ye there. Swerve me? The path to my fixed purpose is laid with iron rails, whereon my soul is grooved to run. Over unsounded gorges, through the rifled hearts of mountains, under torrents’ beds, unerringly I rush! Naught’s an obstacle, naught’s an angle to the iron way!

Moby Dick, chapter 37

Daydream Belieber

By James Parker
theatlantic.com

I get it, believe me—the fever, the yearning, the collapsed distances. I once wrote a letter to Jimmy Osmond, transatlantically, inviting him to come and live with me over the summer holidays. And Jimmy, compared with this, was just a pie-faced kid in a white suit. This is perfection. This is a boy suspended above the masses in a heart-shaped trellis, a boy with his baseball hat reversed, cradling an acoustic guitar and exhorting his fans to prayer. This is a dancing hairstyle and an unbroken voice. This is the lyrical distillate of 50 years of hysteria. This is Galahad in puffy sneakers, brandishing his virginity like a lightsaber. This is barely even pop music—this is angelology.

By this, of course, I mean Justin Bieber, age 16 and eleven-twelfths. You are aware of him. He has sold a zillion records and stormed a zillion hot little hearts. What do you make of that mop of his? If you’re an un-Belieber, a hater, it makes him look like a Lego character with its hairpiece on backward; if you’re a fan, it’s a sign. Smooshed forward from his crown by the gales of love that pursue him always, laid on impasto by the Great Artist but then tapering into tip-of-the-brush teasings at his temples, it is beautiful. Bieber is beautiful. His music is beautiful. Some of it, anyway. “Baby,” even with the clonking middle-section rap from Ludacris, is elemental sentimental pop, three and a half minutes of spray-on serotonin and tingling Brill Building strings. “My first love broke my heart for the first time!” (Had Ludacris been available to the Brill Building songwriters, no doubt they would have used him too.) “U Smile” is a gorgeous Jacksonoid piano-pumper, with Bieber suffering chivalric agonies—“Your lips, my biggest weakness / Shouldn’t have let you know / I’m always gonna do what they say”—as his voice bears the melody aloft on a cluster of vowel sounds plump as Renaissance putti. Hey-yey! Wo-woah!

This year, just in time for Valentine’s Day, came the Bieber movie, a full-length in-cinemas-everywhere life-of-Justin 3-D epic called Justin Bieber: Never Say Never. Commercially speaking, this is rather remarkable. No one ever made a movie about Nik Kershaw. Perhaps, in preparation for the event, you already purchased your special $30 pre-release package (includes glow stick, bracelet, souvenir VIP laminate, and purple 3-D glasses). Never Say Never is Bieber’s Truth or Dare, his ABBA: The Movie, his (in a way) The Man Who Fell to Earth, and it follows other works of Biebography in emphasizing the up-from-Nowheresville side of the story. Proclaims the trailer: THEY SAID IT WOULD NEVER HAPPEN. (Did they?) THEY SAID HE WOULD NEVER MAKE IT. (Who said that?) Whatever—the point is that Bieber overcame the odds. Born in 1994 and raised in Stratford, Ontario, with only his huge and unmistakable charisma to help him, he dramatically rescued himself from a null state of non-superstardom and Canadianness.

How did he do it? With YouTube, that’s how. Kissed in his cradle by the witch of the Web, Justin was throwing up little promo reels by the time he was 12. Singing a Brian McKnight song into the bathroom mirror. Or sitting on some municipal steps somewhere, busking mightily about the Lord: “You’re my God and my Fa-ther!” he bellows through the legs of passersby, the wooden body of his guitar reverberating with his shouts. “You’re forever the same / How could I feel so empty now when I speak your name?” (The song is an altered version of 1998’s “Refine Me,” by Jennifer Knapp.) He’s a handy little drummer, the clips show, and a nifty little dancer. He can blast through a ballad but is equally at home with the nibbled syllables and melismata of contemporary R&B. And the, ahem, “star quality” is impossible to miss: the insolent naturalness before the camera, the lifting eyebrows and swerving octaves.

His mother took him to church and envisioned him as a singing prophet, but the world of Pop was calling to him, calling—had been all along. “The day I was born,” records his 2010 memoir, First Step 2 Forever: My Story, “Celine Dion was solid at #1 on the Billboard Hot 100 with ‘The Power of Love.’”

Those YouTube videos garnered thousands of hits, eyeballs, unique visitors, one of whom was the Atlanta hip-hop manager Scooter Braun. Braun tracked Bieber to Stratford and flew him to Atlanta to meet the industry; Bieber signed a production deal with Usher, and the rest is history, or soon will be. Bieber wasn’t from the Disney factory, and he didn’t have a show on Nickelodeon, so the marketing plan was skewed toward his already established constituency in social media: lots of Facebooking and YouTubing and sugary tweets to his millions-strong Twitter army. “Music is the universal language no matter the country we are born in or the color of our skin. Brings us all together.” (May 19, 2010, 11:37 a.m.) Braun described the strategy to The New York Times: “We’ll give it to kids, let them do the work, so that they feel like it’s theirs.” The kids have performed reliably. “I’ve been a fan of him from when he posted his first video,” testifies a hard-core Belieber in Never Say Never, “and I’ll be his fan ’til he posts his last video.”

The Monkees had their TV show; Bieber has the Web. Carl Wilson, author of the acclaimed book on Celine Dion, Let’s Talk About Love: A Journey to the End of Taste, has noted Bieber’s “McLuhanesque fluency” in cyberspace. It comes with a price. Sex slurs, death rumors, pranks, random venom, swarmings of hostility and obscenity: the underbrain of the Internet is magnetized to Justin Bieber. But the Beliebers have his back, furious little proselytes. Slag him off in a comments thread and they’ll be all over you: “You dumbass, every one like you hate JB cuz you envied him, actually I just don’t know why you hate him!!!” The occasional wistful note of grievance sounded via Twitter from Bieber HQ—“It’s funny when I read things about myself that are just not true. Why would people take time out from their day to hate on a sixteen-year-old?”—suffices to keep their indignation on the boil.

His claim upon their hearts is no mystery. The kid is teenybop-perfect, eagerly suspending his charm-particles in the requisite solution of pure nonentity. He smiles, he thanks, he praises; he displays a persona from which every hint of psychology has been combed out; and he wonders, mildly but without cease, just who the right girl for him might be. “I haven’t been in love yet but I’ve felt love,” he told M magazine in 2009. “It’s a beautiful emotion that you can’t really describe.” No matter that he could, presumably, be slathered in gratifications at the snap of his princely fingers; or that he drops a saucy hint, here and there, about Beyoncé or Rihanna or Kim Kardashian … Justin is alone, silhouetted against the blast of his fame. He pines nonspecifically, a knight-errant with cloche hair at the foot of an invisible tower. “Is she out there?” he calls, echoingly, at the end of “Somebody to Love.” “Is she out there?” And the pulse of longing goes forth, like sonar, to reverberate in the cells of a million unformed libidos.

It’s cruel, really. And they’ve been doing it for decades. Here’s David Cassidy in 1971, meek and un-phallic, soliloquizing his way through the Partridge Family’s “Doesn’t Somebody Want to Be Wanted”: “Y’know, I’m no different from anybody else. I start each day, and I end each night, and it gets really lonely when you’re by yourself. Now where is love? And who is love? I gotta know.”

But David Cassidy performed the rest of that song in a pushy, forgettable ’70s tenor. The Bieber voice—as captured, say, in the acoustic version of “One Time”—is more druglike: a chemical purity of tone with something frictional in the breath, a soft croak or crackle, the faintest rasp of incoming puberty. Potent stuff: the philosopher’s stone of the music industry. Add to it the shimmer of quasi-religious transcendence, the Protestant Pop ethic with which a video like “Pray” is loaded up—images of Justin embracing unfortunates while he sings about closing his eyes and seeing a better day—and you have a product to knock the Archies, the Bay City Rollers, and the Backstreet Boys right on their bubblegum asses.

Naturally, none of this can last. “I’m going down, down, down, down,” he sings in “Baby,” the voice itself depthless, evaporating as it hits the lower range. But he is going down. His collision with biology can be postponed no longer. Gravity, muscles, sag, paunch, depression, hair growing in the ears … All too soon, all too soon. For the music industry, for Justin Bieber, for us poor fools who adore him, this is the magic moment. And to gaze upon it without blinking, you will need those purple 3-D glasses.

The 275th lay

. . . I am one of those that never take on about princely fortunes, and am quite content if the world is ready to board and lodge me, while I am putting up at this grim sign of the Thunder Cloud.

Moby Dick, “The Ship”

Professor Bem’s case for ESP

By Dan Kois
nymag.com

I stare at the two curtains side by side on my computer screen. I try to focus on the task at hand: Which image has a photo hidden behind it? And what might it be? The alpine lake at sunset? The loving husband embracing his wife?

I choose the curtain on the left. Behind it are a naked man and woman, fucking.

The pair’s sleek, airbrushed bodies flash on my monitor for precisely two seconds, long enough for me to wonder: Did they know? When these two posed, could they guess that one day this JPEG would wind up on a Mac Mini in a lab at Cornell University? Did he know that from his depilated testicles might be launched the first salvo in the war against the ESP skeptics? Did she know her O-face might change the face of science? Could they see the future?

Maybe so, if you believe the research of Daryl Bem. According to “Feeling the Future,” a peer-reviewed paper the APA’s Journal of Personality and Social Psychology will publish this month, Bem has found evidence supporting the existence of precognition. The experiment I’m trying, one of nine Bem cites in his study, asks me to guess which of two curtains hides a photograph. (Some of the images are erotic, some neutral, in an attempt to see if different kinds of photos have different effects.) If mere chance governed each guess, I’d be right 50 percent of the time. Naturally, I’d guess correctly more like 100 percent of the time if you showed me where the photo was before I chose.

But what about if you showed me the photo’s location immediately after I chose? Perhaps, if I had ESP, I could peek into the future and improve my guesswork, even just a little bit. Over seven years, Bem tested more than 1,000 subjects in this very room, and he believes he’s demonstrated that some mysterious force gives humans just the slightest leg up on chance.

Responses to Bem’s paper by the scientific community have ranged from arch disdain to frothing rejection. And in a rebuttal—which, uncommonly, is being published in the same issue of JPSP as Bem’s article—another scientist suggests that not only is this study seriously flawed, but it also foregrounds a crisis in psychology itself.

The scourge of responsible psychological research stands behind me, wearing a red cardigan and an expression of great interest. “How were your results?” Bem asks. He points out that I scored better predicting the location of erotic photos—in Bem’s hypothesis, more arousing images are more likely to inspire ESP—than I did boring old landscapes and portraits. In this dingy lab in the basement of an Ivy League psych department, is the future now?

Even before Daryl Bem, 72, began studying ESP, he was a mind reader—or rather, a mentalist, who performed Kreskin-style magic acts for students and friends. He knew how easily audiences could be tricked, so he was a skeptic about parapsychology, or PSI. “Like most psychologists,” he says, sitting on an elderly couch in his townhouse two miles from Cornell’s campus in Ithaca, “I knew all the ways in which people could fool themselves and interpret coincidences as premonitions.” But reading the existing PSI research changed his opinion about how the brain works. Years ago, he says, “the model of the brain we had was more of a switchboard: stimulus in, response out. Now we have a richer metaphor for thinking about the brain: the computer.” His hands trace a flourish in the air, as if to say Presto! “Short-term and long-term memory have analogs in the computer. There’s stuff in RAM that’ll disappear when you turn the computer off, and there’s stuff you’ve saved to disk. The computer does an enormous amount of unconscious processing—that is, stuff that does not appear on the screen, if you think of the screen as the consciousness.”

Over seven years, Bem measured what he considers statistically significant results in eight of his nine studies. In the experiment I tried, the average hit rate among 100 Cornell undergraduates for erotic photos was 53.1 percent. (Neutral photos showed no effect.) That doesn’t seem like much, but as Bem points out, it’s about the same as the house’s advantage in roulette.
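It is worth pausing on why a 53.1 percent hit rate can register as "statistically significant" at all. The sketch below is a minimal back-of-envelope illustration in Python, not a reconstruction of Bem's analysis; the article does not give per-subject trial counts, so the totals here are hypothetical, and the point is only that the same small edge over chance looks more and more significant as the number of guesses grows.

```python
import math

def one_sided_p(hits: int, trials: int, chance: float = 0.5) -> float:
    """Upper-tail p-value for 'hit rate above chance', normal approximation."""
    p_hat = hits / trials
    se = math.sqrt(chance * (1 - chance) / trials)
    z = (p_hat - chance) / se
    return 0.5 * math.erfc(z / math.sqrt(2))  # standard-normal tail probability

# Hypothetical trial counts (NOT from Bem's paper): the same 53.1% hit
# rate is indistinguishable from coin-flipping at small n and clears
# the conventional p < .05 bar once thousands of guesses are pooled.
for trials in (100, 1000, 3600):
    hits = round(0.531 * trials)
    print(f"{trials:5d} trials, {hits:4d} hits: p = {one_sided_p(hits, trials):.4f}")
```

Run as written, this prints p of roughly .27, .025, and .0001 for the three hypothetical totals: a handful of trials proves nothing, while a tiny edge pooled over enough guesses looks impressive.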

Thinking counterintuitively about ESP appealed to him. “My career has been characterized,” he says, “by trying to solve conundrums where I just don’t believe the conventional explanation.” More than 40 years ago, Bem’s doctoral dissertation challenged the dominant paradigm of social psychology, Leon Festinger’s concept of cognitive dissonance. Bem’s groundbreaking “self-perception theory” suggests that rather than possessing an ironclad sense of self, we define our own emotions and attitudes using the same haphazard external cues (If I bite my nails, I must be nervous) that others use when observing us. “It’s a clever theory,” Bem says, “but what made me rich and famous is that I called the article ‘Self-Perception Theory: An Alternative to Cognitive Dissonance.’ ”

Still, precognition seems a little too counterintuitive—and easily counterargued. For example, wouldn’t I notice if I had ESP? Also, why do I always lose at roulette?

To science-writing eminence Douglas Hofstadter, the publication of work like Bem’s has the potential to unleash, and legitimize, other “crackpot ideas.” In the New York Times, the University of Oregon’s Ray Hyman used the words “an embarrassment for the entire field.” Some critics protest that the article can’t explain what mechanism might be behind precognition. (“We almost always have the phenomenon before we have the explanation,” Bem says.) Others just scoff: Why limit yourself to one kind of pseudoscience? As York University’s James Alcock points out in Skeptical Inquirer, that 53 percent might as well be proof of the power of prayer.

“It shouldn’t be difficult to do one proper experiment and not nine crappy experiments,” the University of Amsterdam’s Eric-Jan Wagenmakers tells me. He’s the co-author of the rebuttal that accompanies Bem’s article in JPSP. Wagenmakers uses Bayesian analysis—a statistical method meant to enforce the notion that extraordinary claims require extraordinary evidence—to argue that Bem’s results are indistinguishable from chance. In essence, he explains, 53 percent of a bunch of Cornell sophomores, in unmonitored experiments conducted by a pro-PSI professor, shouldn’t really move the needle, considering how deeply unlikely the existence of precognition actually is. The paper, says Wagenmakers, never should have made it through peer review, and the fact that it did is representative of a larger crisis in the field: The methods and statistics used in psychology, he writes, are “too weak, too malleable, and offer far too many opportunities for researchers to befuddle themselves and their peers.”
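Wagenmakers's published analysis is more elaborate, but a toy Bayes factor conveys the flavor of the objection. The sketch below, under the simplifying assumption of a uniform prior on the hit rate (my choice for illustration, not his), compares how well "pure chance" and "some anomalous effect" each predict the same hypothetical counts used in the previous sketch.

```python
import math

def log_beta(a: float, b: float) -> float:
    """Log of the Beta function B(a, b)."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def bf01(hits: int, trials: int) -> float:
    """Bayes factor BF01 for H0: p = 0.5 versus H1: p ~ Uniform(0, 1).

    BF01 = P(data | H0) / P(data | H1); values above 1 favor chance.
    The binomial coefficient appears in both marginal likelihoods and
    cancels, leaving only the 0.5^n term and the Beta integral.
    """
    log_bf = trials * math.log(0.5) - log_beta(hits + 1, trials - hits + 1)
    return math.exp(log_bf)

# Same hypothetical counts as before: a 53.1% hit rate throughout.
for trials in (100, 1000, 3600):
    hits = round(0.531 * trials)
    print(f"{trials:5d} trials, {hits:4d} hits: BF01 = {bf01(hits, trials):.2f}")
```

The middle case is the instructive one: at 1,000 hypothetical trials the result squeaks under p = .05, yet the Bayes factor still favors chance by almost 4 to 1. (The largest sample flips the other way, which is why the choice of prior, and of how to pool experiments, is roughly where the fight between Bem and his critics took place.) That gap between "significant" and "convincing" is what "extraordinary claims require extraordinary evidence" cashes out to.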

In a statement printed in the March issue, JPSP’s editors admit that they find Bem’s results “extremely puzzling.” Nevertheless, they write, “our obligation as journal editors is not to endorse particular hypotheses, but to advance and stimulate science through a rigorous review process.” One of the article’s four peer reviewers, Jonathan Schooler of UC Santa Barbara, says he approved the study for publication because, simply, “I truly believe that this kind of finding from a well-respected, careful researcher deserves public airing.” (Schooler is currently engaged in PSI research; the JPSP would not divulge the identities of any of the peer reviewers.) He agrees with Wagenmakers’s objections to a point, but protests that “if you hold the bar too high, you’ll never be able to get the data out there for scrutiny.”

And boy, has the data gotten out there; Bem even made a lively appearance on The Colbert Report. Which means that if his study fails to replicate and is discredited, it’ll be just another widely reported “breakthrough” that turned out to be wrong—like vaccines and autism, except this time it’s ESP. “When I look at the results in high-impact journals, I have to laugh, the ridiculous things that are in there. And it’s your fault as well,” Wagenmakers tells me. “The media presents these spectacular findings, like, if you eat a certain species of tomato, you’re 12 percent less likely to develop cancer. Well, how on earth could you design an experiment to prove that?”

Daryl Bem’s mother, Sylvia, was the bowling pioneer of Denver, running the local leagues when the game was still considered unsuitable for ladies. “She always had a gleam in her eye about the fact the neighbors disapproved,” Bem remembers. “Being out of step with the rest of the world just never bothered her any.”

Nor him. Bem dismisses detractors like Ray Hyman as “not worth listening to, because they haven’t come up with any alternative.” But he insists he takes serious critics—“who take the time to read the research thoroughly”—seriously. He praises Wagenmakers’s rebuttal, and with the help of two statisticians, he has written a rebuttal to the rebuttal, currently in peer review at JPSP. “I think I’ve pretty well covered my ass.” (I later send a copy to Wagenmakers; he comments, “There’s nothing new there. I’m not convinced at all.”)

Bem’s gone against the grain his whole life; sometimes, he’s been right. He was arrested at civil-rights sit-ins in Michigan in the sixties and testified with his wife before the FCC in the seventies to force AT&T to change its discriminatory hiring practices toward women. Daryl Bem, Ph.D., and Sandra Lipsitz Bem, Ph.D., were even interviewed in the first issue of Ms. magazine about their egalitarian, gender-liberated marriage.

The Drs. Bem lectured together for years, giving three-hour seminars to packed houses about a partnership in which housework was split evenly and careers were equally important. Though both are now professors emeriti at Cornell, they don’t share a home; neither divorced nor legally separated, they’ve been apart for eighteen years.

“I always loved living alone,” Bem says. “And then the other thing is, I identify as gay.” He tucks his white-socked feet under a couch cushion and remembers an early date with Sandra. “I said, ‘Well, I’m from Colorado, and I’m a stage magician, and I’m predominantly homoerotic.’ And she said, ‘I don’t think I’ve ever known anyone from Colorado before.’ ” He chuckles at his well-practiced joke.

Bem and his partner, Bruce Henderson, a professor of communication studies at Ithaca College, just celebrated their fifteenth anniversary. They’ve never lived together. “He’s a total slob,” Bem confides. “His place looks like somewhere you’d find a body amid all the junk and the cats.” (Insists Henderson: “I have what seems like 100 cats but is in reality only one very old one.”) Bem’s home, by contrast, is tidy. Decorative owl knickknacks perch attentively atop the window seat, flanking a doll of Edna Mode, the fashion designer from The Incredibles.

Despite “a certain irreverence toward the academy,” as Henderson puts it, Bem never senses any resentment from inside Cornell. “The faculty all have the property of being sufficiently arrogant,” Bem explains, “that it doesn’t trouble them to have a flake in their midst. They value the kind of nonconformity that leads you to new things. That’s why they’re here rather than at the University of Mexico, or whatever.”

Thomas Gilovich, Bem’s department head, agrees: “You have to be a very solid university to have the luxury of having someone like Daryl around.” Before retiring to emeritus status, Bem taught a variety of classes, including a seminar on the culture wars. “My purpose every week,” he says, “was to give them an aha! experience, where they say, ‘I’ve never thought of an issue in that way.’ ” He publishes less than other academics of his stature. “It’s not embarrassingly low, but in pure publication terms, you’d probably be surprised,” says Gilovich. But when he does publish, it’s a showstopper. As Gilovich puts it, “His impact factor is high.”

Still, Cornell’s grad students never assist with Bem’s ESP studies, at Bem’s insistence, to avoid possible career-hampering stigma. “I tell students, ‘Only undergraduates and tenured professors should study this stuff,’ ” he says. Bem got tenure “in 1968, 1969, or 1970, I’m not sure which,” well before he ever started studying PSI. Because he can’t get grants, he pays for his research himself.

Gilovich has known Bem for 33 years. (“Daryl and Sandy taught us how to play bridge.”) He has an unforced affection for his colleague, and he’s dubious of warnings that science might suffer if Bem’s research turns out to be bunk. “I feel like science is strong enough,” he says. “It’s a very corrective discipline. If an idea is boringly wrong, it’ll be forgotten. If it’s excitingly wrong, other people will do research and will find out.”

I ask him about Bem’s research plans for the spring semester: to recruit students in Gilovich’s Intro to Social Psych class and feed them answers after they’ve taken the multiple-choice quizzes. If Bem’s results are positive, would that be a violation of Cornell’s Code of Academic Integrity?

Gilovich laughs. As long as everyone in the class has the same opportunity, he says, it should be okay. “Look, there are a lot of skeptics who say, ‘Oh, the world’s interesting enough.’ Yeah, it is, but if there were other, you know, realms, dimensions, whatever, that we don’t know about—that would be even better.” He’s quiet for a moment. “It would be cool if it’s true. I’m just … I’d bet a lot of money it’s not.”

Before PSI, Bem made his biggest splash in the nonacademic world with a politically incorrect but weirdly compelling theory of sexual orientation. In 1996, he published “Exotic Becomes Erotic” in Psychological Review, arguing that neither gays nor straights are “born that way”—they’re born a certain way, and that’s what eventually determines their sexual preference.

“I think what the genes code for is not sexual orientation but rather a type of personality,” he explains. According to the EBE theory, if your genes make you a traditionally “male” little boy, a lover of sports and sticks, you’ll fit in with other boys, so what will be exotic to you—and, eventually, erotic—are females. On the other hand, if you’re sensitive, flamboyant, performative, you’ll be alienated from other boys, so you’ll gravitate sexually toward your exotic—males.

EBE is not exactly universally accepted. “The evidence is overwhelming that sexuality is constitutionally based,” Glenn Wilson, a professor at London’s Gresham College and the co-author of a book on the psychobiology of sexual orientation, tells me in an e-mail. “Bem’s theory has no merit. It does not specify why one individual would be affected by ‘alienation’ rather than another.”

Bem seems unconcerned. “Colleagues of mine, especially those in biological science, say, ‘Daryl, your theory is beautifully written and well argued and almost certainly wrong.’ Which is fine!” He laughs. He’s moved on in search of other magic tricks—more aha! moments to rile, and perhaps expand, the world of research psychology. “I’m perfectly happy to be wrong.”

How I snuffed that Tartar air! — how I spurned that turnpike earth! —

. . . all betokening that new cruises were on the start; that one most perilous and long voyage ended, only begins a second; and a second ended, only begins a third, and so on, for ever and for aye. Such is the endlessness, yea, the intolerableness of all earthly effort.

Gaining the more open water, the bracing breeze waxed fresh; the little Moss tossed the quick foam from her bows, as a young colt his snortings. How I snuffed that Tartar air! — how I spurned that turnpike earth! — that common highway all dented over with the marks of slavish heels and hoofs; and turned me to admire the magnanimity of the sea which will permit no records.

Moby Dick, “Wheelbarrow”

The things which will save you

A sufferer of depression should not feel shame or believe they are worthless. . . . Look deep into yourself for the qualities you need to survive. Your talents, hopes, dreams, and desires. Because these are the things which will save you.

— Darryl Cunningham, Psychiatric Tales

A knowing irony of the intellect and a lassitude of the heart

With arguments that have long been at the cutting edge of political philosophy, Cohen develops a distinctive conception of democracy as based upon reason-giving among equals, and he does so in a way that never ignores political reality. . . . All too often, Cohen observes, we respond to political ideals with ‘a knowing irony of the intellect and a lassitude of the heart.’ Realistic without cynicism, idealistic without naivete, Cohen’s book responds to the real world with rigorous intellectual aspiration and undaunted practical hope.

— Martha C. Nussbaum, reviewing Joshua Cohen’s Philosophy, Politics, Democracy: Selected Essays

Handing Out Knives to Madmen

By Josiah Ober
wsj.com

A review of:

The Hemlock Cup: Socrates, Athens and the Search for the Good Life
By Bettany Hughes

Two and a half millennia after an Athenian philosopher drank a poisoned cup of hemlock as punishment for crimes against the state, the ancient Greek world continues to captivate us. And rightly so: New scholarship continues to reveal just how remarkable it was. Most premodern states, like too many countries today, were dominated by a small elite of ultra-privileged insiders who monopolized public goods, skimming off whatever surplus was produced by populations living near bare subsistence and thus seizing supersize shares of stagnant economies. By contrast, the Greek world in the 500 years from Homer to Aristotle saw sustained economic growth and historically low levels of economic inequality. Recent studies suggest that, from 800 B.C. to 300 B.C., the population of the Greek city-states increased by a factor of 10, while per capita incomes roughly doubled. That growth rate may be sluggish compared to leading 21st-century economies, but it is amazing by the standards of premodernity. What can explain such economic growth?
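(As an editorial aside before the answer: those two figures are easy to put on an annual basis. The arithmetic below is mine, not the reviewer's; it simply compounds the stated totals over the 500-year span.)

```latex
% Implied compound annual growth rates, 800 B.C. to 300 B.C. (500 years):
g_{\text{population}} = 10^{1/500} - 1 \approx 0.46\% \text{ per year}, \qquad
g_{\text{income}} = 2^{1/500} - 1 \approx 0.14\% \text{ per year}.
```

Set against modern per capita growth of 1 to 3 percent a year, 0.14 percent looks glacial; set against the near-zero norm of premodern societies, it is indeed remarkable.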

Free Greeks (we must never forget that this was a slave society) invested their efforts in industry, commerce and politics because they did not fear that the fruits of their effort would be expropriated by the powerful. Further, though the roughly 1,000 city-states of classical Greece competed fiercely in war, they actively exchanged goods and ideas. Among the productive innovations that spread rapidly across the Greek world were coinage, codes of law, and deliberative councils. There was no imperial center but instead a historically distinctive approach to politics.

The typical city-state was republican rather than autocratic, and a substantial part of the adult male population enjoyed participation rights. In Athens, the epoch-making “People’s Revolution” in 508 B.C. resulted in democracy—the strongest form of republicanism the world had ever known—which meant universal native adult male franchise and a commitment to the principles of equal votes and freedom of speech and association. The Greek word demokratia asserted a fact and an aspiration: The people (demos) have the capacity (kratos) to make history.

Aristotle (384-322 B.C.) inhabited a world of dense, urban populations and vast trade in food and labor. By his time, perhaps half of all Greek city-states were democracies, and the philosopher opined that, “now that city-states are even larger, it is difficult for any non-democratic regime to arise.” Wealth was not concentrated at the top: Laborers (citizens, foreigners and slaves alike) commanded wages far above subsistence. Archaeological excavations show that even the houses of those in the lowest quartile of income were spacious and well built. In Athens, citizens promoted ever more open access through new forms of constitutional and commercial law. They supported centers of higher education that laid the foundations of Western thought: first Plato’s Academy and Isocrates’ school of rhetoric; then Aristotle’s Lyceum, Zeno’s Stoa and Epicurus’ Garden.

Socrates, the central figure of Bettany Hughes’s delightful, if occasionally exasperating, book, was born in 469 B.C.—a long generation after Athens’s democratic revolution—and died in 399 B.C., two generations before the Greek world reached its apex of population, economic success and democratization. Yet his Athens was already experiencing what can be considered a golden age of advancements in thought, art and politics. During Socrates’ lifetime, the city also built and lost an Aegean empire that, at its height, encompassed more than 150 communities: a quarter-million Athenian residents and perhaps three times as many of their fellow Greeks.

Ms. Hughes presents a high-octane account of Socrates and his age, based on ancient literary sources, current archaeology and her own fertile imagination. She offers vivid slices of imagined Athenian life, taking us behind the public realm dominated by citizen men and exploring the lives of women, slaves and foreign residents in Socrates’ city. Her biographical sketches of key figures—Pericles, the courtesan Aspasia, the military commander Alcibiades—are rich with lively detail. Here is Ms. Hughes on a religious sanctuary frequented by Athenian girls: “This sacred zone would have resembled an outpost of the rag-trade: when women died in childbirth their clothes were dedicated here; draped, hung and stored around the sanctuary; a limp gift to pitiless Artemis, to whom, probably just a few years from now, the girls would be calling out during the dreadful pangs of labor.” Ms. Hughes weaves a morality tale about the danger of mixing politics with empire, wealth and religion, which ends with the trial of Socrates and the eclipse (she claims) of Athens’ democratic golden age.

Though Ms. Hughes celebrates the Socratic ideal—the idea that a constantly examined life, devoted to seeking the good and true, is the only life worth living—she does not pretend to present new insights into Socratic thought. The payoff comes, instead, through re-situating the origins of moral philosophy in the context of a vibrant cultural and historical milieu. She invites the reader to travel as her companion beyond the familiar byways of Greek history, offering a full menu of “you were there” sounds, smells and textures: noisy Eleusinian cults, stinking excrement, “the dark, the whispers, the unseen skin pricks connecting flesh to flesh” at a symposium. All of this is great fun.

Yet readers must proceed with caution, for her conclusions are questionable. Athenian greatness, for Ms. Hughes, is coterminous with Socrates’ life. “Socrates’ lifespan marked the beginning and an end of an idea—the idealistic vision of an autonomous, tolerant, democratic Athenian city-state,” she writes. Aristotle, who was born 15 years after Socrates’ death and lived in a wealthy, densely populated Greek world in which democracy was increasingly prevalent, would be puzzled by Ms. Hughes’s conclusions. But precipitous rises and tragic falls tend to be attractive to popular historians.

Even more problematic is that Ms. Hughes fails to provide a convincing answer to the central question of her tale: Why was Socrates tried and condemned? Socrates was a public philosopher who could reliably be found conducting dialogues near the bankers’ tables in the Agora. But he was also known to contemporaries as a teacher of aristocrats—including Alcibiades and Critias, who became enemies of the Athenian democracy, the latter in a blood-drenched postwar coup d’état in 404-03 B.C.

Protected by an amnesty, Socrates could not be prosecuted for having taught the tyrant Critias to despise democracy. But the philosopher’s behavior following the democratic restoration was not similarly protected—and both before and after the coup his behavior included engaging others in public dialogues. Athenians had always held citizens legally responsible for the effects of their public speech. Socrates’ dialogues were intended to make his listeners behave differently, and he denied that good ideas ever produced bad behavior. But the monstrous acts of Critias (among others) seemed evidence to the contrary.

And so Socrates’ accusers were free to claim that, whether he realized it or not, he was a public danger, especially to impressionable youths. Since his public speech included sharp criticism of democracy, they could imply that Socrates, in effect, handed out knives to madmen, that when he denied dialogue could ever be dangerous Socrates was disingenuous or deluded, that his irresponsible speech was likely to have bad effects in the future—as it seemingly had in the past.

The charge brought against Socrates, by a team of voluntary prosecutors, was impiety—they accused him of impropriety in respect to the gods and corruption of the youth. I would suggest that Socrates was convicted not because the jurors were religious fanatics, not because they had lost their democratic tolerance, but because he seemed to them unreasonably unwilling to take responsibility for what he said in public. Today we allow pundits to say what they please, even if their speech has pernicious or even fatal effects. I think we are right to do so, and I think that 280 (out of 501) Athenian jurymen were wrong when they voted to condemn Socrates. But I think they were right to believe that when prominent public figures refuse to take personal responsibility for the consequences of their speech, democracy is in grave danger.

Beyond her failure to convincingly interpret Socrates’ trial, Ms. Hughes gets some facts wrong: The Athenian population did not increase five-fold during Socrates’ life; nor did Athens lose four-fifths of its population during the Peloponnesian War. Papyrus does not rot within weeks in the Greek climate. Socrates’ main weapon as a soldier in the Athenian army was not a broad-sword. Pericles was never elected “chief democrat.” The walls of Athens did not separate citizen-haves from non-citizen have-nots. And far from despising the disabled, Athens paid handicapped citizens a daily wage.

Ms. Hughes is also too uncritical of her sources: She offers late Roman accounts of the Athenian persecution of intellectuals, long ago rejected as tendentious fictions by serious scholars, as fact and even inflates them into a systematic regime of censorship, book-burning and execution. She imagines that “scores—perhaps hundreds” of intellectuals shared Socrates’ fate. If this were true, Socrates’ trial would be unsurprising, but Ms. Hughes’s claim is not supported by any credible evidence.

All the same, and despite these glitches, do read this book, both because of its marvelous storytelling and because it will stimulate a desire to learn more about the ancient world. Ms. Hughes’s work joins a growing shelf of books about antiquity that exemplify the honorable goal of responsible popularization. Readers attracted to the splendid range of classical texts excerpted by Ms. Hughes will want to read the full versions; almost all are available in modern translations. (Start with Plato, in the complete edition from Hackett, edited by John Cooper—superb translations, with helpful introductions by one of the world’s leading experts.) Socrates would be pleased if his story awakened a few modern Americans from moral slumber.


I took off the mask and looked in the mirror

I took off the mask and looked in the mirror.
I was the same child I was a year ago.
I hadn’t changed at all . . .

That’s the advantage of knowing how to remove your mask.
You’re still the child,
The past that lives on,
The child.

I took off my mask, and I put it back on.
It’s better this way.
This way I’m the mask.

And I return to normality as to a streetcar terminus.

11 August 1934

— Álvaro de Campos (Fernando Pessoa)

Bother Me, I’m Thinking

By Jonah Lehrer
wsj.com

We live in a time that worships attention. When we need to work, we force ourselves to focus, to stare straight ahead at the computer screen. There’s a Starbucks on seemingly every corner—caffeine makes it easier to concentrate—and when coffee isn’t enough, we chug Red Bull.

In fact, the ability to pay attention is considered such an essential life skill that the lack of it has become a widespread medical problem. Nearly 10% of American children are now diagnosed with attention-deficit hyperactivity disorder (ADHD).

In recent years, however, scientists have begun to outline the surprising benefits of not paying attention. Sometimes, too much focus can backfire; all that caffeine gets in the way. For instance, researchers have found a surprising link between daydreaming and creativity—people who daydream more are also better at generating new ideas. Other studies have found that employees are more productive when they’re allowed to engage in “Internet leisure browsing” and that people unable to concentrate due to severe brain damage actually score above average on various problem-solving tasks.

A new study led by researchers at the University of Memphis and the University of Michigan extends this theme. The scientists measured the success of 60 undergraduates in various fields, from the visual arts to science. They asked the students if they’d ever won a prize at a juried art show or been honored at a science fair. In every domain, students who had been diagnosed with attention-deficit disorder achieved more: Their inability to focus turned out to be a creative advantage.

And this lesson doesn’t just apply to people with a full-fledged disorder. A few years ago, scientists at the University of Toronto and Harvard gave a short mental test to 86 Harvard undergraduates. The test was designed to measure their ability to ignore irrelevant stimuli, such as the air-conditioner humming in the background or the conversation taking place nearby. This skill is typically seen as an essential component of productivity, since it keeps people from getting distracted by extraneous information.

Here’s where the data get interesting: Those undergrads who had a tougher time ignoring unrelated stuff were also seven times more likely to be rated as “eminent creative achievers” based on their previous accomplishments. (The association was particularly strong among distractible students with high IQs.)

According to the scientists, the inability to focus helps ensure a richer mixture of thoughts in consciousness. Because these people struggled to filter the world, they ended up letting everything in. They couldn’t help but be open-minded.

Such lapses in attention turn out to be a crucial creative skill. When we’re faced with a difficult problem, the most obvious solution—that first idea we focus on—is probably wrong. At such moments, it often helps to consider far-fetched possibilities, to approach the task from an unconventional perspective. And this is why distraction is helpful: People unable to focus are more likely to consider information that might seem irrelevant but will later inspire the breakthrough. When we don’t know where to look, we need to look everywhere.

This doesn’t mean, of course, that attention isn’t an important mental skill, or that attention-deficit disorders aren’t a serious problem. There’s clearly nothing advantageous about struggling in the classroom, or not being able to follow instructions. (It’s also worth pointing out that these studies all involve college students, which doesn’t tell us anything about those kids with ADHD who fail to graduate from high school. Distraction might be a cognitive luxury that not everyone can afford.)

Nevertheless, this new research demonstrates that, for a certain segment of the population, distractibility can actually be a net positive. Although we think that more attention can solve everything—that the best strategy is always a strict focus fueled by triple espressos—that’s not the case. Sometimes, the most productive thing we can do is surf the Web and eavesdrop on that conversation next door.

Where Have the Good Men Gone?

By Kay Hymowitz
wsj.com

Not so long ago, the average American man in his 20s had achieved most of the milestones of adulthood: a high-school diploma, financial independence, marriage and children. Today, most men in their 20s hang out in a novel sort of limbo, a hybrid state of semi-hormonal adolescence and responsible self-reliance. This “pre-adulthood” has much to recommend it, especially for the college-educated. But it’s time to state what has become obvious to legions of frustrated young women: It doesn’t bring out the best in men.

“We are sick of hooking up with guys,” writes the comedian Julie Klausner, author of a touchingly funny 2010 book, “I Don’t Care About Your Band: What I Learned from Indie Rockers, Trust Funders, Pornographers, Felons, Faux-Sensitive Hipsters and Other Guys I’ve Dated.” What Ms. Klausner means by “guys” is males who are not boys or men but something in between. “Guys talk about ‘Star Wars’ like it’s not a movie made for people half their age; a guy’s idea of a perfect night is a hang around the PlayStation with his bandmates, or a trip to Vegas with his college friends…. They are more like the kids we babysat than the dads who drove us home.” One female reviewer of Ms. Klausner’s book wrote, “I had to stop several times while reading and think: Wait, did I date this same guy?”

For most of us, the cultural habitat of pre-adulthood no longer seems noteworthy. After all, popular culture has been crowded with pre-adults for almost two decades. Hollywood started the affair in the early 1990s with movies like “Singles,” “Reality Bites,” “Single White Female” and “Swingers.” Television soon deepened the relationship, giving us the agreeable company of Monica, Joey, Rachel and Ross; Jerry, Elaine, George and Kramer; Carrie, Miranda, et al.

But for all its familiarity, pre-adulthood represents a momentous sociological development. It’s no exaggeration to say that having large numbers of single young men and women living independently, while also having enough disposable income to avoid ever messing up their kitchens, is something entirely new in human experience. Yes, at other points in Western history young people have waited well into their 20s to marry, and yes, office girls and bachelor lawyers have been working and finding amusement in cities for more than a century. But their numbers and their money supply were always relatively small. Today’s pre-adults are a different matter. They are a major demographic event.

What also makes pre-adulthood something new is its radical reversal of the sexual hierarchy. Among pre-adults, women are the first sex. They graduate from college in greater numbers (among Americans ages 25 to 34, 34% of women now have a bachelor’s degree but just 27% of men), and they have higher GPAs. As most professors tell it, they also have more confidence and drive. These strengths carry women through their 20s, when they are more likely than men to be in grad school and making strides in the workplace. In a number of cities, they are even out-earning their brothers and boyfriends.

Still, for these women, one key question won’t go away: Where have the good men gone? Their male peers often come across as aging frat boys, maladroit geeks or grubby slackers—a gender gap neatly crystallized by the director Judd Apatow in his hit 2007 movie “Knocked Up.” The story’s hero is 23-year-old Ben Stone (Seth Rogen), who has a drunken fling with Allison Scott (Katherine Heigl) and gets her pregnant. Ben lives in a Los Angeles crash pad with a group of grubby friends who spend their days playing videogames, smoking pot and unsuccessfully planning to launch a porn website. Allison, by contrast, is on her way up as a television reporter and lives in a neatly kept apartment with what appear to be clean sheets and towels. Once she decides to have the baby, she figures out what needs to be done and does it. Ben can only stumble his way toward being a responsible grownup.

So where did these pre-adults come from? You might assume that their appearance is a result of spoiled 24-year-olds trying to prolong the campus drinking and hook-up scene while exploiting the largesse of mom and dad. But the causes run deeper than that. Beginning in the 1980s, the economic advantage of higher education—the “college premium”—began to increase dramatically. Between 1960 and 2000, the percentage of younger adults enrolled in college or graduate school more than doubled. In the “knowledge economy,” good jobs go to those with degrees. And degrees take years.

Another factor in the lengthening of the road to adulthood is our increasingly labyrinthine labor market. The past decades’ economic expansion and the digital revolution have transformed the high-end labor market into a fierce competition for the most stimulating, creative and glamorous jobs. Fields that attract ambitious young men and women often require years of moving between school and internships, between internships and jobs, laterally and horizontally between jobs, and between cities in the U.S. and abroad. The knowledge economy gives the educated young an unprecedented opportunity to think about work in personal terms. They are looking not just for jobs but for “careers,” work in which they can exercise their talents and express their deepest passions. They expect their careers to give shape to their identity. For today’s pre-adults, “what you do” is almost synonymous with “who you are,” and starting a family is seldom part of the picture.

Pre-adulthood can be compared to adolescence, an idea invented in the mid-20th century as American teenagers were herded away from the fields and the workplace and into that new institution, the high school. For a long time, the poor and recent immigrants were not part of adolescent life; they went straight to work, since their families couldn’t afford the lost labor and income. But the country had grown rich enough to carve out space and time to create a more highly educated citizenry and work force. Teenagers quickly became a marketing and cultural phenomenon. They also earned their own psychological profile. One of the most influential of the psychologists of adolescence was Erik Erikson, who described the stage as a “moratorium,” a limbo between childhood and adulthood characterized by role confusion, emotional turmoil and identity conflict.

Like adolescents in the 20th century, today’s pre-adults have been wait-listed for adulthood. Marketers and culture creators help to promote pre-adulthood as a lifestyle. And like adolescence, pre-adulthood is a class-based social phenomenon, reserved for the relatively well-to-do. Those who don’t get a four-year college degree are not in a position to compete for the more satisfying jobs of the knowledge economy.

But pre-adults differ in one major respect from adolescents. They write their own biographies, and they do it from scratch. Sociologists use the term “life script” to describe a particular society’s ordering of life’s large events and stages. Though such scripts vary across cultures, the archetypal plot is deeply rooted in our biological nature. The invention of adolescence did not change the large Roman numerals of the American script. Adults continued to be those who took over the primary tasks of the economy and culture. For women, the central task usually involved the day-to-day rearing of the next generation; for men, it involved protecting and providing for their wives and children. If you followed the script, you became an adult, a temporary custodian of the social order until your own old age and demise.

Unlike adolescents, however, pre-adults don’t know what is supposed to come next. For them, marriage and parenthood come in many forms, or can be skipped altogether. In 1970, just 16% of Americans ages 25 to 29 had never been married; today that’s true of an astonishing 55% of the age group. In the U.S., the mean age at first marriage has been climbing toward 30 (a point past which it has already gone in much of Europe). It is no wonder that so many young Americans suffer through a “quarter-life crisis,” a period of depression and worry over their future.

Given the rigors of contemporary career-building, pre-adults who do marry and start families do so later than ever before in human history. Husbands, wives and children are a drag on the footloose life required for the early career track and identity search. Pre-adulthood has also confounded the primordial search for a mate. It has delayed a stable sense of identity, dramatically expanded the pool of possible spouses, mystified courtship routines and helped to throw into doubt the very meaning of marriage. In 1970, to cite just one of many numbers proving the point, nearly seven in 10 25-year-olds were married; by 2000, only one-third had reached that milestone.

American men have been struggling with finding an acceptable adult identity since at least the mid-19th century. We often hear about the miseries of women confined to the domestic sphere once men began to work in offices and factories away from home. But it seems that men didn’t much like the arrangement either. They balked at the stuffy propriety of the bourgeois parlor, as they did later at the banal activities of the suburban living room. They turned to hobbies and adventures, like hunting and fishing. At midcentury, fathers who at first had refused to put down the money to buy those newfangled televisions changed their minds when the networks began broadcasting boxing matches and baseball games. The arrival of Playboy in the 1950s seemed like the ultimate protest against male domestication; think of the refusal implied by the magazine’s title alone.

In his disregard for domestic life, the playboy was prologue for today’s pre-adult male. Unlike the playboy with his jazz and art-filled pad, however, our boy rebel is a creature of the animal house. In the 1990s, Maxim, the rude, lewd and hugely popular “lad” magazine, arrived from England. Its philosophy and tone were so juvenile, so entirely undomesticated, that it made Playboy look like Camus.

At the same time, young men were tuning in to cable channels like Comedy Central, the Cartoon Network and Spike, whose shows reflected the adolescent preferences of their target male audiences. They watched movies with overgrown boy actors like Steve Carell, Luke and Owen Wilson, Jim Carrey, Adam Sandler, Will Ferrell and Seth Rogen, cheering their awesome car crashes, fart jokes, breast and crotch shots, beer pong competitions and other frat-boy pranks. Americans had always struck foreigners as youthful, even childlike, in their energy and optimism. But this was too much.

What explains this puerile shallowness? I see it as an expression of our cultural uncertainty about the social role of men. It’s been an almost universal rule of civilization that girls became women simply by reaching physical maturity, but boys had to pass a test. They needed to demonstrate courage, physical prowess or mastery of the necessary skills. The goal was to prove their competence as protectors and providers. Today, however, with women moving ahead in our advanced economy, husbands and fathers are now optional, and the qualities of character men once needed to play their roles—fortitude, stoicism, courage, fidelity—are obsolete, even a little embarrassing.

Today’s pre-adult male is like an actor in a drama in which he only knows what he shouldn’t say. He has to compete in a fierce job market, but he can’t act too bossy or self-confident. He should be sensitive but not paternalistic, smart but not cocky. To deepen his predicament, because he is single, his advisers and confidants are generally undomesticated guys just like him.

Single men have never been civilization’s most responsible actors; they continue to be more troubled and less successful than men who deliberately choose to become husbands and fathers. So we can be disgusted if some of them continue to live in rooms decorated with “Star Wars” posters and crushed beer cans and to treat women like disposable estrogen toys, but we shouldn’t be surprised.

Relatively affluent, free of family responsibilities, and entertained by an array of media devoted to his every pleasure, the single young man can live in pig heaven—and often does. Women put up with him for a while, but then in fear and disgust either give up on any idea of a husband and kids or just go to a sperm bank and get the DNA without the troublesome man. But these rational choices on the part of women only serve to legitimize men’s attachment to the sandbox. Why should they grow up? No one needs them anyway. There’s nothing they have to do.

They might as well just have another beer.