It must ensue

Don’t aim at success — the more you aim at it and make it a target, the more you are going to miss it. For success, like happiness, cannot be pursued; it must ensue, and it only does so as the unintended side-effect of one’s dedication to a cause greater than oneself or as the by-product of one’s surrender to a person other than oneself. Happiness must happen, and the same holds for success: you have to let it happen by not caring about it. I want you to listen to what your conscience commands you to do and go on to carry it out to the best of your knowledge. Then you will live to see that in the long run — in the long run, I say! — success will follow you precisely because you had forgotten to think of it.

— Viktor E. Frankl, Man’s Search For Meaning
Preface to the 1992 edition

Why aren’t philosophy departments doing this?

To teach how to live without certainty, and yet without being paralyzed by hesitation, is perhaps the chief thing that philosophy, in our age, can still do for those who study it.

— Bertrand Russell, A History of Western Philosophy
(1945)

Upper Middle Brow: The culture of the creative class

By William Deresiewicz
theamericanscholar

“Masscult and Midcult,” Dwight Macdonald’s famous essay in cultural taxonomy, distinguished three levels in modern culture: High Culture, represented most recently by the modernist avant-garde but already moribund in Macdonald’s day; Mass Culture (“or Masscult, since it really isn’t culture at all”), also known as pop culture or kitsch (or, more recently, entertainment); and the insidious new form Macdonald labeled Midcult. Midcult is Masscult masquerading as art: slick and predictable but varnished with ersatz seriousness. For Macdonald, Midcult was Our Town, The Old Man and the Sea, South Pacific, Life magazine, the Book-of-the-Month Club: all of them marked by a high-minded sentimentality that congratulated the audience for its fine feelings.

“Masscult and Midcult” was published in 1960. In his introduction to a recent collection of Macdonald’s essays, Louis Menand wrote that the culture that was about to emerge in the ensuing decade, a hybrid of pop demotics and high-art sophistication—Dylan, the Beatles, Bonnie and Clyde, Andy Warhol, Portnoy’s Complaint—rendered Macdonald’s categories obsolete. Perhaps, but Masscult and Midcult are certainly still with us, even if there are other forms of culture, too. Masscult today is Justin Bieber, the Kardashians, Fifty Shades of Grey, George Lucas, and a million other things. Midcult, still peddling uplift in the guise of big ideas, is Tree of Life, Steven Spielberg, Jonathan Safran Foer, Middlesex, Freedom—the things that win the Oscars and the Pulitzer Prizes, just like in Macdonald’s day.

But now I wonder if there’s also something new. Not middlebrow, not highbrow (we still don’t have an avant-garde to speak of), but halfway in between. Call it upper middle brow. The new form is infinitely subtler than Midcult. It is post- rather than pre-ironic, its sentimentality hidden by a veil of cool. It is edgy, clever, knowing, stylish, and formally inventive. It is Jonathan Lethem, Wes Anderson, Lost in Translation, Girls, Stewart/Colbert, The New Yorker, This American Life and the whole empire of quirk, and the films that should have won the Oscars (the films you’re not sure whether to call films or movies).

The upper middle brow possesses excellence, intelligence, and integrity. It is genuinely good work (as well as being most of what I read or look at myself). The problem is it always lets us off the hook. Like Midcult, it is ultimately designed to flatter its audience, approving our feelings and reinforcing our prejudices. It stays within the bounds of what we already believe, affirms the enlightened opinions we absorb every day in the quality media, the educated bromides we trade on Facebook. It doesn’t tell us anything we don’t already know, doesn’t seek to disturb—the definition of a true avant-garde—our fundamental view of ourselves, or society, or the world. (Think, by contrast, of some truly disruptive works: The Wire, Blood Meridian, almost anything by J. M. Coetzee.)

There is a sociology to all of this. As Clement Greenberg pointed out in “Avant-Garde and Kitsch” (1939), the predecessor to Macdonald’s essay, high culture flourished under the aristocracy. Mass culture came in with mass literacy, while Midcult is a product of the postwar college boom, a way of catering to the cultural aspirations of the exploding middle class. Now, since the ’70s, we’ve gone a step further, into an era of mass elite and postgraduate education. This is the root of the so-called creative class, the Bobos, the liberal elite as it exists today. The upper middle brow is the cultural expression of this demographic. Its purpose is to make consciousness safe for the upper middle class. The salient characteristic of that class, as a moral entity, is a kind of Victorian engorgement with its own virtue. Its need is for an art that will disturb its self-delight.

Don’t go back to sleep

The breeze at dawn has secrets to tell you.
Don’t go back to sleep.

You must ask for what you really want.
Don’t go back to sleep.

People are going back and forth across the doorsill
Where the two worlds touch.

The door is round and open.
Don’t go back to sleep.

— Mewlana Jalaluddin Rumi
13th century

College and employability

The man who has gone through a college or university easily becomes psychically unemployable in manual occupations without necessarily acquiring employability in, say, professional work. His failure to do so may be due either to lack of natural ability—perfectly compatible with passing academic tests—or to inadequate teaching; and both cases will . . . occur more frequently as ever larger numbers are drafted into higher education and as the required amount of teaching increases irrespective of how many teachers and scholars nature chooses to turn out.

The results of neglecting this and of acting on the theory that schools, colleges and universities are just a matter of money, are too obvious to insist upon. Cases in which among a dozen applicants for a job, all formally qualified, there is not one who can fill it satisfactorily, are known to everyone who has anything to do with appointments . . .

All those who are unemployed or unsatisfactorily employed or unemployable drift into the vocations in which standards are least definite or in which aptitudes and acquirements of a different order count. They swell the host of intellectuals in the strict sense of the term whose numbers hence increase disproportionately. They enter it in a thoroughly discontented frame of mind. Discontent breeds resentment. And it often rationalizes itself into that social criticism which as we have seen before is in any case the intellectual spectator’s typical attitude toward men, classes and institutions especially in a rationalist and utilitarian civilization.

Well, here we have numbers; a well-defined group situation of proletarian hue; and a group interest shaping a group attitude that will much more realistically account for hostility to the capitalist order than could the theory—itself a rationalization in the psychological sense—according to which the intellectual’s righteous indignation about the wrongs of capitalism simply represents the logical inference from outrageous facts. . . . Moreover our theory also accounts for the fact that this hostility increases, instead of diminishing, with every achievement of capitalist evolution.

— Joseph Schumpeter
from “Capitalism, Socialism and Democracy”
1942
wsj

Solitude is enlightening but if it does not lead us back to society, it can become a spiritual dead end

By John Burnside
aeonmagazine

‘I have a great deal of company in my house; especially in the morning, when nobody calls.’ Henry David Thoreau’s remark about his experience of solitude expresses many of the common ideas we have about the work — and the apparent privileges — of being alone. As he put it so vividly in Walden (1854), his classic account of the time he spent alone in the Massachusetts woods, he went there to ‘live deep and suck out all the marrow of life’. Similarly, when I retreat into solitude, I hope to reconnect with a wider, more-than-human world and by so doing become more fully alive, recovering what the Gospel of Thomas called, ‘he who was, before he came into being’.

It has always been a key step on the ‘way’ or ‘path’ in Taoist philosophy (‘way’ being the literal translation of Tao) to go into the wilderness and lay oneself bare to whatever one finds there, whether that be the agonies of St Anthony, or the detachment of the Taoist masters. Alone in the wild, we shed the conventions that keep society ticking over — freedom from the clock, in particular, is a hugely important factor. We are opened up to other, less conventional, customs: in the wild, animals may talk to us, birds will sometimes guide us to water or light, the wind may become a second skin. In the wild, we may even find our true bodies, creaturely and vivid and indivisible from the rest of creation — but this comes only when we break free, not just from the constraints of clock and calendar and social convention, but also from the sometimes-clandestine hopes, expectations and fears with which we arrived.

For many of us, solitude is tempting because it is ‘the place of purification’, as the Israeli philosopher Martin Buber called it. Our aspiration for travelling to that place might be the simple pleasure of being away, unburdened by the pettiness and corruption of the day-to-day round. For me, being alone is about staying sane in a noisy and cluttered world – I have what the Canadian pianist Glenn Gould called a ‘high solitude quotient’ — but it is also a way of opening out a creative space, to give myself a chance to be quiet enough to see or hear what happens next.

There are those who are inclined to be purely temporary dwellers in the wilderness, who don’t stay long. As soon as they are renewed by a spell of lonely contemplation, they are eager to return to the everyday fray. Meanwhile, the committed wilderness dwellers are after something more. Yet, even if contemplative solitude gives them a glimpse of the sublime (or, if they are so disposed, the divine), questions arise immediately afterwards. What now? What is the purpose of this solitude? Whom does it serve?

To take oneself out into the wilderness as part of a spiritual quest is one thing, but to remain there in a kind of barren ecstasy is another. The Anglo-American mystic Thomas Merton argues that ‘there is no greater disaster in the spiritual life than to be immersed in unreality, for life is maintained and nourished in us by our vital relation with realities outside and above us. When our life feeds on unreality, it must starve.’ If practised as part of a living spiritual path, he says, and not simply as an escape from corruption or as an expression of misanthropy, ‘your solitude will bear immense fruit in the souls of men you will never see on earth’. It is a point Ralph Waldo Emerson, Thoreau’s friend and teacher, also makes. Solitude is essential to the spiritual path, he argues, but ‘we require such solitude as shall hold us to its revelations when we are in the streets and in palaces … it is not the circumstances of seeing more or fewer people but the readiness of sympathy that imports’.

Thoreau, however, felt keenly the corruption of a politically compromised, profit-oriented, slave-keeping society. His posthumously published work Cape Cod (1865) is, at least in part, an expression of dismay, even grief, in which he revealed his desire to turn his back on American society. Yet for much of his life, he kept Emerson’s principle close, as he remembers in Walden:

There too, as everywhere, I sometimes expected the Visitor who never comes. The Vishnu Purana says, ‘The house-holder is to remain at eventide in his courtyard as long as it takes to milk a cow, or longer if he pleases, to await the arrival of a guest.’ I often performed this duty of hospitality, waited long enough to milk a whole herd of cows, but did not see the man approaching from the town.

Perhaps the ‘Visitor who never comes’ is the man approaching from town – or perhaps it is some other, more mysterious – and perhaps less benevolent arrival. As Merton cautioned, the wilderness is a place of becoming lost, as much as found. ‘First, the desert is the country of madness. Second, it is the refuge of the devil, thrown out … to “wander in dry places”. Thirst drives men mad, and the devil himself is mad with a kind of thirst for his own lost excellence — lost because he has immured himself in it and closed out everything else.’

Karl Marx expresses this idea in another way. In his A Contribution to the Critique of Hegel’s Philosophy of Right (1844) he says, ‘what difference is there between the history of our freedom and the history of the boar’s freedom if it can be found only in the forests? … It is common knowledge that the forest echoes back what you shout into it.’ Marx saw religion — and by implication, the spiritual life in general — as ‘the opium of the people,’ but the important point is the need to be careful of the dangers of forest thinking. As in every fairy tale and medieval romance, the wilderness is peopled with dragons, but only some of them are native to the place. The rest are introduced by the solitary pilgrim himself, whose quest had seemed so pure and well-intentioned when he set out.

If solitude does not lead us back to society, it can become a spiritual dead end, an act of self-indulgence or escapism, as Merton, Emerson, Thoreau, and the Taoist masters all knew. We might admire the freedom of the wild boar, we might even envy it, but as long as others are enslaved, or hungry, or held captive by social conventions, it is our duty to return and do what we can for their liberation. For the old cliché is true: no matter what I do, I cannot be free while others are enslaved, I cannot be truly happy while others suffer. And, no matter how sublime or close to the divine my solitary hut in the wilderness might be, it is a sterile paradise of emptiness and rage unless I am prepared to return and participate actively in the social world. Thoreau, that icon of solitary contemplation, did eventually return to support the cause of abolition. In so doing, he laid down the principles of civil disobedience that would later inspire Gandhi, Martin Luther King and the freedom fighters of anti-imperialist movements throughout the world.

‘No man is an island, entire of itself,’ wrote John Donne, in a too-often quoted line, but the full impact comes in the continuation of his meditation, where he writes:

every man is a piece of the continent, a part of the main. If a clod be washed away by the sea, Europe is the less, as well as if a promontory were, as well as if a manor of thy friend’s or of thine own were: any man’s death diminishes me, because I am involved in mankind, and therefore never send to know for whom the bell tolls; it tolls for thee.

It is one of the great paradoxes of solitude, that it offers us not an escape, not a paradise, not a dwelling place where we can haughtily maintain our integrity by ignoring a vicious and corrupt social world, but a way back to that world, and a new motive for being there. Moreover, it can enliven a new sense of what companionship means — and, with it, a courtesy and hospitability that goes beyond anything good manners might decree. Because, no matter who I am, and no matter what I might or might not have achieved, my very life depends on being prepared, always, for the one visitor who never comes, but might arrive at any moment, from the woods or from the town.

Proust wasn’t a neuroscientist. Neither was Jonah Lehrer.

By Boris Kachka
nymag

“We are all bad apples,” wrote Jonah Lehrer, in probably the last back-cover endorsement of his career. “Dishonesty is everywhere … It’s an uncomfortable message, but the implications are huge.”

Lehrer’s blurb was for behavioral economist Dan Ariely’s The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves. Among Ariely’s bite-size lessons: We all cheat by a “fudge factor” of roughly 15 percent, regardless of how likely we are to get caught; a few of us advance gradually to bigger and bigger fudges, often driven by social pressures; and it’s only when our backs are up against the wall that we resort to brazen lies.

Lehrer, 31, had already established the kind of reputation that made his backing invaluable to a popular science writer. Thanks to three books, countless articles and blog posts, and many turns on the lecture circuit, Lehrer was perhaps the leading explainer of neuroscience this side of a Ph.D. He was kind enough to interview Ariely this past June for the Frontal Cortex, a blog Lehrer had started in 2006 and carried with him from one high-profile appointment to the next. The New Yorker had begun hosting it that month, after Lehrer was hired as a staff writer—another major career milestone. But newyorker.com didn’t run the Ariely story, because by the time he wrote it, Lehrer had already been banned from his own blog. Two weeks earlier, readers had discovered that he was rampantly “self-plagiarizing” his own blog posts among different media outlets. Lehrer held onto his three-day-old print contract, but the blog was on ice.

Then it got so much worse. Four excruciating months later, Jonah Lehrer is known as a fabricator, a plagiarist, a reckless recycler. He’s cut-and-pasted not just his own stories but at least one from another journalist; he’s invented or conflated quotes; and he’s reproduced big errors even after sources pointed them out. His publisher, Houghton Mifflin Harcourt, will soon conclude a fact-check of his three books, the last of which, Imagine, was recalled from bookstores—a great expense for a company that, like all publishing houses, can’t afford to fact-check most books in the first place. In the meantime, he’s been completely ostracized. It’s unclear if he’ll ever write for a living again.

If the public battering seems excessive now, four months in, that should come as no surprise. That’s how modern scandals go—burning bright, then burning out, leaving a vacuum that fills with sympathy. It’s especially true in cases like Lehrer’s, where the initial fury is narrowly professional, fueled by Schadenfreude and inside-baseball ethical disputes. It was fellow journalists who felled Lehrer, after all, not the sources he betrayed. But the funny thing is that while the sins they accused him of were relatively trivial, more interesting to his colleagues than his readers, Lehrer’s serious distortions—of science and art and basic human motivations—went largely unnoticed. In fact, by the time he was caught breaking the rules of journalism, Lehrer was barely beholden to the profession at all. He was scrambling up the slippery slope to the TED-talk elite: authors and scientists for whom the book or the experiment is just part of a multimedia branding strategy. He was on a conveyor belt of blog posts, features, lectures, and inspirational books, serving an entrepreneurial public hungry for futurist fables, easy fixes, and scientific marvels in a world that often feels tangled, stagnant, and frustratingly familiar. He was less interested in wisdom than in seeming convincingly wise.

The remarkable thing about that transformation is that it wasn’t all that unusual. In his short, spectacular career, Lehrer had two advocate-editors who quickly became his exilers. The first, Wired editor Chris Anderson, has himself been caught plagiarizing twice, the second time in an uncorrected proof. The often-absentee editor of a futurist magazine that may be the house journal of the lecture circuit, Anderson makes his living precisely as Lehrer did—snipping and tailoring anecdotal factoids into ready-to-wear tech-friendly conclusions.

The second, David Remnick, has invested resources in The New Yorker’s own highbrow talk series, The New Yorker Festival, in which staff writers function as boldfaced brand experts in everything from economics to medicine to creativity. The tone of those talks mixes the smooth technospeak of the Aspen Ideas Festival—co-hosted by rival magazine The Atlantic—with the campfire spirit of first-person storyteller confab the Moth. It was at the Moth that The New Yorker’s biggest brand, The Tipping Point author Malcolm Gladwell, got into hot water in 2005 by telling a story about games he played in the pages of the Washington Post that turned out to be almost entirely untrue. In print, Gladwell is often knocked for reducing social science to easy epiphanies and is occasionally called out for ignoring evidence that contradicts his cozy theories—most recently over a piece this past September on Jerry Sandusky. Yet he also serves as a pioneer in the industry of big-idea books—like those by his New Yorker colleague James Surowiecki, the “Freakonomics” guys, Dan Ariely, and others. Theirs is a mixed legacy, bringing new esoteric research to a lay audience but sacrificing a great deal of thorny complexity in the process.

In the world of magazines, of course, none of us is immune to slickness or oversimplification—New York included. But two things make Lehrer’s glibness especially problematic, and especially representative. First, conferences and corporate speaking gigs have helped replace the journalist-as-translator with the journalist-as-sage; in a magazine profile, the scientist stands out, but in a TED talk, the speaker does. And second, the scientific fields that are the most exciting to today’s writers—neuroscience, evolutionary biology, behavioral economics—are fashionable despite, or perhaps because of, their newness, which makes breakthrough findings both thrilling and unreliable. In these fields, in which shiny new insights so rarely pan out, every popularizer must be, almost by definition, a huckster. When science doesn’t give us the answers we want, we find someone who will.

“I’ve never really gotten over the sense of fraudulence that comes with being onstage,” Lehrer once said. Young and striving and insecure, he was both a product of this glib new world and a perpetrator of its swindles. He was also its first real victim.

Lehrer, who grew up in L.A. and attended prestigious North Hollywood High School, was always precocious. At 15, he won $1,000 in a contest run by Nasdaq with an essay calling the stock market “a crucial bond between plebeians and patricians.” Two years later, he and some other students made the finals of the countrywide Fed Challenge for a cogent argument against raising the national interest rate. “We felt we shouldn’t act on a guess or a premonition,” he told a newspaper. “We should act on the basis of statistics.”

At Columbia University, Lehrer majored in neuroscience, helped edit the literary Columbia Review, and spent a few years working in the lab of Eric Kandel. (Journalists and scientists often mistook this undergraduate experience for lab work that left Lehrer just shy of a Ph.D.) The Nobel Prize–winning neuroscientist, who was unlocking the secrets of our working memory, remembers his former lab assistant fondly. “He was the most gracious, decent, warm, nice kid to interact with,” says Kandel. “Cultured, fun to have a conversation with—and knew a great deal about food. I was surprised he didn’t go into science, because he had a real curiosity about it.”

Lehrer won a Rhodes Scholarship, then used some of his research at Oxford to write his first book, Proust Was a Neuroscientist. Published in late 2007, it was a grab bag of fun facts in the service of an earnest point: that great Modern artists anticipated the discoveries of brain science. It had a senior-thesis feel, down to an ambitious coda. Critic C. P. Snow had called, in 1959, for a “third culture” to bridge science and art—a prophecy that had been fulfilled, but to the advantage, Lehrer thought, of science. In Proust, Lehrer proposed a “fourth culture,” in which art would be a stronger “counterbalance to the glories and excesses of scientific reductionism.”

A year earlier, Lehrer had begun blogging on Frontal Cortex. After hundreds of posts, he began to find traction in magazines. Mark Horowitz, then an editor at Wired, brought him in to write a feature on a project to map all the genes in the brain. “It was a very complex piece,” says Horowitz, “with lots of reporting, lots of science. I thought that was a breakthrough for him.” It was, he adds, thoroughly fact-checked; none of Lehrer’s magazine stories have been found to have serious errors.

“If you asked him, ‘How many ideas do you have for an article?’ he had ten ideas, more than anyone else,” Horowitz says. “That’s why he was able to churn out so many blog posts.” They were long posts, too, the kind that quickly became the basis for print stories. In 2010, Frontal Cortex moved over to wired.com. “Chris [Anderson] loved Jonah Lehrer—loved him,” Horowitz says. “Any story idea he had, it was, ‘See if Jonah will do it.’ He was good, he was young, and he was getting better with every story.”

Lehrer’s tortuous fall began on what should have been a day of celebration. Monday, June 18, was his official start date as a New Yorker staff writer. That evening, an anonymous West Coast journalist wrote to media watchdog Jim Romenesko, noting that one of Lehrer’s five New Yorker blog posts—“Why Smart People Are Stupid”—had whole paragraphs copied nearly verbatim from Lehrer’s October 2011 column for The Wall Street Journal.

Within 24 hours, journalists found several more recycled posts, setting off a feeding frenzy one blogger called the “Google Game”—find a distinctive passage, Google it: pay dirt. On Wednesday, the irascible arts blogger Ed Champion unleashed an 8,000-word catalogue of previously published story material Lehrer had worked into Imagine. (Never mind that drawing on earlier stories for book projects is standard practice.) It was called “The Starr Report of the Lehrer Affair.” That day Lehrer told the New York Times that repurposing his own material “was a stupid thing to do and incredibly lazy and absolutely wrong.”

At The New Yorker, David Remnick initially saw the “self-plagiarism” pile-on as overkill. “There are all kinds of crimes and misdemeanors in this business,” The New Yorker editor said that Thursday, explaining his decision to retain Lehrer. “If he were making things up or appropriating other people’s work, that’s one level of crime.” A source says Remnick did consider firing Lehrer outright, but decided against it.

Ironically, it was another journalist’s sympathy for Lehrer that led to his complete unraveling. “The Schadenfreude with Lehrer was pretty aggressive,” says Michael Moynihan, a freelance writer who was then guest-blogging for the Washington Post. “I was going to write a bit about the mania for destroying journalists because they’re popular and have more money than you do.” Having never read Lehrer’s books, he dug into Imagine (which purports to explain the brain science of “how creativity works”), not even knowing that its first chapter focused on one of his favorite musicians, Bob Dylan. He found some suspiciously unfamiliar quotes. “Every Dylan quote, every citation, is online,” Moynihan says. A new quote is “like finding another version of the Bible.”

He e-mailed Lehrer, who claimed to be on vacation until just after Moynihan’s Post gig was up. But off the top of his head, Lehrer offered one source for a quote—a book on Marianne Faithfull. It was wildly out of context, but no matter: Where did the other six come from? When Moynihan reached him the following week, Lehrer expressed surprise that he still planned to run the piece. That was when, as Moynihan puts it, “the calls started.”

On the phone, Lehrer seemed charming and cooperative. He said he’d pulled some quotes from a Dylan radio program as well as unaired footage for the documentary No Direction Home. Dylan’s manager, Jeff Rosen, had given him the latter. Moynihan pressed him for more details over the next several days, but Lehrer stalled.

Finally, Moynihan was able to reach Rosen, who said he’d never heard from Lehrer. When Moynihan spoke to the author, while walking down Flatbush Avenue near his Brooklyn home, the conversation grew so heated that a passing acquaintance thought it was a marital spat. Lehrer finally came clean about making up his sources. He was impressed that Moynihan had figured out how to reach Rosen. “It shows,” he told Moynihan, “you’re a better journalist than I am.”

An editor Moynihan knew at the online magazine Tablet had happily accepted Moynihan’s exposé. The Sunday before it was published, July 29, Moynihan had to ignore Lehrer’s late-night calls just to write the piece.

That same evening of July 29, David Remnick was at his first Yankees game of the season. After getting an e-mail from Tablet’s editor, Alana Newhouse, he spent most of the game in the aisle, calling and e-mailing with Newhouse, his editors, and Lehrer. It was all, as Remnick said the next day, “a terrifically sad situation.”

The next morning, a desperate Lehrer finally managed to reach Moynihan. Didn’t he realize, Lehrer pleaded, that if Moynihan went forward, he would never write again—would end up nothing more than a schoolteacher? The story was published soon after. That afternoon, Lehrer announced through his publisher that he’d resigned from The New Yorker and would do everything he could to help correct the record. “The lies,” he said, “are over now.”

The ensuing flurry of tweets and columns was split between the Google Game fact-checkers and opiners like David Carr, who felt that Lehrer’s missteps were the result of “the Web’s ferocious appetite for content” and the collapse of hard news. All of them were grappling to name Lehrer’s pathology. What none of them really asked, and what Houghton Mifflin’s fact-check won’t answer, is what Imagine would look like if it really were scrubbed of every slippery shortcut and distortion. In truth, it might not exist at all. The fabricated quotes are not just slight aberrations; they’re more like the tells of a poker player who’s gotten away with bluffing for far too long.

In case after case, bad facts are made to serve forced conclusions. Take that Dylan chapter. First, of course, there are the quotes debunked by Moynihan. Then there are the obvious factual errors: Dylan did not immediately repair from his 1965 London tour to a cabin in Woodstock to write “Like a Rolling Stone” (he took a trip with his wife first and spent only a couple of days in that cabin), and did not coin the word juiced, as Lehrer claims; it had meant “drunk” for at least a decade. (These errors were discovered by Isaac Chotiner, weeks before Moynihan’s exposé, in The New Republic: “almost everything,” he wrote, “from the minor details to the larger argument—is inaccurate, misleading, or simplistic.”) Lehrer’s analysis of Dylan’s “Like a Rolling Stone” breakthrough is also wrong. It was hardly his first foray into elliptical songwriting, and it was hardly the first piece to defy the “two basic ways to write a song”—a dichotomy between doleful bluesy literalism and “Sugar pie, honeybunch” that no serious student of American pop music could possibly swallow.

Finally and fatally, what ties the narrative together is not some real insight into the nature of Dylan’s art, but a self-help lesson: Take a break to recharge. To anyone versed in Dylan, this story was almost unrecognizable. Lehrer’s intellectual chutzpah was startling: His conclusions didn’t shed new light on the facts; they distorted or invented facts, with the sole purpose of coating an unrelated and essentially useless lesson with the thinnest veneer of plausibility.

It’s the same way with the science that “proves” the lesson. Lehrer quotes one neuroscientist, Mark Beeman, as saying that “an insight is like finding a needle in a haystack”—presumably an insight like Dylan’s, though Beeman’s study hinges on puzzles. Beeman tells me, “That doesn’t sound like me,” because it’s absolutely the wrong analogy for how the brain works—“as if a thought is embedded in one connection.” In the next chapter, Lehrer links his tale of Dylan’s refreshed creativity to Marcus Raichle’s discoveries on productive daydreaming. But Raichle tells me those discoveries aren’t about daydreaming. Then why, I ask, would Lehrer draw that conclusion? “It sounds like he wanted to tell a story.”

Consider another tall tale, this one from Lehrer’s previous book, How We Decide. Discussing what happens when we choke under pressure, Lehrer invokes the famous case of Jean Van de Velde, a golfer who blew a three-stroke lead on the eighteenth hole of the final round of the 1999 British Open. In Lehrer’s telling, the pressure caused Van de Velde to choke: he fixated on mechanics, “swinging with the cautious deliberation of a beginner with a big handicap.”

Lehrer tees this up as a transition to a psychological study on overthinking. It fits perfectly into what one critic called “the story-study-lesson cycle” of this kind of book. And just like Dylan’s “insight,” it’s largely made up. Here too he flubs an important fact: Van de Velde didn’t lose outright; he tied, then lost the subsequent playoff. But then there is the larger deception. Most golf commentators thought at the time that he simply chose risky clubs—that he wasn’t handicapped by anxiety, but undone by cockiness. Van de Velde agreed; he played too aggressively. A month after the disaster, he said, “I could not live with myself knowing that I tried to play for safety and blew it.” Lehrer just rewrote the history to reach a conclusion flatly contradicted by the story of how Van de Velde actually decided.

Unlike the books, Lehrer’s New Yorker pieces were thoroughly fact-checked. But even there, his conclusions are facile. One popular story, published in 2010, is especially symptomatic of how he misrepresents science—and harms it in the process. Headlined “The Truth Wears Off,” it sets out to describe a curious phenomenon in scientific research: the alarmingly high number of study results that couldn’t be repeated in subsequent experiments. Researchers worry a lot about this tendency, sometimes called the “decline effect.” But they’ve settled on some hard, logical truths: Studies are incredibly difficult to design well; scientists are biased toward positive results; and the more surprising the finding, the more likely it is to be wrong. Good theories require good science, and science that can’t be replicated isn’t any good.

That wasn’t Lehrer’s approach. His story begins, instead, with the question, “Is there something wrong with the scientific method?” To answer that question definitively would require a very rigorous review of research practice—one that demonstrated persuasively that even the most airtight studies produced findings that couldn’t be replicated. Lehrer’s conclusion is considerably more mystical, offering bromides where analysis should be: “Just because an idea is true doesn’t mean it can be proved. And just because an idea can be proved doesn’t mean it’s true. When the experiments are done, we still have to choose what to believe.” It sounds an awful lot like the Zen-lite conclusion of Imagine: “Every creative story is different. And every creative story is the same. There was nothing. Now there is something. It’s almost like magic.”

By August 14, the storm seemed to be passing. That morning, Lehrer’s aunt had breakfast in California with an old friend. While fretting over her nephew, she mentioned innocently that Lehrer still had an outstanding contract with Wired. Her friend happened to be the mother of Jonah Peretti, a founder of the website BuzzFeed, who gladly published the “update.” Wired confirmed that Lehrer was still under contract, but said it wouldn’t publish anything more until a full vetting of his blog posts, already under way, was completed.

In fact, the real online vetting hadn’t even begun, and probably wouldn’t have happened if not for Lehrer’s chatty aunt. On the 16th, wired.com editor-in-chief Evan Hansen called Charles Seife, a science writer and journalism professor at NYU, to ask if he could investigate Lehrer’s hundreds of Frontal Cortex posts. It was too much work, so they settled on a mere eighteen, some of them known to have problems. Seife found a range of issues, from recycling—in most of the stories—to lazy copying from press releases, a couple of slightly fudged quotes, and three cases of outright plagiarism. (In fairness, there was only one truly egregious case of stealing.)

Accounts vary over whether Seife was expected to publish a Wired story about Lehrer, and whether his 90-minute conversation with Lehrer was on the record. Lehrer told a friend that Chris Anderson assured him there wouldn’t be a story—but then Hansen called him to ask if his remarks were on the record. Lehrer said they weren’t. Wired decided against running a full story, but allowed Seife to take it elsewhere.

Slate’s story went up on August 31, just after Wired began posting its first corrections to Lehrer’s blog posts. All Seife could say about his phone conversation with Lehrer was that it “made me suspect that Lehrer’s journalistic moral compass is badly broken.” An hour later, wired.com issued a full statement saying they had “no choice but to sever our relationship” with Lehrer.

If Anderson did indeed quietly defend Lehrer against Seife, it would fit the pattern: For all the hand-wringing about the decline of print-media standards, Lehrer was not a new-media wunderkind but an old-media darling. Just as newyorker.com had banned Lehrer before David Remnick canceled his print contract, it was wired.com that led the charge against Lehrer, and the print magazine that only fired him when it “had no choice”—after Seife published his exposé at another web magazine.

Lehrer’s biggest defenders today tend to be veterans of traditional journalism. NPR’s longtime correspondent Robert Krulwich has known Lehrer for almost a decade and used him many times on the science program “Radiolab.” “I find myself uncomfortable with how he’s been judged,” Krulwich wrote in an e-mail, weeks after “Radiolab” ran six corrections online. “If in a next round, he produces work that’s better, more careful, I hope his editors and his readers will welcome him back.” Malcolm Gladwell wrote me, “[Lehrer] didn’t twist anyone’s meaning or libel anyone or manufacture some malicious fiction … Surely only the most hardhearted person wouldn’t want to give him a chance to make things right.”

If anyone could have gotten a second chance, it was Lehrer. He’d made himself the perfect acolyte. Lehrer seemed to relish exciting ideas more than workaday craft, but editors are ravenous for ideas, and Lehrer had plenty. He fed pitches to deskbound editors and counted on them and their staffs to clean up the stories for publication. He was a fluid writer with an instinctive sense of narrative structure. In fact, he was much better at writing magazine stories than he was at blogging. His online posts were not only repetitive but too long and full of facts—true or not.

Seife spent a chunk of his time tracking down a change made to an E. O. Wilson quote in one of Lehrer’s New Yorker stories, only to find that a fact-checker had altered it at Wilson’s insistence. The piece’s editor told Seife that Lehrer was “a model of probity.” Meanwhile, wired.com—the very site that hired Seife—couldn’t vouch for any of the work Lehrer had published there. Lehrer told a friend that the first time he heard from Hansen in his two years at wired.com was during the vetting. The lack of oversight became distressingly clear when Seife, on the phone with Lehrer, demanded to know why he hadn’t asked his blog editor to fix his errors. Lehrer shot back in frustration that there was no editor.

Lehrer spent much of August writing about the affair, trying to figure out where it had all gone wrong. He came to the conclusion that he’d stretched himself too thin. His excuses fall along those lines: He told Seife that his plagiarized blog post was a rough draft he’d posted by mistake. And his latest explanation for those fabricated Dylan quotes is that he had written them into his book proposal and forgotten to fix them later. Even by his own account, then, the writing wasn’t his top priority.

The lectures, though, were increasingly important. Lehrer gave between 30 and 40 talks in 2010, all while meeting constant deadlines, starting a family, and buying a home in the Hollywood Hills. It was more than just a time suck; it was a new way of orienting his work. Lehrer was the first of the Millennials to follow his elders into the dubious promised land of the convention hall, where the book, blog, TED talk, and article are merely delivery systems for a core commodity, the Insight.

The Insight is less of an idea than a conceit, a bit of alchemy that transforms minor studies into news, data into magic. Once the Insight is in place—Blink, Nudge, Free, The World Is Flat—the data becomes scaffolding. It can go in the book, along with any caveats, but it’s secondary. The purpose is not to substantiate but to enchant.

The tradition of the author’s lecture tour goes back at least as far as Charles Dickens. But its latest incarnation began with Gladwell in 2000. The Tipping Point, his breakthrough best seller, didn’t sell itself. His publisher, Little, Brown, promoted the book by testing out its theory—that small ideas become blockbusters through social networks. Gladwell was sent across the country not just to promote his book but to lecture to booksellers about the secrets of viral marketing. Soon The New Yorker was dispatching him to speak before advertisers, charming them and implicitly promoting the magazine’s brand along with his own. Increasingly, he became a commodity in his own right, not just touring a book (which authors do for free) but giving “expert” presentations to professional groups who pay very well—usually five figures per talk.

Gladwell was quickly picked up by Bill Leigh, whose Leigh Bureau handles many of the journalist-lecturers of the aughts wave. Asked what bookers require from his journalist clients, Leigh simply says, “The takeaway. What they’re getting is that everyone hears the same thing in the same way.” The writers, in turn, get a paying focus group for their book-in-progress. Leigh remembers talking to his client, the writer Steven Johnson, about how to package his next project. “He wanted to take his book sales to the next level,” says Leigh. “Out of those conversations came his decision to slant his material with a particular innovation feel to it.” That book was titled Where Good Ideas Come From: The Natural History of Innovation. His new one is called Future Perfect.

One of the sharpest critiques of this new guard of nonspecialist Insight peddlers came from a surprising source, a veteran of the lecture circuit who decried “our thirst for nonthreatening answers.” “I’m essentially a technocrat, a knowledge worker,” says Eric Garland, who was a futurist long before that became a trendy descriptor. A past consultant to some of the Insight men’s favorite companies—3M, GM, AT&T—Garland is wistful for a time when speakers were genuine experts in sales, leadership, and cell phones. “Has Jonah Lehrer ever presented anything at a neuroscience conference?” he asks a touch dismissively.

Lehrer has not. But Gladwell actually did give a talk, in 2004, at an academic conference devoted to decision-making. “Some people were outraged by the simplification,” remembers one attendee, who likes Gladwell’s work. Someone stood up and asked if he should be more careful about citing sources.

In reply, Gladwell offered another anecdote. A while back, he’d found out that the playwright Bryony Lavery’s award-winning play, Frozen, cribbed quotes from one of his stories. Though he might have sued Lavery for plagiarism, Gladwell concluded that, no, the definition of plagiarism was far too broad. The important thing is not to pay homage to the source material but to make it new enough to warrant the theft. Lavery’s appropriation wasn’t plagiarism but a tribute. “I thought it was a terrible answer,” says the attendee. “If there was ever an answer that was about rationalization, this was it.”

The worst thing about Lehrer’s “decline effect” story is that the effect is real—science is indeed in trouble—and Lehrer is part of the problem. Last month, the Nobel laureate behavioral economist and psychologist Daniel Kahneman sent a mass e-mail to colleagues warning that revelations of shoddy research and even outright fraud had cast a shadow over the hot new subfield of “social priming” (which studies how perceptions are influenced by subtle cues and expectations). Others blamed the broader “summer of our discontent,” as one science writer called it, on a hunger for publicity that leads to shaved-off corners or worse.

“There’s a habit among science journalists to treat a single experiment as something that is newsworthy,” says the writer-psychologist Steven Pinker. “But a single study proves very little.” The lesson of the “decline effect,” as Pinker sees it, is not that science is fatally flawed, but that readers have been led to expect shocking discoveries from a discipline that depends on slow, stutter-step progress. Call it the “TED effect.” Science writer Carl Zimmer sees it especially in the work of Lehrer and Gladwell. “They find some research that seems to tell a compelling story and want to make that the lesson. But the fact is that science is usually a big old mess.”

Sadly, Lehrer knows exactly how big a mess it is, especially when it comes to neuroscience. One of his earliest blog posts, back in 2006, was titled “The Dirty Secrets of fMRI.” Its subject was the most appealing tool of brain science, functional magnetic-resonance imaging. Unlike its cousin the MRI, fMRI can take pictures of the brain at work, tracking oxygen flow to selected chunks while the patient performs assigned tasks. The most active sections are thus “lit up,” sometimes in dazzling colors, seeming to show clumps of neurons in mid-thought. But in that early blog post, Lehrer warned of the machine’s deceptive allure. “The important thing,” he concluded, “is to not confuse the map of a place for the place itself.”

Lehrer repeated this warning about the limitations of fMRI in later stories. And yet, both in How We Decide and Imagine, fMRI is Lehrer’s deus ex machina. No supermarket decision or sneaker logo or song lyric is conceived without “lighting up” a telltale region of the brain. Here comes the anterior cingulate cortex, and there goes the superior temporal sulcus, and now the amygdala has its say. It reads like a symphony—magical, authoritative, deeply true.

This contradiction was pointed out back in March in a critique of Imagine published at the literary site the Millions by Tim Requarth and Meehan Crist. “It’s baffling that in Imagine Lehrer makes statements so similar to ones he thoroughly discredits” elsewhere. Then they offer an analogy to explain what’s wrong with drawing vast conclusions from pretty fMRI pictures. “Brain regions, like houses, have many functions,” they write, and just because there are people at someone’s house doesn’t mean you know what they’re doing. “While you can conclude that a party means there will be people,” they write, “you cannot conclude that people means a party.”

Rebecca Goldin, a mathematician-writer who often criticizes “neurobabble,” points out that this is exactly what’s so enticing about this brand-new science: its mystery. Imagine that fMRI is a primitive telescope, and those clumps of neurons are like all the beautiful stars you can finally see up close, but “may in fact be in different galaxies.” You still can’t discern precisely how they’re interacting. Journalist David Dobbs recently asked a table full of neuroscientists: “Of what we need to know to fully understand the brain, what percentage do we know now?” They all gave figures in the single digits. Imagine makes it look like we’re halfway there.

If Lehrer was misusing science, why didn’t more scientists speak up? When I reached out to them, a couple did complain to me, but many responded with shrugs. They didn’t expect anything better. Mark Beeman, who questioned that “needle in the haystack” quote, was fairly typical: Lehrer’s simplifications were “nothing that hasn’t happened to me in many other newspaper stories.”

Even scientists who’ve learned to write for a broad audience can be fatalistic about the endeavor. Kahneman had a surprise best seller in 2011, Thinking, Fast and Slow. His writing is dense and subtle, as complicated as pop science gets. But as he once told Dan Ariely, his former acolyte, “There’s no way to write a science book well. If you write it for a general audience and you are successful, your academic colleagues will hate you, and if you write it for academics, nobody would want to read it.”

For a long time, Lehrer avoided the dilemma by assuming it didn’t apply to him, writing not for the scientists (who shrugged off his oversimplifications) or for the editors (who fixed his most obvious errors) but for a large and hungry audience of readers. We only wanted one thing from Jonah Lehrer: a story. He told it so well that we forgave him almost everything.

The Perks of Being a Bully

[R]esearch cited by Anthony Volk, Joseph Camilleri, Andrew Dane and Zopito Marini in a 2012 Aggressive Behavior journal article indicates that bullying-induced social dominance is correlated with reduced stress and improved physical health. Amazingly, “bullying is also positively linked with other positive mental traits such as … cognitive empathy, leadership, social competence, and self-efficacy.”

Bullies are not people who lack insight into human nature, in other words: They often have more insight into our true nature, at least in its Hobbesian, adolescent form, than the people around them.

People can be bullied in many ways, of course. But, as evolutionary psychology would indicate, bullying patterns among adolescents closely reflect the traits associated with mate selection. Girls bully one another by impugning their attractiveness and their sexual fidelity — the two traits most sought after by males. Boy bullies, on the other hand, impugn the manliness, strength and wealth of rivals, in a similar play to establish dominance with regard to traits deemed desirable among girls.

The strategy works: Studies show that boys who bully other boys, on average, gain status with girls, who perceive the boys as more dominant. And girls who bully other girls, on average, receive more positive attention from boys.

As the aforementioned authors report, “Dominance has been found to be positively associated with both bullying and peer nominations of dating popularity among adolescents. Bullying is also positively correlated with peer nominations of power, social prominence, student and teacher ratings of perceived popularity and peer leadership” — all of which translate to social capital, which in turn means social or mating opportunities with the opposite sex.

Jonathan Kay

The last thing I’ll read in the book I’ll never finish

“Don’t wear yourself out proving something with these giant broads. Remember, your great love was for me, just five feet tall.”

— Saul Bellow, Humboldt’s Gift
p. 308

Everything becomes a habit

When the critics, much more kindly disposed toward him now, asked what he was doing in Argentina in 1974, Amalfitano looked at them and then at his margarita and said, as if he had repeated it many times, that in 1974 he was in Argentina because of the coup in Chile, which had obliged him to choose the path of exile. And then he apologized for expressing himself so grandiloquently. Everything becomes a habit, he said, but none of the critics paid much attention to this last remark.

“Exile must be a terrible thing,” said Norton sympathetically.

“Actually,” said Amalfitano, “now I see it as a natural movement, something that, in its way, helps to abolish fate, or what is generally known as fate.”

“But exile,” said Pelletier, “is full of inconveniences, of skips and breaks that essentially keep recurring and interfere with anything you try to do that’s important.”

“That’s just what I mean by abolishing fate,” said Amalfitano. “But again, I beg your pardon.”

— Roberto Bolaño, 2666

And on the seventh day

Thus the heavens and the earth were finished. And on the seventh day God rested from all the work that he had done.

— Genesis 2:1-3
thebricktestament

When the mind swings by a grass-blade

When the mind swings by a grass-blade

An ant’s forefoot shall save you
The clover leaf smells and tastes as its flower.

— Ezra Pound, Cantos

Books ain’t no good

S’pose you didn’t have nobody. S’pose you couldn’t go into the bunk house and play rummy ’cause you was black. How’d you like that? S’pose you had to sit out here an’ read books. Sure you could play horseshoes till it got dark, but then you got to read books. Books ain’t no good. A guy needs somebody–to be near him. A guy goes nuts if he ain’t got nobody. Don’t make no difference who the guy is, long’s he’s with you. I tell ya, I tell ya a guy gets too lonely an’ he gets sick.

— John Steinbeck, Of Mice and Men