Perhaps a marvelous illusion

I couldn’t tell them what I thought was really wrong with me: namely that I suffered from an illusion, perhaps a marvelous illusion, or perhaps only a lazy one, that by a kind of inspired levitation I could rise and dart straight to the truth. Straight to the truth. For I was too haughty to bother with Marxism, Freudianism, Modernism, the avant-garde, or any of these things that Humboldt, as a culture-Jew, took so much stock in.

— Saul Bellow, Humboldt’s Gift

Is the Internet Making Us Crazy? Yes!

By Tony Dokoupil
newsweek

Before he launched the most viral video in Internet history, Jason Russell was a half-hearted Web presence. His YouTube account was dead, and his Facebook and Twitter pages were a trickle of kid pictures and home-garden updates. The Web wasn’t made “to keep track of how much people like us,” he thought, and when his own tech habits made him feel like “a genius, an addict, or a megalomaniac,” he unplugged for days, believing, as the humorist Andy Borowitz put it in a tweet that Russell tagged as a favorite, “it’s important to turn off our computers and do things in the real world.”

But this past March Russell struggled to turn off anything. He forwarded a link to “Kony 2012,” his deeply personal Web documentary about the African warlord Joseph Kony. The idea was to use social media to make Kony famous as the first step to stopping his crimes. And it seemed to work: the film hurtled through cyberspace, clocking more than 70 million views in less than a week. But something happened to Russell in the process. The same digital tools that supported his mission seemed to tear at his psyche, exposing him to nonstop kudos and criticisms, and ending his arm’s-length relationship with new media.

He slept two hours in the first four days, producing a swirl of bizarre Twitter updates. He sent a link to “I Met the Walrus,” a short animated interview with John Lennon, urging followers to “start training your mind.” He sent a picture of his tattoo, TIMSHEL, a biblical word about man’s choice between good and evil. At one point he uploaded and commented on a digital photo of a text message from his mother. At another he compared his life to the mind-bending movie Inception, “a dream inside a dream.”

On the eighth day of his strange, 21st-century vortex, he sent a final tweet—a quote from Martin Luther King Jr.: “If you can’t fly, then run, if you can’t run, then walk, if you can’t walk, then crawl, but whatever you do, you have to keep moving forward”—and walked back into the real world. He took off his clothes and went to the corner of a busy intersection near his home in San Diego, where he repeatedly slapped the concrete with both palms and ranted about the devil. This too became a viral video.

Afterward Russell was diagnosed with “reactive psychosis,” a form of temporary insanity. It had nothing to do with drugs or alcohol, his wife, Danica, stressed in a blog post, and everything to do with the machine that kept Russell connected even as he was breaking apart. “Though new to us,” Danica continued, “doctors say this is a common experience,” given Russell’s “sudden transition from relative anonymity to worldwide attention—both raves and ridicules.” More than four months later, Jason is out of the hospital, his company says, but he is still in recovery. His wife took a “month of silence” on Twitter. Jason’s social-media accounts remain dark.

Questions about the Internet’s deleterious effects on the mind are at least as old as hyperlinks. But even among Web skeptics, the idea that a new technology might influence how we think and feel—let alone contribute to a great American crack-up—was considered silly and naive, like waving a cane at electric light or blaming the television for kids these days. Instead, the Internet was seen as just another medium, a delivery system, not a diabolical machine. It made people happier and more productive. And where was the proof otherwise?

Now, however, the proof is starting to pile up. The first good, peer-reviewed research is emerging, and the picture is much gloomier than the trumpet blasts of Web utopians have allowed. The current incarnation of the Internet—portable, social, accelerated, and all-pervasive—may be making us not just dumber or lonelier but more depressed and anxious, prone to obsessive-compulsive and attention-deficit disorders, even outright psychotic. Our digitized minds can scan like those of drug addicts, and normal people are breaking down in sad and seemingly new ways.

In the summer of 1996, seven young researchers at MIT blurred the lines between man and computer, living simultaneously in the physical and virtual worlds. They carried keyboards in their pockets, radio-transmitters in their backpacks, and a clip-on screen in front of their eyes. They called themselves “cyborgs”—and they were freaks. But as Sherry Turkle, a psychologist at MIT, points out, “we are all cyborgs now.” This life of continuous connection has come to seem normal, but that’s not the same as saying that it’s healthy or sustainable, as technology—to paraphrase the old line about alcohol—becomes the cause of and solution to all of life’s problems.

In less than the span of a single childhood, Americans have merged with their machines, staring at a screen for at least eight hours a day, more time than we spend on any other activity including sleeping. Teens fit some seven hours of screen time into the average school day; 11, if you count time spent multitasking on several devices. When President Obama last ran for office, the iPhone had yet to be launched. Now smartphones outnumber the old models in America, and more than a third of users get online before getting out of bed.

Meanwhile, texting has become like blinking: the average person, regardless of age, sends or receives about 400 texts a month, four times the 2007 number. The average teen processes an astounding 3,700 texts a month, double the 2007 figure. And more than two thirds of these normal, everyday cyborgs, myself included, report feeling their phone vibrate when in fact nothing is happening. Researchers call it “phantom-vibration syndrome.”

Altogether the digital shifts of the last five years call to mind a horse that has sprinted out from underneath its rider, dragging the person who once held the reins. No one is arguing for some kind of Amish future. But the research is now making it clear that the Internet is not “just” another delivery system. It is creating a whole new mental environment, a digital state of nature where the human mind becomes a spinning instrument panel, and few people will survive unscathed.

“This is an issue as important and unprecedented as climate change,” says Susan Greenfield, a pharmacology professor at Oxford University who is working on a book about how digital culture is rewiring us—and not for the better. “We could create the most wonderful world for our kids but that’s not going to happen if we’re in denial and people sleepwalk into these technologies and end up glassy-eyed zombies.”

Does the Internet make us crazy? Not the technology itself or the content, no. But a Newsweek review of findings from more than a dozen countries finds the answers pointing in a similar direction. Peter Whybrow, the director of the Semel Institute for Neuroscience and Human Behavior at UCLA, argues that “the computer is like electronic cocaine,” fueling cycles of mania followed by depressive stretches. The Internet “leads to behavior that people are conscious is not in their best interest and does leave them anxious and does make them act compulsively,” says Nicholas Carr, whose book The Shallows, about the Web’s effect on cognition, was nominated for a Pulitzer Prize. It “fosters our obsessions, dependence, and stress reactions,” adds Larry Rosen, a California psychologist who has researched the Net’s effect for decades. It “encourages—and even promotes—insanity.”

Fear that the Internet and mobile technology contribute to addiction—not to mention the often related ADHD and OCD—has persisted for decades, but for most of that time the naysayers prevailed, often puckishly. “What’s next? Microwave abuse and Chapstick addiction?” wrote a peer reviewer for one of the leading psychiatric journals, rejecting a national study of problematic Internet use in 2006. The Diagnostic and Statistical Manual of Mental Disorders has never included a category of machine-human interactions.

But that view is suddenly on the outs. When the new DSM is released next year, Internet Addiction Disorder will be included for the first time, albeit in an appendix tagged for “further study.” China, Taiwan, and Korea recently accepted the diagnosis, and began treating problematic Web use as a grave national health crisis. In those countries, where tens of millions of people (and as much as 30 percent of teens) are considered Internet-addicted, mostly to gaming, virtual reality, and social media, the story is sensational front-page news. One young couple neglected its infant to death while nourishing a virtual baby online. A young man fatally bludgeoned his mother for suggesting he log off (and then used her credit card to rack up more hours). At least 10 ultra-Web users, serviced by one-click noodle delivery, have died of blood clots from sitting too long.

Now the Korean government is funding treatment centers, and coordinating a late-night Web shutdown for young people. China, meanwhile, has launched a mothers’ crusade for safe Web habits, turning to that approach after it emerged that some doctors were using electro-shock and severe beatings to treat Internet-addicted teens.

“There’s just something about the medium that’s addictive,” says Elias Aboujaoude, a psychiatrist at Stanford University School of Medicine, where he directs the Obsessive Compulsive Disorder Clinic and Impulse Control Disorders Clinic. “I’ve seen plenty of patients who have no history of addictive behavior—or substance abuse of any kind—become addicted via the Internet and these other technologies.”

His 2006 study of problematic Web habits (the one that was puckishly rejected) was later published, forming the basis for his recent book Virtually You, about the fallout expected from the Web’s irresistible allure. Even among a demographic of middle-aged landline users—the average respondent was in his 40s, white, and making more than $50,000 a year—Aboujaoude found that more than one in eight showed at least one sign of an unhealthy attachment to the Net. More recent surveys that recruit people already online have found American numbers on a par with those in Asia.

Then there was the University of Maryland’s 2010 “Unplugged” experiment that asked 200 undergrads to forgo all Web and mobile technologies for a day and to keep a diary of their feelings. “I clearly am addicted and the dependency is sickening,” reported one student in the study. “Media is my drug,” wrote another. At least two other schools haven’t even been able to get such an experiment off the ground for lack of participants. “Most college students are not just unwilling, but functionally unable, to be without their media links to the world,” the University of Maryland concluded.

That same year two psychiatrists in Taiwan made headlines with the idea of iPhone addiction disorder. They documented two cases from their own practices: one involved a high-school boy who ended up in an asylum after his iPhone usage reached 24 hours a day. The other featured a 31-year-old saleswoman who used her phone while driving. Both cases might have been laughed off if not for a 200-person Stanford study of iPhone habits released at the same time. It found that one in 10 users feels “fully addicted” to his or her phone. All but 6 percent of the sample admitted some level of compulsion, while 3 percent won’t let anyone else touch their phones.

In the two years since, concern over the Web’s pathological stickiness has only intensified. In April, doctors told The Times of India about an anecdotal uptick in “Facebook addiction.” The latest details of America’s Web obsession are found in Larry Rosen’s new book, iDisorder, which, despite the hucksterish title, comes with the imprimatur of the world’s largest academic publisher. His team surveyed 750 people, a spread of teens and adults who represented the Southern California census, detailing their tech habits, their feelings about those habits, and their scores on a series of standard tests of psychiatric disorders. He found that most respondents, with the exception of those over the age of 50, check text messages, email or their social network “all the time” or “every 15 minutes.” More worryingly, he also found that those who spent more time online had more “compulsive personality traits.”

Perhaps not that surprising: those who want the most time online feel compelled to get it. But in fact these users don’t exactly want to be so connected. It’s not quite free choice that drives most young corporate employees (45 and under) to keep their BlackBerrys in the bedroom within arm’s reach, per a 2011 study; or free choice, per another 2011 study, that makes 80 percent of vacationers bring along laptops or smartphones so they can check in with work while away; or free choice that leads smartphone users to check their phones before bed, in the middle of the night, if they stir, and within minutes of waking up.

We may appear to be choosing to use this technology, but in fact we are being dragged to it by the potential of short-term rewards. Every ping could be a social, sexual, or professional opportunity, and we get a mini-reward, a squirt of dopamine, for answering the bell. “These rewards serve as jolts of energy that recharge the compulsion engine, much like the frisson a gambler receives as a new card hits the table,” MIT media scholar Judith Donath recently told Scientific American. “Cumulatively, the effect is potent and hard to resist.”

Recently it became possible to watch this kind of Web use rewire the brain. In 2008 Gary Small, the head of UCLA’s Memory and Aging Research Center, was the first to document changes in the brain as a result of even moderate Internet use. He rounded up 24 people, half of them experienced Web users, half of them newbies, and he passed them each through a brain scanner. The difference was striking, with the Web users displaying fundamentally altered prefrontal cortexes. But the real surprise was what happened next. The novices went away for a week, and were asked to spend a total of five hours online and then return for another scan. “The naive subjects had already rewired their brains,” he later wrote, musing darkly about what might happen when we spend more time online.

The brains of Internet addicts, it turns out, look like the brains of drug and alcohol addicts. In a study published in January, Chinese researchers found “abnormal white matter”—essentially extra nerve cells built for speed—in the areas charged with attention, control, and executive function. A parallel study found similar changes in the brains of videogame addicts. And both studies come on the heels of other Chinese results that link Internet addiction to “structural abnormalities in gray matter,” namely shrinkage of 10 to 20 percent in the area of the brain responsible for processing of speech, memory, motor control, emotion, sensory, and other information. And worse, the shrinkage never stopped: the more time online, the more the brain showed signs of “atrophy.”

While brain scans don’t reveal which came first, the abuse or the brain changes, many clinicians feel their own observations confirmed. “There’s little doubt we’re becoming more impulsive,” says Stanford’s Aboujaoude, and one reason for this is technology use. He points to the rise in OCD and ADHD diagnoses, the latter of which has risen 66 percent in the last decade. “There is a cause and effect.”

And don’t kid yourself: the gap between an “Internet addict” and John Q. Public is thin to nonexistent. One of the early flags for addiction was spending more than 38 hours a week online. By that definition, we are all addicts now, many of us by Wednesday afternoon, Tuesday if it’s a busy week. Current tests for Internet addiction are qualitative, casting an uncomfortably wide net, including people who admit that yes, they are restless, secretive, or preoccupied with the Web and that they have repeatedly made unsuccessful efforts to cut back. But if this is unhealthy, it’s clear many Americans don’t want to be well.

Like addiction, the digital connection to depression and anxiety was also once a near laughable assertion. A 1998 Carnegie Mellon study found that Web use over a two-year period was linked to blue moods, loneliness, and the loss of real-world friends. But the subjects all lived in Pittsburgh, critics sneered. Besides, the Net might not bring you chicken soup, but it means the end of solitude, a global village of friends, and friends you haven’t met yet. Sure enough, when Carnegie Mellon checked back in with the denizens of Steel City a few years later, they were happier than ever.

But the black crow is back on the wire. In the past five years, numerous studies have duplicated the original Carnegie Mellon findings and extended them, showing that the more a person hangs out in the global village, the worse they are likely to feel. Web use often displaces sleep, exercise, and face-to-face exchanges, all of which can upset even the chirpiest soul. But the digital impact may last not only for a day or a week, but for years down the line. A recent American study based on data from adolescent Web use in the 1990s found a connection between time online and mood disorders in young adulthood. Chinese researchers have similarly found “a direct effect” between heavy Net use and the development of full-blown depression, while scholars at Case Western Reserve University correlated heavy texting and social-media use with stress, depression, and suicidal thinking.

In response to this work, an article in the journal Pediatrics noted the rise of “a new phenomenon called ‘Facebook depression,’” and explained that “the intensity of the online world may trigger depression.” Doctors, according to the report published by the American Academy of Pediatrics, should work digital usage questions into every annual checkup.

Rosen, the author of iDisorder, points to a preponderance of research showing “a link between Internet use, instant messaging, emailing, chatting, and depression among adolescents,” as well as to the “strong relationships between video gaming and depression.” But the problem seems to be quality as well as quantity: bad interpersonal experiences—so common online—can lead to these potential spirals of despair. For her book Alone Together, MIT psychologist Sherry Turkle interviewed more than 450 people, most of them in their teens and 20s, about their lives online. And while she’s the author of two prior tech-positive books, and once graced the cover of Wired magazine, she now reveals a sad, stressed-out world of people coated in Dorito dust and locked in a dystopian relationship with their machines.

People tell her that their phones and laptops are the “place for hope” in their lives, the “place where sweetness comes from.” Children describe mothers and fathers unavailable in profound ways, present and yet not there at all. “Mothers are now breastfeeding and bottle-feeding their babies as they text,” she told the American Psychological Association last summer. “A mother made tense by text messages is going to be experienced as tense by the child. And that child is vulnerable to interpreting that tension as coming from within the relationship with the mother. This is something that needs to be watched very closely.” She added, “Technology can make us forget important things we know about life.”

This evaporation of the genuine self also occurred among the high-school- and college-age kids she interviewed. They were struggling with digital identities at an age when actual identity is in flux. “What I learned in high school,” a kid named Stan told Turkle, “was profiles, profiles, profiles; how to make a me.” It’s a nerve-racking learning curve, a life lived entirely in public with the webcam on, every mistake recorded and shared, mocked until something more mockable comes along. “How long do I have to do this?” another teen sighed, as he prepared to reply to 100 new messages on his phone.

Last year, when MTV polled its 13- to 30-year-old viewers on their Web habits, most felt “defined” by what they put online, “exhausted” by always having to be putting it out there, and utterly unable to look away for fear of missing out. “FOMO,” the network called it. “I saw the best minds of my generation destroyed by madness, starving hysterical naked,” begins Allen Ginsberg’s poem Howl, a beatnik rant that opens with people “dragging themselves” at dawn, searching for an “angry fix” of heroin. It’s not hard to imagine the alternative imagery today.

The latest Net-and-depression study may be the saddest one of all. With the consent of the subjects, Missouri State University tracked the real-time Web habits of 216 kids, 30 percent of whom showed signs of depression. The results, published last month, found that the depressed kids were the most intense Web users, chewing up more hours of email, chat, videogames, and file sharing. They also opened, closed, and switched browser windows more frequently, searching, one imagines, and not finding what they hoped to find.

They each sound like Doug, a Midwestern college student who maintained four avatars, keeping each virtual world open on his computer, along with his school work, email, and favorite videogames. He told Turkle that his real life is “just another window”—and “usually not my best one.” Where is this headed? she wonders. That’s the scariest line of inquiry of all.

Recently, scholars have begun to suggest that our digitized world may support even more extreme forms of mental illness. At Stanford, Dr. Aboujaoude is studying whether some digital selves should be counted as a legitimate, pathological “alter of sorts,” like the alter egos documented in cases of multiple personality disorder (now called dissociative identity disorder in the DSM). To test his idea, he gave one of his patients, Richard, a mild-mannered human-resources executive with a ruthless Web poker habit, the official test for multiple personality disorder. The result was startling. He scored as high as patient zero. “I might as well have been … administering the questionnaire to Sybil Dorsett!” Aboujaoude writes.

The Gold brothers—Joel, a psychiatrist at New York University, and Ian, a philosopher and psychiatrist at McGill University—are investigating technology’s potential to sever people’s ties with reality, fueling hallucinations, delusions, and genuine psychosis, much as it seemed to do in the case of Jason Russell, the filmmaker behind “Kony 2012.” The idea is that online life is akin to life in the biggest city, stitched and sutured together by cables and modems, but no less mentally real—and taxing—than New York or Hong Kong. “The data clearly support the view that someone who lives in a big city is at higher risk of psychosis than someone in a small town,” Ian Gold writes via email. “If the Internet is a kind of imaginary city,” he continues, “it might have some of the same psychological impact.”

A team of researchers at Tel Aviv University is following a similar path. Late last year, they published what they believe are the first documented cases of “Internet-related psychosis.” The qualities of online communication are capable of generating “true psychotic phenomena,” the authors conclude, before putting the medical community on warning. “The spiraling use of the Internet and its potential involvement in psychopathology are new consequences of our times.”

So what do we do about it? Some would say nothing, since even the best research is tangled in the timeless conundrum of what comes first. Does the medium break normal people with its unrelenting presence, endless distractions, and threat of public ridicule for missteps? Or does it attract broken souls?

But in a way, it doesn’t matter whether our digital intensity is causing mental illness, or simply encouraging it along, as long as people are suffering. Overwhelmed by the velocity of our lives, we turn to prescription drugs, which helps explain why America runs on Xanax (and why rehab admissions for benzodiazepines, the class of anti-anxiety drugs that includes Xanax, have tripled since the late 1990s). We also spring for the false rescue of multitasking, which saps attention even when the computer is off. And all of us, since the relationship with the Internet began, have tended to accept it as is, without much conscious thought about how we want it to be or what we want to avoid. Those days of complacency should end. The Internet is still ours to shape. Our minds are in the balance.

A human soul

At this moment I must say, almost in the form of deposition, without argument, that I do not believe my birth began my first existence. Nor Humboldt’s. Nor anyone’s. On esthetic grounds, if on no others, I cannot accept the view of death taken by most of us, and taken by me during most of my life — on esthetic grounds therefore I am obliged to deny that so extraordinary a thing as a human soul can be wiped out forever. No, the dead are about us, shut out by our metaphysical denial of them. As we lie nightly in our hemispheres asleep by the billions, our dead approach us. Our ideas should be their nourishment. We are their grainfields. But we are barren and we starve them. Don’t kid yourself, though, we are watched by the dead, watched on this earth, which is our school of freedom. In the next realm, where things are clearer, clarity eats into freedom. We are free on earth because of cloudiness, because of error, because of marvelous limitation, and as much because of beauty as because of blindness and evil. These always go with the blessing of freedom. But this is all I have to say about the matter now, because I’m in a hurry, under pressure — all this unfinished business!

— Saul Bellow, Humboldt’s Gift

Specialization

A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.

— Robert Heinlein

“Moral from oral”

The new work is also part of a field called grounded or embodied cognition. Basically, this ever-growing understanding of human mental function notes that all ideas and concepts—including hard-to-grasp ones like personality and trustworthiness—are ultimately anchored in concrete situations that happen to flesh-bound bodies.

By Kai MacDonald
scientificamerican

What do a chilly reception, a cold-blooded murder, and an icy stare have in common? Each plumbs the bulb of what could be called your social thermometer, exposing our reflexive tendency to conflate social judgments—estimations of another’s trust and intent — with the perception of temperature. Decades of fascinating cross-disciplinary studies have illuminated the surprising speed, pervasiveness and neurobiology of this unconscious mingling of the personal and the thermal.

The blurring of ‘heat’ and ‘greet’ is highlighted in a recent experiment by Ohio University’s Matthew Vess, who asked whether this tendency is influenced by an individual’s sensitivity to relational distress. He found that people high in the psychological attribute called attachment anxiety (a tendency to worry about the proximity and availability of a romantic partner) responded to memories of a relationship breakup with an increased preference for warm-temperature foods over cooler ones: soup over crackers. Subjects low in attachment anxiety — those more temperamentally secure — did not show this “comfort food” effect.

In a related part of the same experiment, subjects were asked to reconstruct jumbled words into sentences that had either cold or warm evocations. (Sentence reconstruction tasks involving specific themes are known to unconsciously influence subsequent behavior.) After being temperature-primed, Vess’s subjects rated their perceptions of their current romantic relationship. As in the first condition, subjects higher in attachment anxiety rated their relationship satisfaction higher when prompted with balmier phrases than with frosty ones.

The fact that individual differences in a relationship-oriented trait (attachment anxiety) are related to a person’s sensitivity to unconscious temperature-related cues speaks to the “under the hood” unconscious mingling that occurs between our social perceptual system and our temperature perception system. Though people predisposed to worry about their relationships seem to be more sensitive to these cues, we are all predisposed to the blurring of different types of experience. Similar recent experiments have demonstrated, for example, that briefly holding a warm beverage buoys subsequent ratings of another’s personality, that social isolation sensitizes a person to all things chilly, that inappropriate social mimicry creates a sense of cold, and that differences in the setting of the experimental lab’s thermostat lead test subjects to construe social relationships differently.

Vess’s experiments follow a much longer line of psychological research exploring the reasons people avoid a cold shoulder, lament a frigid partner and have to “cool off” after a spat. In point of fact, the mercury in our social minds has been of interest since at least the 1940’s, when landmark work on impression-formation by the pioneering social psychologist Solomon Asch demonstrated the “striking and consistent differences of impression” created by substituting the words “warm” and “cold” into a hypothetical person’s personality profile. Since then, a panoply of studies of social perception in a host of cultures have validated the centrality of these temperate anchors in forming rapid unconscious impressions of a person.

It will come as no surprise that the ultimate confluence of the thermal and the personal happens between our ears. The neuroscientist Jaak Panksepp, for example, has noted that the neural machinery for attachment and bonding is actually cobbled together out of more primitive brain areas used for temperature regulation. Adding to this theme, the psychiatrist Myron Hofer’s seminal research in the 1970’s demonstrated that certain parameters of rodent maternal attachment behavior (e.g. variations in touch or warmth) act as “hidden regulators” of various physiological responses (e.g. digestion) in their pups. Around the same time, another psychiatrist, John Bowlby, penned his now-canonical observations about the central importance of attachment for the social and psychological development of young humans, reminding us that we are just another part of a chain of mammals that depend on the care of others for survival.

The new work is also part of a field called grounded or embodied cognition. Basically, this ever-growing understanding of human mental function notes that all ideas and concepts—including hard-to-grasp ones like personality and trustworthiness—are ultimately anchored in concrete situations that happen to flesh-bound bodies. For example, our faces respond to disconcerting behavior (incest) in a similar way to disgusting ingestive behavior (crunching a cockroach). Moral from oral.

In the realm of relationships, this mingling of the less and the more concrete means that attachment experiences with specific others, their unconscious emotional tone, and the temperatures they actually feel activate and are stored in overlapping networks of brain areas. These are the so-called multimodal brain regions (a key one is called the insula) that blend different channels of sensory experience into a singular whole. In individual brains, then, the multichannel experience of being safe with another (initially, a mother) is forever fused with the experienced physical warmth that comes with being safe, fed and held. At the level of both brain and experience, this multi-sensory co-experiencing forms a wordless bedrock that implicitly grounds relational language, interpersonal evaluation, social cognition, and even imagined ingestion. “Warm” triggers “trust,” as well as the reverse.

From these insights, other questions abound. Regarding the biology of attachment and warmth, for example, one wonders whether certain molecules—the attachment hormone oxytocin, for example — bias cognition and behavior in a balmier direction. After all, given its vital role in birth, nursing, and early attachment bonds, oxytocin has a strong unconscious association with warm milk, a tendency to inspire trust and generosity, and the capacity to make more benign the valuation of others. Oxytocin even boosts a person’s perception of the warmth of their own personality, and a ‘warm touch’ intervention in married couples enhances oxytocin levels. On the opposite hormonal pole, might testosterone—which decreases trust — also make things seem colder?

My late, beloved father—a psychologist — had a puckish propensity to char the toast that was a mandatory component of his hand-made hot breakfast ritual. Thanks to social neuroscience, I can now better appreciate (and perpetuate) the wisdom of this ritual linking hot food with heart-felt.

The ending no one wants

That is the thing that you begin to terrifyingly appreciate: Dementia is not absence; it is not a nonstate; it actually could be a condition of more rather than less feeling, one that, with its lack of clarity and logic, must be a kind of constant nightmare.

By Michael Wolff
theweek

ON THE WAY to visit my mother one recent rainy afternoon, I stopped in, after quite a bit of prodding, to see my insurance salesman. He was pressing his efforts to sell me a long-term-care policy with a pitch about how much I’d save if I bought it now, before the rates were set to rise precipitously.

I am, as my insurance man pointed out, a “sweet spot” candidate. Not only do I have the cash (though not enough to self-finance my decline) but I also have a realistic view: Like so many people in our 50s — in my experience almost everybody — I have a parent in an advanced stage of terminal breakdown.

It’s what my peers talk about: our parents’ horror show. From the outside — at the office, restaurants, cocktail parties — we all seem perfectly secure and substantial. But in a room somewhere, hidden from view, we occupy this other, unimaginable life.

I didn’t need to be schooled in the realities of long-term care: The costs for my mother — who is 86 and who for the past 18 months has not been able to walk, talk, or address her most minimal needs and, to boot, is absent a short-term memory — come in at about $17,000 a month. And while her LTC insurance hardly covers all of that, I’m certainly grateful she had the foresight to carry such a policy.

And yet, on the verge of writing the first LTC check, I backed up.

We make certain assumptions about the necessity of care. It’s an individual and, depending on where you stand in the great health-care debate, a national responsibility. And yet, I will tell you, what I feel most intensely when I sit by my mother’s bed is a crushing sense of guilt for keeping her alive. Who can accept such suffering — who can so conscientiously facilitate it?

In 1990, there were slightly more than 3 million Americans over the age of 85. Now there are almost 6 million. By 2050 there will be 19 million — approaching 5 percent of the population.

By promoting longevity and technologically inhibiting death, we have created a new biological status held by an ever-growing part of the nation, a no-exit state that persists longer and longer, one that is nearly as remote from life as death, but which, unlike death, requires vast service, indentured servitude, really, and resources.

This is not anomalous; this is the norm. Contrary to the comedian’s maxim, comedy is easy, dying hard.

Seventy percent of those older than 80 have a chronic disability, according to one study; 53 percent in this group have at least one severe disability, and 36 percent have moderate to severe cognitive impairments.

Alzheimer’s is just one form — not, as it happens, my mother’s — of the ever-more-encompassing conditions of cognitive collapse that are the partners and the price of longevity. There are now more than 5 million demented Americans. By 2050, upward of 15 million of us will have lost our minds.

This year, the costs of dementia care will be $200 billion. By 2050, $1 trillion.

Make no mistake, the purpose of long-term-care insurance is to help finance some of the greatest misery and suffering human beings have yet devised.

I HESITATE TO give my mother a personality here. It is the argument I have with myself every day — she is not who she was; do not force her to endure because of what she once was. Do not sentimentalize. And yet…that’s the bind: She remains my mother.

She graduated from high school in 1942 and went to work for the Paterson Evening News, a daily newspaper in New Jersey. In a newsroom with many of its men off to war, Marguerite Vander Werf — nicknamed “Van” in the newsroom and forevermore — shortly became the paper’s military reporter. Her job was to keep track of the local casualties. At 18, a lanky 95 pounds in bobby socks, my mother would often show up at a soldier’s parents’ front door before the War Department’s telegram and have to tell these souls their son was dead.

She married my father, Lew Wolff, an adman, and left the paper after 11 years to have me — then my sister, Nancy, and brother, David. She did freelance journalism and part-time PR work (publicity, it was called then). She was a restless and compelling personality who became a civic power in our town, got elected to the board of education, and took charge of the public library, organizing and raising the money for its new building and expansion. She was the Pied Piper, the charismatic mom, a talker of great wit and passion — holding the attention of children and dinner-party drunks alike.

My father, whose ad agency had wide swings of fortune, died, suddenly, in that old-fashioned way, of a heart attack at 63, during one of the downswings. My mother was 58 — the age I am now — and left with small resources. She applied her charm and guile to a breathtaking reinvention and personal economic revival, becoming a marketing executive at first one and then another pharmaceutical company. At 72, headed to retirement but still restless, she capped off her career as the marketing head of an online-game company.

For 25 years, she lived in an apartment building in Ridgewood, N.J. Once a week, every week, she drove into Manhattan to cook dinner for my family and help my three children with their homework — I am not sure how I would have managed my life and raised children without her.

This is the woman, or what is left of the woman, who now resides in a studio apartment in one of those new boxy buildings that dot the Upper West Side — a kind of pre-coffin, if you will. It is even, thanks to my sister’s diligence, my mother’s LTC insurance and savings, and the contributions of my two siblings and me, what we might think of as an ideal place to be in her condition.

A painting from 1960 by March Avery, from the collection she and my father assembled — an Adirondack chair facing a blue sea — hangs in front of her. Below the painting is the flat-screen TV where she watches cooking shows with a strange intensity. She is attended 24/7 by two daily shifts of devoted caregivers.

It is peaceful and serene.

Except for my mother’s disquiet. She stares in mute reprimand. Her bewilderment and resignation somehow don’t mitigate her anger. She often tries to talk — desperate guttural pleas. She strains for cognition and, shockingly, sometimes bursts forward, reaching it — “Nice suit,” she said to me, out of the blue, a few months ago — before falling back.

That is the thing that you begin to terrifyingly appreciate: Dementia is not absence; it is not a nonstate; it actually could be a condition of more rather than less feeling, one that, with its lack of clarity and logic, must be a kind of constant nightmare.

“Old age,” says one of Philip Roth’s protagonists, “isn’t a battle, it’s a massacre.” I’d add, it’s a holocaust. Circumstances have conspired to rob the human person of all hope and dignity and comfort.

When my mother’s diaper is changed she makes noises of harrowing despair — for a time, before she lost all language, you could if you concentrated make out what she was saying, repeated over and over and over again: “It’s a violation. It’s a violation. It’s a violation.”

MY MOTHER’S CARDIOLOGIST had been for many years monitoring her for a condition called aortic stenosis — a narrowing of the aortic valve. The advice was do nothing until something had to be done. But now that she was showing symptoms that might suddenly kill her, why not operate and reach for another few good years? What’s to lose?

My siblings and I must take the blame here. It did not once occur to us to say, “You want to do major heart surgery on an 84-year-old woman showing progressive signs of dementia? What are you, nuts?”

Here’s what the surgeon said, defending himself, in perfect Catch-22-ese, against the recriminations that followed the stark and dramatic postoperative decline in my mother’s “quality-of-life baseline”: “I visited your mom before the procedure and fully informed her of the risks of such a surgery to someone showing signs of dementia.”

You fully informed my demented mom?

The operation absolutely repaired my mother’s heart — “She can live for years,” according to the surgeon. But where before she had been gently sinking, now we were in free fall.

Six weeks and something like $250,000 in hospital bills later (paid by Medicare — or, that is, by you), she was reduced to a terrified creature — losing language skills by the minute. “She certainly appears agitated,” the psychiatrist sent to administer anti-psychotic drugs told me, “and so do you.”

MY SISTER COMES over every morning. She brings the groceries, plans the menu, and has a daily routine for stretching my mother’s limbs. I’m here a few times a week (for exactly 30 minutes — no more, no less). Her grandchildren, with an unalloyed combination of devotion and horror, come on a diligent basis.

A few weeks ago, my sister and I called a meeting with my mother’s doctor. “It’s been a year,” I began, groping for what needed to be said. “We’ve seen a series of incremental but marked declines.” My sister chimed in with some vivid details.

We just wanted to help her go where she’s going. (Was that too much? Was that too specific?)

She does seem, the doctor allowed, to have entered another stage. “Perhaps more palliative care. This can ease her suffering, but the side effect can be to depress her functions. But maybe it is time to err on the side of ease.

“Your mom, like a lot of people, is what we call a dwindler,” said the doctor.

I do not know how death panels ever got such a bad name. Perhaps they should have been called deliverance panels. What I would not do for a fair-minded body to whom I might plead for my mother’s end.

The alternative is nuts: paying trillions and bankrupting the nation as well as our souls as we endure the suffering of our parents and our inability to help them get where they’re going. The single greatest pressure on health care is the disproportionate resources devoted to the elderly, to not just the old but the old old, and yet no one says what all old children of old parents know: This is not just wrongheaded but steals the life from everyone involved.

After due consideration, I decided that I plainly would never want what LTC insurance buys, and, too, that this would be a bad deal. My bet is that, even in America, we baby boomers watching our parents’ long and agonizing deaths won’t do this to ourselves. We will surely, we must surely, find a better, cheaper, quicker, kinder way out.

Meanwhile, since, like my mother, I can’t count on someone putting a pillow over my head, I’ll be trying to work out the timing and details of a do-it-yourself exit strategy. As should we all.

Humboldt’s basic texts

To follow his intricate conversation you had to know his basic texts. I knew what they were: Plato’s Timaeus, Proust on Combray, Virgil on farming, Marvell on gardens, Wallace Stevens’ Caribbean poetry, and so on. One reason why Humboldt and I were so close was that I was willing to take the complete course.

— Saul Bellow, Humboldt’s Gift

Throw over your man… and I’ll tell you all the things I have in my head

“Throw over your man… and I’ll tell you all the things I have in my head, millions, myriads,” wrote Virginia Woolf to her lover, the English poet Vita Sackville-West, in her exquisite 1927 love letter. But that missive was preceded by one from Vita herself, sent from Milan on January 21 of the previous year. Disarmingly honest, heartfelt, and unguarded, it stands in beautiful contrast with Virginia’s passionate prose:

…I am reduced to a thing that wants Virginia. I composed a beautiful letter to you in the sleepless nightmare hours of the night, and it has all gone: I just miss you, in a quite simple desperate human way. You, with all your undumb letters, would never write so elementary a phrase as that; perhaps you wouldn’t even feel it. And yet I believe you’ll be sensible of a little gap. But you’d clothe it in so exquisite a phrase that it should lose a little of its reality. Whereas with me it is quite stark: I miss you even more than I could have believed; and I was prepared to miss you a good deal. So this letter is really just a squeal of pain. It is incredible how essential to me you have become. I suppose you are accustomed to people saying these things. Damn you, spoilt creature; I shan’t make you love me any more by giving myself away like this — But oh my dear, I can’t be clever and stand-offish with you: I love you too much for that. Too truly. You have no idea how stand-offish I can be with people I don’t love. I have brought it to a fine art. But you have broken down my defenses. And I don’t really resent it.

— via brainpickings

This castle hath a pleasant seat

Perhaps, being lost, one should get loster; being very late for an appointment, it might be best to walk slower, as one of my beloved Russian writers advised.

— Saul Bellow, Humboldt’s Gift

Freedom absurdity

How absurd men are! They never use the liberties they have, they demand those they do not have. They have freedom of thought, they demand freedom of speech.

— Søren Kierkegaard
1843

via sleepzandthinkz

Calamus

In paths untrodden,
In the growth by margins of pond-waters,
Escaped from the life that exhibits itself,
From all the standards hitherto publish’d, from the pleasures, profits, conformities,
Which too long I was offering to feed my soul,
Clear to me now standards not yet publish’d, clear to me that my soul,
That the soul of the man I speak for rejoices in comrades,
Here by myself away from the clank of the world,
Tallying and talk’d to here by tongues aromatic,
No longer abash’d, (for in this secluded spot I can respond as I would not dare elsewhere,)
Strong upon me the life that does not exhibit itself, yet contains all the rest,
Resolv’d to sing no songs to-day but those of manly attachment,
Projecting them along that substantial life,
Bequeathing hence types of athletic love,
Afternoon this delicious Ninth-month in my forty-first year,
I proceed for all who are or have been young men,
To tell the secret of my nights and days,
To celebrate the need of comrades.

— Walt Whitman

To God Moses jotted several lines

The silence sustained him, and the brilliant weather, the feeling that he was easily contained by everything about him. Within the hollowness of God, as he noted, and deaf to the final multiplicity of facts, as well as blind to ultimate distances. Two billion light-years out. Supernovae.

Daily radiance, trodden here
Within the hollowness of God

To God he jotted several lines.

How my mind has struggled to make coherent sense. I have not been too good at it. But have desired to do your unknowable will, taking it, and you, without symbols. Everything of intensest significance. Especially if divested of me.

— Saul Bellow, Herzog

His state of simple, free, intense realization

Herzog could not say what the significance of such generalities might be. He was only vastly excited — in a streaming state — and intended mostly to restore order by turning to his habit of thoughtfulness. Blood had burst into his psyche, and for the time being he was either free or crazy. But then he realized that he did not need to perform elaborate abstract intellectual work — work he had always thrown himself into as if it were the struggle for survival. But not thinking is not necessarily fatal. Did I really believe that I would die when thinking stopped? Now to fear such a thing — that’s really crazy.

— Saul Bellow, Herzog