To copy oneself

Success is dangerous. One begins to copy oneself, and to copy oneself is more dangerous than to copy others. It leads to sterility.

— Pablo Picasso

Dear Borges

13 June 1996, New York

Dear Borges,

Since your literature was always placed under the sign of eternity, it doesn’t seem too odd to be addressing a letter to you. (Borges, it’s 10 years!) If ever a contemporary seemed destined for literary immortality, it was you. You were very much the product of your time, your culture, and yet you knew how to transcend your time, your culture, in ways that seem quite magical. This had something to do with the openness and generosity of your attention. You were the least egocentric, the most transparent of writers, as well as the most artful. It also had something to do with a natural purity of spirit. Though you lived among us for a rather long time, you perfected practices of fastidiousness and of detachment that made you an expert mental traveller to other eras as well. You had a sense of time that was different from other people’s. The ordinary ideas of past, present and future seemed banal under your gaze. You liked to say that every moment of time contains the past and the future, quoting (as I remember) the poet Browning, who wrote something like, “the present is the instant in which the future crumbles into the past.” That, of course, was part of your modesty: your taste for finding your ideas in the ideas of other writers.

Your modesty was part of the sureness of your presence. You were a discoverer of new joys. A pessimism as profound, as serene, as yours did not need to be indignant. It had, rather, to be inventive – and you were, above all, inventive. The serenity and the transcendence of self that you found are to me exemplary. You showed that it is not necessary to be unhappy, even while one is clear-eyed and undeluded about how terrible everything is. Somewhere you said that a writer – delicately you added: all persons – must think that whatever happens to him or her is a resource. (You were speaking of your blindness.)

You have been a great resource, for other writers. In 1982 – that is, four years before you died – I said in an interview, “There is no writer living today who matters more to other writers than Borges. Many people would say he is the greatest living writer… Very few writers of today have not learnt from him or imitated him.” That is still true. We are still learning from you. We are still imitating you. You gave people new ways of imagining, while proclaiming over and over our indebtedness to the past, above all, to literature. You said that we owe literature almost everything we are and what we have been. If books disappear, history will disappear, and human beings will also disappear. I am sure you are right. Books are not only the arbitrary sum of our dreams, and our memory. They also give us the model of self-transcendence. Some people think of reading only as a kind of escape: an escape from the “real” everyday world to an imaginary world, the world of books. Books are much more. They are a way of being fully human.

I’m sorry to have to tell you that books are now considered an endangered species. By books, I also mean the conditions of reading that make possible literature and its soul effects. Soon, we are told, we will call up on “bookscreens” any “text” on demand, and will be able to change its appearance, ask questions of it, “interact” with it. When books become “texts” that we “interact” with according to criteria of utility, the written word will have become simply another aspect of our advertising-driven televisual reality. This is the glorious future being created, and promised to us, as something more “democratic”. Of course, it means nothing less than the death of inwardness – and of the book.

This time around, there will be no need for a great conflagration. The barbarians don’t have to burn the books. The tiger is in the library. Dear Borges, please understand that it gives me no satisfaction to complain. But to whom could such complaints about the fate of books – of reading itself – be better addressed than to you? (Borges, it’s 10 years!) All I mean to say is that we miss you. I miss you. You continue to make a difference. The era we are entering now, this 21st century, will test the soul in new ways. But, you can be sure, some of us are not going to abandon the Great Library. And you will continue to be our patron and our hero.

— Susan Sontag, Where the Stress Falls

Shown to me by the friendly neighborhood Jehovah’s Witnesses…

I am the LORD your God,
who teaches you what is best for you,
who directs you in the way you should go.
If only you had paid attention to my commands,
your peace would have been like a river,
your well-being like the waves of the sea.

— Isaiah 48:17–18, NIV

Words are flowing out like endless rain into a paper cup: Beatles critics across the decades

Critics can pick a fight over anything, even artists who have received nearly universal acclaim, like Mozart, Shakespeare and…the Beatles. Herewith a list of sharply dissenting views on the Fab Four.

Guitar groups are on the way out…the Beatles have no future in show business.

—Dick Rowe, head of Decca Records, 1962

[Photo: Sean Connery, 1964]

Drinking Dom Perignon ’53 above the temperature of 38 degrees is as bad as listening to the Beatles without earmuffs.

—James Bond, secret agent in “Goldfinger,” 1964

The noise was deafening throughout and I couldn’t hear a word they sang or a note they played, just one long ear-splitting din.

—Noel Coward, British composer and playwright, summarizing a Beatles concert in 1964

[Photo: William F. Buckley Jr.]

The Beatles are not merely awful, I would consider it sacrilegious to say anything less than that they are godawful. They are so unbelievably horrible, so appallingly unmusical, so dogmatically insensitive to the magic of the art, that they qualify as crowned heads of anti-music.

—William F. Buckley, author and commentator, 1964

Musically, they are a near disaster; guitars slamming out a merciless beat that does away with secondary rhythms, harmony and melody. Their lyrics (punctuated by nutty shouts of “yeah, yeah, yeah!”) are a catastrophe, a preposterous farrago of Valentine-card romantic sentiments.

—Newsweek reviewer, Feb. 24, 1964

[Photo: David Susskind]

The most repulsive group of men I’ve ever seen.

—David Susskind, American TV host, 1965

My teacher always talks about the Beatles but they have no lyrical skills, my cousin Rodney is a better songwriter than these clowns. I think they should remix these songs with 50 Cent or Snoop, then they’d really get some fans behind them.

—Amazon customer review, 2005

[Photo: The Beatles performing at the Prince of Wales Theatre]

The Beatles are the absolute curse of modern indie music… my favorite Beatle is Yoko Ono; without Yoko’s influence I don’t think there would be any Beatles music I could listen to.

—David Keenan, author and music critic, 2009

…why making records on drugs isn’t always a good idea and why you shouldn’t let Ringo sing… if not the worst, then certainly the most overrated album of all time.

—Richard Smith, reviewing “Sgt. Pepper’s Lonely Hearts Club Band,” 2007

[Photo: Elvis Presley]

The Beatles laid the groundwork for many of the problems we are having with young people by their filthy unkempt appearances and suggestive music while entertaining in this country during the early and middle 1960s.

—Elvis Presley, as recorded by an FBI memo, during a 1970 visit to Richard Nixon at the White House and FBI headquarters

‘Love Me Do’…a melody so weepingly banal it sounds like a fingering exercise for primary-school recorder practice.

—Michael Deacon, music critic, 2009

— The Wall Street Journal

Varieties of Irreligious Experience: atheism through the ages

Socrates

Baruch Spinoza

Percy Bysshe Shelley

Pierre-Joseph Proudhon

George Eliot

Friedrich Nietzsche

John Stuart Mill

William James

— newhumanist.org.uk, Scott Garrett

 

Mmmmm… Bananas…

With a clattering of chairs, upended shell cases, benches, and ottomans, Pirate’s mob gather at the shores of the great refectory table, a southern island well across a tropic or two from chill Corydon Throsp’s mediaeval fantasies, crowded now over the swirling dark grain of its walnut uplands with banana omelets, banana sandwiches, banana casseroles, mashed bananas molded in the shape of a British lion rampant, blended with eggs into batter for French toast, squeezed out a pastry nozzle across the quivering creamy reaches of a banana blancmange to spell out the words C’est magnifique, mais ce n’est pas la guerre (attributed to a French observer during the Charge of the Light Brigade) which Pirate has appropriated as his motto . . . tall cruets of pale banana syrup to pour oozing over banana waffles, a giant glazed crock where diced bananas have been fermenting since the summer with wild honey and muscat raisins, up out of which, this winter morning, one now dips foam mugsfull of banana mead . . . banana croissants and banana kreplach, and banana oatmeal and banana jam and banana bread, and bananas flamed in ancient brandy Pirate brought back last year from a cellar in the Pyrenees also containing a clandestine radio transmitter . . .

. . . Pirate takes up a last dipper of mead, feels it go valving down his throat as if it’s time, time in its summer tranquility, he swallows.

— Thomas Pynchon, Gravity’s Rainbow

 

Another great first sentence

For the awarding of the Grillparzer Prize of the Academy of Sciences in Vienna I had to buy a suit, as I had suddenly realized two hours before the presentation that I couldn’t appear at this doubtless extraordinary ceremony in trousers and a pullover, and so I had actually made the decision on the so-called Graben to go to the Kohlmarkt and outfit myself with appropriate formality, to which end, based on previous shopping for socks on several occasions, I picked the best-known gentleman’s outfitters with the descriptive name Sir Anthony, if I remember correctly it was nine forty-five when I went into Sir Anthony’s salon, the award ceremony for the Grillparzer Prize was at eleven, so I had plenty of time.

— Thomas Bernhard, My Prizes

Depression is not the soul’s annihilation

If our lives had no other configuration but this, we should want, and perhaps deserve, to perish; if depression had no termination, then suicide would, indeed, be the only remedy. But one need not sound the false or inspirational note to stress the truth that depression is not the soul’s annihilation; men and women who have recovered from the disease — and they are countless — bear witness to what is probably its only saving grace: it is conquerable.

For those who have dwelt in depression’s dark wood, and known its inexplicable agony, their return from the abyss is not unlike the ascent of the poet, trudging upward and upward out of hell’s black depths and at last emerging into what he saw as “the shining world.” There, whoever has been restored to health has almost always been restored to the capacity for serenity and joy, and this may be indemnity enough for having endured the despair beyond despair.

E quindi uscimmo a riveder le stelle.

And so we came forth, and once again beheld the stars.

— William Styron, Darkness Visible: A Memoir of Madness

What would a nonexpressive poetry look like? A poetry of intellect rather than emotion?

What would a nonexpressive poetry look like? A poetry of intellect rather than emotion? One in which the substitutions at the heart of metaphor and image were replaced by the direct presentation of language itself, with “spontaneous overflow” supplanted by meticulous procedure and exhaustively logical process? In which the self-regard of the poet’s ego were turned back onto the self-reflexive language of the poem itself? So that the test of poetry were no longer whether it could have been done better (the question of the workshop), but whether it could conceivably have been done otherwise.

— Craig Dworkin

Uncreative Writing

By Kenneth Goldsmith
The Chronicle of Higher Education

In 1969 the conceptual artist Douglas Huebler wrote, “The world is full of objects, more or less interesting; I do not wish to add any more.” I’ve come to embrace Huebler’s idea, though it might be retooled as: “The world is full of texts, more or less interesting; I do not wish to add any more.”

It seems an appropriate response to a new condition in writing: With an unprecedented amount of available text, our problem is not needing to write more of it; instead, we must learn to negotiate the vast quantity that exists. How I make my way through this thicket of information—how I manage it, parse it, organize and distribute it—is what distinguishes my writing from yours.

The prominent literary critic Marjorie Perloff has recently begun using the term “unoriginal genius” to describe this tendency emerging in literature. Her idea is that, because of changes brought on by technology and the Internet, our notion of the genius—a romantic, isolated figure—is outdated. An updated notion of genius would have to center around one’s mastery of information and its dissemination. Perloff has coined another term, “moving information,” to signify both the act of pushing language around as well as the act of being emotionally moved by that process. She posits that today’s writer resembles more a programmer than a tortured genius, brilliantly conceptualizing, constructing, executing, and maintaining a writing machine.

Perloff’s notion of unoriginal genius should not be seen merely as a theoretical conceit but rather as a realized writing practice, one that dates back to the early part of the 20th century, embodying an ethos in which the construction or conception of a text is as important as what the text says or does. Think, for example, of the collated, note-taking practice of Walter Benjamin’s Arcades Project or the mathematically driven constraint-based works by Oulipo, a group of writers and mathematicians.

Today technology has exacerbated these mechanistic tendencies in writing (there are, for instance, several Web-based versions of Raymond Queneau’s 1961 laboriously hand-constructed Hundred Thousand Billion Poems), inciting younger writers to take their cues from the workings of technology and the Web as ways of constructing literature. As a result, writers are exploring ways of writing that have been thought, traditionally, to be outside the scope of literary practice: word processing, databasing, recycling, appropriation, intentional plagiarism, identity ciphering, and intensive programming, to name just a few.

In 2007 Jonathan Lethem published a pro-plagiarism, plagiarized essay in Harper’s titled, “The Ecstasy of Influence: A Plagiarism.” It’s a lengthy defense and history of how ideas in literature have been shared, riffed, culled, reused, recycled, swiped, stolen, quoted, lifted, duplicated, gifted, appropriated, mimicked, and pirated for as long as literature has existed. Lethem reminds us of how gift economies, open-source cultures, and public commons have been vital for the creation of new works, with themes from older works forming the basis for new ones. Echoing the cries of free-culture advocates such as Lawrence Lessig and Cory Doctorow, he eloquently rails against copyright law as a threat to the lifeblood of creativity. From Martin Luther King Jr.’s sermons to Muddy Waters’s blues tunes, he showcases the rich fruits of shared culture. He even cites examples of what he had assumed were his own “original” thoughts, only later to realize—usually by Googling—that he had unconsciously absorbed someone else’s ideas that he then claimed as his own.

It’s a great essay. Too bad he didn’t “write” it. The punchline? Nearly every word and idea was borrowed from somewhere else—either appropriated in its entirety or rewritten by Lethem. His essay is an example of “patchwriting,” a way of weaving together various shards of other people’s words into a tonally cohesive whole. It’s a trick that students use all the time, rephrasing, say, a Wikipedia entry into their own words. And if they’re caught, it’s trouble: In academia, patchwriting is considered an offense equal to that of plagiarism. If Lethem had submitted this as a senior thesis or dissertation chapter, he’d be shown the door. Yet few would argue that he didn’t construct a brilliant work of art—as well as writing a pointed essay—entirely in the words of others. It’s the way in which he conceptualized and executed his writing machine—surgically choosing what to borrow, arranging those words in a skillful way—that wins us over. Lethem’s piece is a self-reflexive, demonstrative work of unoriginal genius.

Lethem’s provocation belies a trend among younger writers who take his exercise one step further by boldly appropriating the work of others without citation, disposing of the artful and seamless integration of Lethem’s patchwriting. For them, the act of writing is literally moving language from one place to another, proclaiming that context is the new content. While pastiche and collage have long been part and parcel of writing, with the rise of the Internet plagiaristic intensity has been raised to extreme levels.

Over the past five years, we have seen a retyping of Jack Kerouac’s On the Road in its entirety, a page a day, every day, on a blog for a year; an appropriation of the complete text of a day’s copy of The New York Times published as a 900-page book; a list poem that is nothing more than reframing a listing of stores from a shopping-mall directory into a poetic form; an impoverished writer who has taken every credit-card application sent to him and bound them into an 800-page print-on-demand book so costly that he can’t afford a copy; a poet who has parsed the text of an entire 19th-century book on grammar according to its own methods, even down to the book’s index; a lawyer who re-presents the legal briefs of her day job as poetry in their entirety without changing a word; another writer who spends her days at the British Library copying down the first verse of Dante’s Inferno from every English translation that the library possesses, one after another, page after page, until she exhausts the library’s supply; a writing team that scoops status updates off social-networking sites and assigns them to the names of deceased writers (“Jonathan Swift has got tix to the Wranglers game tonight”), creating an epic, never-ending work of poetry that rewrites itself as frequently as Facebook pages are updated; and an entire movement of writing, called Flarf, that is based on grabbing the worst of Google search results: the more offensive, the more ridiculous, the more outrageous, the better.

These writers are language hoarders; their projects are epic, mirroring the gargantuan scale of textuality on the Internet. While the works often take an electronic form, paper versions circulate in journals and zines, purchased by libraries, and received by, written about, and studied by readers of literature. While this new writing has an electronic gleam in its eye, its results are distinctly analog, taking inspiration from radical modernist ideas and juicing them with 21st-century technology.

Far from this “uncreative” literature being a nihilistic, begrudging acceptance—or even an outright rejection—of a presumed “technological enslavement,” it is a writing imbued with celebration, ablaze with enthusiasm for the future, embracing this moment as one pregnant with possibility. This joy is evident in the writing itself, in which there are moments of unanticipated beauty—some grammatical, others structural, many philosophical: the wonderful rhythms of repetition, the spectacle of the mundane reframed as literature, a reorientation to the poetics of time, and fresh perspectives on readerliness, to name just a few. And then there’s emotion: yes, emotion. But far from being coercive or persuasive, this writing delivers emotion obliquely and unpredictably, with sentiments expressed as a result of the writing process rather than by authorial intention.

These writers function more like programmers than traditional writers, taking Sol LeWitt’s dictum to heart: “When an artist uses a conceptual form of art, it means that all of the planning and decisions are made beforehand and the execution is a perfunctory affair. The idea becomes a machine that makes the art,” and raising new possibilities of what writing can be. The poet Craig Dworkin posits:

What would a nonexpressive poetry look like? A poetry of intellect rather than emotion? One in which the substitutions at the heart of metaphor and image were replaced by the direct presentation of language itself, with “spontaneous overflow” supplanted by meticulous procedure and exhaustively logical process? In which the self-regard of the poet’s ego were turned back onto the self-reflexive language of the poem itself? So that the test of poetry were no longer whether it could have been done better (the question of the workshop), but whether it could conceivably have been done otherwise.

There’s been an explosion of writers employing strategies of copying and appropriation over the past few years, with the computer encouraging writers to mimic its workings. When cutting and pasting are integral to the writing process, it would be mad to imagine that writers wouldn’t exploit these functions in extreme ways that weren’t intended by their creators.

If we look back at the history of video art—the last time mainstream technology collided with art practices—we find several precedents for such gestures. One that stands out is Nam June Paik’s 1965 “Magnet TV,” in which the artist placed a large horseshoe magnet atop a black-and-white television, eloquently turning a space previously reserved for Jack Benny and Ed Sullivan into loopy, organic abstractions. The gesture questioned the one-way flow of information. In Paik’s version of TV, you could control what you saw: Spin the magnet, and the image changes with it. Until that point, television’s mission was as a delivery vehicle for entertainment and clear communication. Yet an artist’s simple gesture upended television in ways of which both users and producers were unaware, opening up entirely new vocabularies for the medium while deconstructing myths of power, politics, and distribution that were embedded—but hitherto invisible—in the technology. The cut-and-paste function in computing is being exploited by writers just as Paik’s magnet was for TV.

While home computers have been around for about two decades, and people have been cutting and pasting all that time, it’s the sheer penetration and saturation of broadband that makes the harvesting of masses of language easy and tempting. On a dial-up, although it was possible to copy and paste words, in the beginning texts were doled out one screen at a time. And even though it was text, the load time was still considerable. With broadband, the spigot runs 24/7.

By comparison, there was nothing native to typewriting that encouraged the replication of texts. It was slow and laborious to do so. Later, after you had finished writing, you could make all the copies you wanted on a Xerox machine. As a result, there was a tremendous amount of 20th-century postwriting print-based détournement: William S. Burroughs’s cutups and fold-ins and Bob Cobbing’s distressed mimeographed poems are prominent examples. The previous forms of borrowing in literature, collage, and pastiche—taking a word from here, a sentence from there—were developed based on the amount of labor involved. Having to manually retype or hand-copy an entire book on a typewriter is one thing; cutting and pasting an entire book with three keystrokes—select all / copy / paste—is another.

Clearly this is setting the stage for a literary revolution.

Or is it? From the looks of it, most writing proceeds as if the Internet had never happened. The literary world still gets regularly scandalized by age-old bouts of fraudulence, plagiarism, and hoaxes in ways that would make, say, the art, music, computing, or science worlds chuckle with disbelief. It’s hard to imagine the James Frey or JT LeRoy scandals upsetting anybody familiar with the sophisticated, purposely fraudulent provocations of Jeff Koons or the rephotographing of advertisements by Richard Prince, who was awarded a Guggenheim retrospective for his plagiaristic tendencies. Koons and Prince began their careers by stating upfront that they were appropriating and being intentionally “unoriginal,” whereas Frey and LeRoy—even after they were caught—were still passing off their works as authentic, sincere, and personal statements to an audience clearly craving such qualities in literature. The ensuing dance was comical. In Frey’s case, Random House was sued and had to pay hundreds of thousands of dollars in legal fees and thousands to readers who felt deceived. Subsequent printings of the book now include a disclaimer informing readers that what they are about to read is, in fact, a work of fiction.

Imagine all the pains that could have been avoided had Frey or LeRoy taken a Koonsian tack from the outset and admitted that their strategy was one of embellishment, with dashes of inauthenticity, falseness, and unoriginality thrown in. But no.

Nearly a century ago, the art world put to rest conventional notions of originality and replication with the gestures of Marcel Duchamp’s ready-mades, Francis Picabia’s mechanical drawings, and Walter Benjamin’s oft-quoted essay “The Work of Art in the Age of Mechanical Reproduction.” Since then, a parade of blue-chip artists from Andy Warhol to Matthew Barney have taken these ideas to new levels, resulting in terribly complex notions of identity, media, and culture. These, of course, have become part of mainstream art-world discourse, to the point where counterreactions based on sincerity and representation have emerged.

Similarly, in music, sampling—entire tracks constructed from other tracks—has become commonplace. From Napster to gaming, from karaoke to torrent files, the culture appears to be embracing the digital and all the complexity it entails—with the exception of writing, which is still mostly wedded to promoting an authentic and stable identity at all costs.

I’m not saying that such writing should be discarded: Who hasn’t been moved by a great memoir? But I’m sensing that literature—infinite in its potential of ranges and expressions—is in a rut, tending to hit the same note again and again, confining itself to the narrowest of spectrums, resulting in a practice that has fallen out of step and is unable to take part in arguably the most vital and exciting cultural discourses of our time. I find this to be a profoundly sad moment—and a great lost opportunity for literary creativity to revitalize itself in ways it hasn’t imagined.

Perhaps one reason writing is stuck might be the way creative writing is taught. In regard to the many sophisticated ideas concerning media, identity, and sampling developed over the past century, books about how to be a creative writer have relied on clichéd notions of what it means to be “creative.” These books are peppered with advice like: “A creative writer is an explorer, a groundbreaker. Creative writing allows you to chart your own course and boldly go where no one has gone before.” Or, ignoring giants like de Certeau, Cage, and Warhol, they suggest that “creative writing is liberation from the constraints of everyday life.”

In the early part of the 20th century, both Duchamp and the composer Erik Satie professed the desire to live without memory. For them it was a way of being present to the wonders of the everyday. Yet, it seems, every book on creative writing insists that “memory is often the primary source of imaginative experience.” The how-to sections of these books strike me as terribly unsophisticated, generally coercing us to prioritize the theatrical over the mundane as the basis of our writings: “Using the first-person point of view, explain how a 55-year-old man feels on his wedding day. It is his first marriage.” I prefer the ideas of Gertrude Stein, who, writing in the third person, tells of her dissatisfaction with such techniques: “She experimented with everything in trying to describe. She tried a bit inventing words but she soon gave that up. The english language was her medium and with the english language the task was to be achieved, the problem solved. The use of fabricated words offended her, it was an escape into imitative emotionalism.”

For the past several years, I’ve taught a class at the University of Pennsylvania called “Uncreative Writing.” In it, students are penalized for showing any shred of originality and creativity. Instead they are rewarded for plagiarism, identity theft, repurposing papers, patchwriting, sampling, plundering, and stealing. Not surprisingly, they thrive. Suddenly what they’ve surreptitiously become expert at is brought out into the open and explored in a safe environment, reframed in terms of responsibility instead of recklessness.

We retype documents and transcribe audio clips. We make small changes to Wikipedia pages (changing an “a” to “an” or inserting an extra space between words). We hold classes in chat rooms, and entire semesters are spent exclusively in Second Life. Each semester, for their final paper, I have them purchase a term paper from an online paper mill and sign their name to it, surely the most forbidden action in all of academia. Students then must get up and present the paper to the class as if they wrote it themselves, defending it from attacks by the other students. What paper did they choose? Is it possible to defend something you didn’t write? Something, perhaps, you don’t agree with? Convince us.

All this, of course, is technology-driven. When the students arrive in class, they are told that they must have their laptops open and connected. And so we have a glimpse into the future. And after seeing what the spectacular results of this are, how completely engaged and democratic the classroom is, I am more convinced that I can never go back to a traditional classroom pedagogy. I learn more from the students than they can ever learn from me. The role of the professor now is part party host, part traffic cop, full-time enabler.

The secret: the suppression of self-expression is impossible. Even when we do something as seemingly “uncreative” as retyping a few pages, we express ourselves in a variety of ways. The act of choosing and reframing tells us as much about ourselves as our story about our mother’s cancer operation. It’s just that we’ve never been taught to value such choices.

After a semester of my forcibly suppressing a student’s “creativity” by making her plagiarize and transcribe, she will tell me how disappointed she was because, in fact, what we had accomplished was not uncreative at all; by not being “creative,” she had produced the most creative body of work in her life. By taking an opposite approach to creativity—the most trite, overused, and ill-defined concept in a writer’s training—she had emerged renewed and rejuvenated, on fire and in love again with writing.

Having worked in advertising for many years as a “creative director,” I can tell you that, despite what cultural pundits might say, creativity—as it’s been defined by our culture, with its endless parade of formulaic novels, memoirs, and films—is the thing to flee from, not only as a member of the “creative class” but also as a member of the “artistic class.” At a time when technology is changing the rules of the game in every aspect of our lives, it’s time for us to question and tear down such clichés and reconstruct them into something new, something contemporary, something—finally—relevant.

Clearly, not everyone agrees. Recently, after I finished giving a lecture at an Ivy League university, an elderly, well-known poet, steeped in the modernist tradition, stood up in the back of the auditorium and, wagging his finger at me, accused me of nihilism and of robbing poetry of its joy. He upbraided me for knocking the foundation out from under the most hallowed of grounds, then tore into me with a line of questioning I’ve heard many times before: If everything can be transcribed and then presented as literature, then what makes one work better than another? If it’s a matter of simply cutting and pasting the entire Internet into a Microsoft Word document, where does it end? Once we begin to accept all language as poetry by mere reframing, don’t we risk throwing any semblance of judgment and quality out the window? What happens to notions of authorship? How are careers and canons established, and, subsequently, how are they to be evaluated? Are we simply re-enacting the death of the author, a figure that such theories failed to kill the first time around? Will all texts in the future be authorless and nameless, written by machines for machines? Is the future of literature reducible to mere code?

Valid concerns, I think, for a man who emerged from the literary battles of the 20th century victorious. The challenges to his generation were just as formidable. How did they convince traditionalists that disjunctive uses of language, conveyed by exploded syntax and compound words, could be as expressive of human emotion as time-tested methods? Or that a story need not be told as strict narrative in order to convey its own logic and sense? And yet, against all odds, they persevered.

The 21st century, with its queries so different from those of the last, finds me responding from another angle. If it’s a matter of simply cutting and pasting the entire Internet into a Microsoft Word document, then what becomes important is what you—the author—decide to choose. Success lies in knowing what to include and—more important—what to leave out. If all language can be transformed into poetry by merely reframing—an exciting possibility—then she who reframes words in the most charged and convincing way will be judged the best.

I agree that the moment we throw judgment and quality out the window, we're in trouble. Democracy is fine for YouTube, but it's generally a recipe for disaster when it comes to art. While all words may be created equal, the way in which they're assembled isn't; it's impossible to suspend judgment, and folly to dismiss quality. Mimesis and replication don't eradicate authorship; rather, they simply place new demands on authors, who must take these new conditions into account as part of the landscape when conceiving of a work of art: If you don't want it copied, don't put it online.

Careers and canons won’t be established in traditional ways. I’m not so sure that we’ll still have careers in the same way we used to. Literary works might function the same way that memes do today on the Web, spreading for a short period, often unsigned and unauthored, only to be supplanted by the next ripple. While the author won’t die, we might begin to view authorship in a more conceptual way: Perhaps the best authors of the future will be ones who can write the best programs with which to manipulate, parse, and distribute language-based practices. Even if, as Christian Bök claims, poetry in the future will be written by machines for other machines to read, there will be, for the foreseeable future, someone behind the curtain inventing those drones, so that even if literature is reducible to mere code—an intriguing idea—the smartest minds behind the machines will be considered our greatest authors.

In 1959 the poet and artist Brion Gysin claimed that writing was 50 years behind painting. He might still be right: In the art world, since Impressionism, the avant-garde has been the mainstream. Innovation and risk taking have been consistently rewarded. But, in spite of the successes of modernism, literature has remained on two parallel tracks, the mainstream and the avant-garde, with the two rarely intersecting. Now the conditions of digital culture have unexpectedly forced a collision, scrambling the once-sure footing of both camps. Suddenly we all find ourselves in the same boat, grappling with new questions concerning authorship, originality, and the way meaning is forged.

Kenneth Goldsmith teaches writing at the University of Pennsylvania, where he is a senior editor of PennSound, an online poetry archive. This article is an excerpt from his book Uncreative Writing: Managing Language in the Digital Age, just published by Columbia University Press.