In the world of academe, Web clicking would be too easy.
By MARK BAUERLEIN
Wall Street Journal
JANUARY 28, 2010, 7:38 P.M. ET, wsj.com
Is there a college graduate alive who doesn’t react to the words “footnotes” and “bibliography” with at least a small shiver of lingering dread? But while the citations, the style manuals, the numbering, the numbing tedium are just unpleasant memories to folks who now labor far from the groves of academe, they are a continuing affliction for scholars in the humanities. Thanks to the Internet, it doesn’t have to be that way.
Accurate sourcing guarantees that researchers make rigorous inquiries, handle evidence responsibly, and give credit where it’s due. In my first year of graduate school back in the early 1980s, a professor I worked for gave me a stern lesson in research. He was wrapping up an essay on Romantic poetry that drew upon recent critical studies, each one of them citing, in turn, previous works of scholarship and primary texts. We went up to the library one afternoon to make sure his immediate sources had reproduced earlier sources accurately. That meant collecting old books and journals one by one and finding relevant pages and passages. “Can’t you just assume they did it right?” I asked. He looked at me, raised a finger, and replied: “Never trust a footnote—always check the original.” After an hour we found one slight misquotation. He corrected it in his text and proved the point: Do your homework and get it right.
It can be so much easier now. If you’re in the middle of a 400-page tome and want to verify a footnote citing a 1905 essay by John Dewey, you don’t have to leave your desk. Almost all scholarly journals are now online. A click to the library Web site, then to the electronic version of the Journal of Philosophy, Psychology, and Scientific Methods gives you Dewey’s piece “The Realism of Pragmatism” in handy PDF format. Thousands of old books important to scholars are available, too. If you need to check a quote from William James’s volume “Pragmatism,” there it sits at Google Books, and a word search carries you right to the relevant passage. As more e-versions become available, only books still in copyright and unique archival documents require a day trip to the library or a trip across the country to a special collection. All those hours pulling old periodicals off the shelves, recalling old books and poring through hundreds of pages to find one sentence shrink to minutes of online delving.
The process works most efficiently, of course, when scholarly writing is online and the scholar embeds a link into the text so that readers can click directly to the original source. But even in hardcover books and paper-and-ink periodicals, a Web page reference inserted at the end of a sentence allows readers to type it into a laptop and have both works at hand for comparison. One of the prime reasons for documentation, the evaluation of a scholar’s evidence, happens more smoothly—and collegially. Researchers who don’t have access to print versions of the works cited in a book they are reading can download them and join professional conversations well-equipped and up to speed. Graduate students who are pressed for time and money but have to complete a reference-heavy dissertation move swiftly toward that blessed ascension to Ph.D. status.
Yet digital modes of citation have not been standardized as scholarly practice in the humanities. The 2008 MLA Handbook still spends 97 pages detailing nondigital formats and protocols of documentation. This means that scholars must devote long hours to an apparatus that in a digital age is inefficient and increasingly unnecessary. As links have multiplied and become standard practice in blogs, online magazines and e-commerce, the academic model appears ever more just that—academic. Citations are supposed to enhance scholarly communication, but as you read one monograph after another, wading through notes and works cited until the eyes glaze over, you sense a different objective at work. Authors clog the pages with notes not to facilitate peer review, but to demonstrate their place at the table, their ability to speak an obscure language that few bother to translate for the world at large.
Canny novices discover the game early in graduate school. Load up your papers and chapters with references, position your work against a dozen figures in the subfield, give a mini-lineage of precursors for every major point, and sprinkle allusions to Big Thinkers such as Michel Foucault (and, of course, your local mentors). Readers won’t examine the notes closely. They’ll simply tally the numbers and names, and recognize you as a legit practitioner.
If we shrink the apparatus with digital links, we’ll lose that brand of professional gamesmanship. Yes, digital citation has its problems. The Web contains lots of unreliable sources and the environment fluctuates over time. The same Web page can say one thing one day, another thing on another day, as in the case of a Wikipedia entry. But the utility outweighs the downside. One of the worst trends in the humanities in recent decades has been the insularity of scholarly expression. The conversation is so mannered and self-involved, so insider-like, that it has no readership beyond a few dozen colleagues in the same sub-sub-field. How many people could stomach a sentence like this one, a runner-up in the celebrated Bad Writing Contest a few years back: “Punctuated by what became ubiquitous sound bites—Tonya dashing after the tow truck, Nancy sailing the ice with one leg reaching for heaven—this melodrama parsed the transgressive hybridity of un-narrativized representative bodies back into recognizable heterovisual codes”?
If digital technology streamlines documentation, we might get fewer hyperacademic treatises weighed down with notes, numbers, allusions and other machinery and find more books and articles built on interesting ideas presented in accessible paragraphs. The MLA and other professional organizations could do a great service by authorizing and standardizing the practice.
Mr. Bauerlein is the author of “The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future.”