Bloomsday Countdown

What, reduced to their simplest reciprocal form, were Bloom’s thoughts about Stephen’s thoughts about Bloom and about Stephen’s thoughts about Bloom’s thoughts about Stephen?

He thought that he thought that he was a jew whereas he knew that he knew that he knew that he was not.

[17.527-531]

A lonely impulse of delight

A lonely impulse of delight
Drove to this tumult in the clouds;
I balanced all, brought all to mind,
The years to come seemed waste of breath,
A waste of breath the years behind
In balance with this life, this death.

— William Butler Yeats (From “An Irish Airman Foresees His Death”)

Bloomsday Countdown

It must be a movement then, an actuality of the possible as possible. Aristotle’s phrase formed itself within the gabbled verses and floated out into the studious silence of the library of Saint Genevieve where he had read, sheltered from the sin of Paris, night by night. By his elbow a delicate Siamese conned a handbook of strategy. Fed and feeding brains about me: under glowlamps, impaled, with faintly beating feelers: and in my mind’s darkness a sloth of the underworld, reluctant, shy of brightness, shifting her dragon scaly folds. Thought is the thought of thought. Tranquil brightness. The soul is in a manner all that is: the soul is the form of forms. Tranquility sudden, vast, candescent: form of forms.

[2.67-76]

Rapture of the Nerds

The Coming Superbrain

By JOHN MARKOFF
Published: May 23, 2009
The New York Times

Mountain View, Calif. — It’s summertime and the Terminator is back. A sci-fi movie thrill ride, “Terminator Salvation” comes complete with a malevolent artificial intelligence dubbed Skynet, a military R.&D. project that gained self-awareness and concluded that humans were an irritant — perhaps a bit like athlete’s foot — to be dispatched forthwith.

The notion that a self-aware computing system would emerge spontaneously from the interconnections of billions of computers and computer networks goes back in science fiction at least as far as Arthur C. Clarke’s “Dial F for Frankenstein.” A prescient short story that appeared in 1964, it foretold an ever-more-interconnected telephone network that spontaneously acts like a newborn baby and leads to global chaos as it takes over financial, transportation and military systems.

Today, artificial intelligence, once the preserve of science fiction writers and eccentric computer prodigies, is back in fashion and getting serious attention from NASA and from Silicon Valley companies like Google as well as a new round of start-ups that are designing everything from next-generation search engines to machines that listen or that are capable of walking around in the world. A.I.’s new respectability is turning the spotlight back on the question of where the technology might be heading and, more ominously, perhaps, whether computer intelligence will surpass our own, and how quickly.

The concept of ultrasmart computers — machines with “greater than human intelligence” — was dubbed “The Singularity” in a 1993 paper by the computer scientist and science fiction writer Vernor Vinge. He argued that the acceleration of technological progress had led to “the edge of change comparable to the rise of human life on Earth.” This thesis has long struck a chord here in Silicon Valley.

Artificial intelligence is already used to automate and replace some human functions with computer-driven machines. These machines can see and hear, respond to questions, learn, draw inferences and solve problems. But for the Singulatarians, A.I. refers to machines that will be both self-aware and superhuman in their intelligence, and capable of designing better computers and robots faster than humans can today. Such a shift, they say, would lead to a vast acceleration in technological improvements of all kinds.

The idea is not just the province of science fiction authors; a generation of computer hackers, engineers and programmers has come to believe deeply in the idea of exponential technological change as explained by Gordon Moore, a co-founder of the chip maker Intel.

In 1965, Dr. Moore first described the repeated doubling of the number of transistors on silicon chips with each new technology generation, which led to an acceleration in the power of computing. Since then “Moore’s Law” — which is not a law of physics, but rather a description of the rate of industrial change — has come to personify an industry that lives on Internet time, where the Next Big Thing is always just around the corner.

Several years ago the artificial-intelligence pioneer Raymond Kurzweil took the idea one step further in his 2005 book, “The Singularity Is Near: When Humans Transcend Biology.” He sought to expand Moore’s Law to encompass more than just processing power and to simultaneously predict with great precision the arrival of post-human evolution, which he said would occur in 2045.

In Dr. Kurzweil’s telling, rapidly increasing computing power in concert with cyborg humans would then reach a point when machine intelligence not only surpassed human intelligence but took over the process of technological invention, with unpredictable consequences.

Profiled in the documentary “Transcendent Man,” which had its premiere last month at the TriBeCa Film Festival, and with his own Singularity movie due later this year, Dr. Kurzweil has become a one-man marketing machine for the concept of post-humanism. He is the co-founder of Singularity University, a school supported by Google that will open in June with a grand goal — to “assemble, educate and inspire a cadre of leaders who strive to understand and facilitate the development of exponentially advancing technologies and apply, focus and guide these tools to address humanity’s grand challenges.”

Not content with the development of superhuman machines, Dr. Kurzweil envisions “uploading,” or the idea that the contents of our brain and thought processes can somehow be translated into a computing environment, making a form of immortality possible — within his lifetime.

That has led to no shortage of raised eyebrows among hard-nosed technologists in the engineering culture here, some of whom describe the Kurzweilian romance with supermachines as a new form of religion.

The science fiction author Ken MacLeod described the idea of the singularity as “the Rapture of the nerds.” Kevin Kelly, an editor at Wired magazine, notes, “People who predict a very utopian future always predict that it is going to happen before they die.”

However, Mr. Kelly himself has not refrained from speculating on where communications and computing technology is heading. He is at work on his own book, “The Technium,” forecasting the emergence of a global brain — the idea that the planet’s interconnected computers might someday act in a coordinated fashion and perhaps exhibit intelligence. He just isn’t certain about how soon an intelligent global brain will arrive.

Others who have observed the increasing power of computing technology are even less sanguine about the future outcome. The computer designer and venture capitalist William Joy, for example, wrote a pessimistic essay in Wired in 2000 that argued that humans are more likely to destroy themselves with their technology than create a utopia assisted by superintelligent machines.

Mr. Joy, a co-founder of Sun Microsystems, still believes that. “I wasn’t saying we would be supplanted by something,” he said. “I think a catastrophe is more likely.”

Moreover, there is a hot debate here over whether such machines might be the “machines of loving grace” of the Richard Brautigan poem, or something far darker, of the “Terminator” ilk.

“I see the debate over whether we should build these artificial intellects as becoming the dominant political question of the century,” said Hugo de Garis, an Australian artificial-intelligence researcher, who has written a book, “The Artilect War,” that argues that the debate is likely to end in global war.

Concerned about the same potential outcome, the A.I. researcher Eliezer S. Yudkowsky, an employee of the Singularity Institute, has proposed the idea of “friendly artificial intelligence,” an engineering discipline that would seek to ensure that future machines would remain our servants or equals rather than our masters.

Nevertheless, this generation of humans, at least, is perhaps unlikely to need to rush to the barricades. The artificial-intelligence industry has advanced in fits and starts over the past half-century, since the term “artificial intelligence” was coined by the Stanford University computer scientist John McCarthy in 1956. In 1964, when Mr. McCarthy established the Stanford Artificial Intelligence Laboratory, the researchers informed their Pentagon backers that the construction of an artificially intelligent machine would take about a decade. Two decades later, in 1984, that original optimism hit a rough patch, leading to the collapse of a crop of A.I. start-up companies in Silicon Valley, a time known as “the A.I. winter.”

Such reversals have led the veteran Silicon Valley technology forecaster Paul Saffo to proclaim: “Never mistake a clear view for a short distance.”

Indeed, despite this high-technology heartland’s deeply held consensus about exponential progress, the worst fate of all for the Valley’s digerati would be to be the generation before the generation that lives to see the singularity.

“Kurzweil will probably die, along with the rest of us not too long before the ‘great dawn,’ ” said Gary Bradski, a Silicon Valley roboticist. “Life’s not fair.”

[A version of this article appeared in print on May 24, 2009, on page WK1 (Week in Review, page 1) of the New York edition (of the New York Times).]

death lies near at hand

A tiny blade will sever the sutures of the neck, and when that joint, which binds together head and neck, is cut, the body’s mighty mass crumples in a heap. No deep retreat conceals the soul, you need no knife at all to root it out, no deeply driven wound to find the vital parts; death lies near at hand. . . . Whether the throat is strangled by a knot, or water stops the breathing, or the hard ground crushes in the skull of one falling headlong to its surface, or flame inhaled cuts off the course of respiration — be it what it may, the end is swift.

— Seneca

judgment of generations

The people who are least capable of judging the worth of individuals are also the most inclined to adopt fashion as a principle by which to classify them; they have not exhausted, or even grazed the surface of, the talented men of one generation, when suddenly they are obliged to condemn them all en bloc, for here is a new generation with a new label which will be no better understood than its predecessor.

[Proust, Time Regained]

Bloomsday Countdown

—History, Stephen said, is a nightmare from which I am trying to awake.

From the playfield the boys raised a shout. A whirring whistle: goal.

What if that nightmare gave you a back kick?

—The ways of the Creator are not our ways, Mr. Deasy said. All human history moves towards one great goal, the manifestation of God.

Stephen jerked his thumb towards the window, saying:

—That is God.

Hooray! Ay! Whrrwhee!

—What? Mr. Deasy asked.

—A shout in the street, Stephen answered, shrugging his shoulders.

[2.377-386]

Death-Blood

I have a violence in me that is hot as death-blood. I can kill myself or — I know it now — even kill another. I could kill a woman, or wound a man. I think I could. I gritted to control my hands, but had a flash of bloody stars in my head as I stared that sassy girl down, and a blood-longing to [rush] at her and tear her to bloody beating bits.

— Sylvia Plath

every morning war is declared afresh

“Be honest, my friend, you yourself once propounded a theory to me about things existing only in virtue of a creation which is perpetually renewed. The creation of the world did not take place once and for all, you said, it is, of necessity, taking place every day. Well, if you are sincere, you cannot except war from this theory. . . . [T]he truth is that every morning war is declared afresh. And the men who wish to continue it are as guilty as the men who began it, more guilty perhaps, for the latter perhaps did not foresee all its horrors.”

[Proust, Time Regained]

Bloomsday Countdown

As we, our mother Dana, weave and unweave our bodies, Stephen said, from day to day, their molecules shuttled to and fro, so does the artist weave and unweave his image. And as the mole on my right breast is where it was when I was born, though all my body has been woven of new stuff time after time, so through the ghost of the unquiet father the image of the unliving son looks forth. In the intense instant of imagination, when the mind, Shelley says, is a fading coal, that which I was is that which I am and that which in possibility I may come to be. So in the future, the sister of the past, I may see myself as I sit here now but by reflection from that which then I shall be.

[9.376-385]

the invisible line of this falling bomb

For the novel reality of a danger is perceived only through the medium of that new thing, not assimilable to anything that we already know, to which we give the name “an impression” and which is often, as in the present case, epitomised in a line, a line which defines an intention and possesses the latent potentiality of the action which has given it its particular form, like the invisible line of this falling bomb . . . .

[Proust, Time Regained]

Bloomsday Countdown

After he woke me last night same dream or was it? Wait. Open hallway. Street of harlots. Remember. Haroun al Raschid. I am almosting it. That man led me, spoke. I was not afraid. The melon he had he held against my face. Smiled: creamfruit smell. That was the rule, said. In. Come. Red carpet spread. You will see who.

[3.365-369]

Procrastination

No doubt, my idleness having given me the habit, when it was a question of my work, of putting it off from one day to another, I imagined that death too might be postponed in the same fashion.

[Proust, Time Regained, 163]

Words don’t mean what they mean

Words let us say the things we want to say and also the things we would be better off not having said. They let us know the things we need to know, and also the things we wish we didn’t. Language is a window into human nature, but it is also a fistula, an open wound through which we’re exposed to an infectious world.

[From Steven Pinker’s The Stuff of Thought]