November 6, 1996



By ASHLEY DUNN

 

Writing That Defined Computing

Before the device, there was the word.

It was the word that allowed humans to store ephemeral knowledge and emotion, and pass that on to future generations. The thoughts of Plato are as alive today as Jonathan Livingston Seagull and Hiro Protagonist.

But in the new technological world, the word has lost much of its power to devices that have exerted an enormous influence on the way we perceive ourselves and our world. We are the generations of atomic bomb drills, Barney in 100 languages and live video feeds from the blackness of space. In many ways, the modern device has become part of our senses, from digital watches that parse the world into seconds, when primitive humans counted moons, to televisions that provide background noise to shield out the maddening silence of a room.

The present- and future-obsessed technological world has little use for old words. Nowhere is this truer than in the field of computing, where old papers and essays occupy a position of importance nestled somewhere between the 8-inch floppy disk and Icelandic folk dancing.

It may sound a bit like an oxymoron, but there actually is a literature of computing. Its core is made up of tens of thousands of research papers, the traditional method of spreading information and claiming credit. They are joined by a group of futurist essays and science fiction stories that outline a budding culture of builders.

The pieces are, in general, not blazing examples of writing. In most cases the research papers are terse, dense and so utterly lacking in style that they can make even DOS batch files seem poetic in comparison.

But within these documents are ideas that have shaped the second most powerful device of the modern age -- the one that doesn't incinerate cities or leave you glowing in the dark. The documents trace an evolving relationship between humans and technology.

There are hundreds of great pieces. I have chosen four of the most accessible to start. The list is necessarily selective because some papers -- like the 1943 article on neurons by Warren McCulloch and Walter Pitts and Alan Turing's piece on computable numbers -- are just too complicated for anyone except the truly obsessed to enjoy. Others, like John von Neumann's 1946 report, "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument," which laid out the architecture of modern digital computers, are too long to reproduce.

While these four are not necessarily the most earth-shattering in the body of computer literature, they do mark important points of development that are often forgotten in the blind race of technology toward the bigger, the faster and the more dominant in market share.

Charles Babbage (1791-1871)
Of the Analytical Engine
"Passages from the Life of a Philosopher," Chapter 8

(Written 1864)



Charles Babbage dedicated much of his life to the construction of two devices -- the Difference and Analytical Engines -- neither of which was built in his lifetime.

The Difference Engine was designed to automatically calculate tables of data. Babbage eventually abandoned the machine for a variety of reasons and turned his attention to the more advanced Analytical Engine.

He came upon the idea for his new machine after arranging the gears and axles of the Difference Engine in a circular, instead of linear, fashion. In this way, the Analytical Engine could take the results of one series of calculations and feed them back to itself. It was described as the machine biting its own tail.

This may seem like a minor change, but it marked the first leap from mere calculators to computers. The Analytical Engine essentially had all the components of a modern computer. It could be programmed using punch cards, had a rudimentary memory and could execute conditional procedures.

For Babbage, the device was mainly just a machine -- a cleaner, more intricate version of the belching factories of the Industrial Revolution. He called the processing section of the engine a "mill" and the memory "the store." In his mind it was a factory that produced numbers instead of rolls of cotton.

But in the same way that industrialists of the time marveled at the enormous production of the new mills, Babbage, too, exhibited a glimmering of awe at the capabilities of his device.

"The whole of arithmetic now appeared within the grasp of mechanism," he wrote in his 1864 memoirs, Passages from the Life of a Philosopher. "I pursued with enthusiasm the shadowy vision."

Isaac Asimov (1920-1992)
Runaround
Astounding Science Fiction
(Published March 1942)



In the introduction to The Complete Robot, Isaac Asimov wrote that there were basically two types of robot stories when he was growing up. The first he called robot-as-menace. "Such stories were a mixture of 'clank-clank' and 'aarghh' and 'There are some things man was not meant to know,' " Asimov wrote.

The other was the robot-as-pathos story. These were tales of machines that were actually more human than humans. They were lovable, sensitive and usually treated like dirt because they were so saintly.

Asimov eventually developed a vision of robots as neither menace nor teddy bear. He saw them as complex industrial products. Among the first of these industrial stories was "Runaround," most notable for introducing Asimov's Three Laws of Robotics, which formed the core of his new vision of such devices (and have become so ingrained in contemporary views of automatons that they are cited even in serious discussions of robotics engineering).

For Asimov, robots were servants of human needs. They were slavish and stupid, but also complex in ways that humans could not fully grasp. As "Runaround" showed, this was not always an optimum combination of attributes.

Vannevar Bush (1890-1974)
As We May Think
The Atlantic Monthly
(Published July 1945)



Bush published "As We May Think," one of the most forward-looking essays of any age about computing, one month before the atomic bomb was dropped on Hiroshima.

As the man who created and headed the Office of Scientific Research and Development, which oversaw all government science projects devoted to the war effort, Bush was a hawkish advocate of the use of science in warfare. His essay, written as the war drew to a close, asks the question, "What are the scientists to do next?"

Bush's answer was that scientists should turn their attention to devices that could help store, manipulate and retrieve information. He argued that "new and powerful instrumentalities" could help humans push to a new level of intellectual development.

Bush talks about numerous ideas that have now become common: hypertext, windows, web browsers, document scanning and voice recognition. His vision is one of an almost organic connection between humans and information machines. The final paragraphs of his essay, in fact, describe direct neural connections between machines and humans -- one of the few ideas in his essay not yet realized.

Alan Mathison Turing (1912-1954)
Computing Machinery and Intelligence
Mind, Vol. 59, pp. 433-460
(Published October 1950)



Alan Turing is probably best known outside the world of computers for his work on cracking the Nazi codes during World War II. In the world of computing, however, Turing has near god-like status for his theoretical contributions.

He is best known for two pieces. The first is his 1936 paper, "On Computable Numbers, with an Application to the Entscheidungsproblem," which demonstrated that any complex computation could be accomplished through a series of small binary operations. It is the basic theory that underlies all modern computers.

The other is titled "Computing Machinery and Intelligence." It is a classic that is still debated today with a fervor that makes the Mac versus PC holy war seem trivial.

Turing begins his paper with a simple question: "Can machines think?" He offers no answer, but he proposes the now famous Turing Test to determine whether a machine has achieved intelligence.

Since defining intelligence is virtually impossible, Turing proposed that any computer that could fool a human in conversation be deemed intelligent. It may seem like a silly proposal, but the more you think about the difficulty of defining intelligence, the more elegant Turing's test becomes.

In the years that have followed, an even more interesting debate has emerged over whether human intelligence should be the standard at all.

 

Copyright 1996 The New York Times Company

