Analog-digital dichotomy

The discussion of the analog-digital dichotomy has a surprisingly long history, going back to Immanuel Kant, Ludwig Wittgenstein, Samuel Beckett, Martin Heidegger, Soren Kierkegaard, and Gregory Bateson. Can the analog-digital dichotomy shed light on the limits of Artificial Intelligence in its current (binary) iteration, as implied by Freeman Dyson? Does the distinction between analog and digital (binary) have a cultural dimension, as implied by Carol Wilder? I am preparing a database on the issue. If you have relevant information, articles, or thoughts, please contribute them for inclusion. Below is a first summary of the current state of the discussion.

* * *

The first computers, developed in the 1930s, were analog. Starting in the 1950s, most computer design moved to digital, and analog computing was largely forgotten. But that may be changing: computer scientists are realizing that analog systems have qualities digital systems do not.


"A transistor, conceived of in digital terms, has two states: on and off, which can represent the 1s and 0s of binary arithmetic," Larry Hardesty wrote last year in TechXplore. "But in analog terms, the transistor has an infinite number of states, which could, in principle, represent an infinite range of mathematical values. Digital computing, for all its advantages, leaves most of transistors' informational capacity on the table."
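The contrast Hardesty draws can be sketched in a few lines of code. This is only an illustration, not a circuit model: the supply range and logic threshold below are assumed values, and the two functions stand in for digital and analog readings of the same continuous voltage.

```python
# Illustrative sketch (assumed values): a digital reading collapses a
# continuous voltage to one of two states, while an analog reading keeps
# the full value within the operating range.

SUPPLY_VOLTAGE = 1.0     # assumed operating range, 0..1 V
THRESHOLD = 0.5          # assumed logic threshold

def digital_read(voltage):
    """Collapse a continuous voltage to a single bit: 0 or 1."""
    return 1 if voltage >= THRESHOLD else 0

def analog_read(voltage):
    """Keep the voltage itself as the value, clipped to the supply range."""
    return max(0.0, min(voltage, SUPPLY_VOLTAGE))

v = 0.73
print(digital_read(v))   # one of exactly two states: 1
print(analog_read(v))    # one point in a continuum: 0.73
```

Read digitally, the voltage 0.73 carries one bit; read as an analog quantity, it is a value drawn from a continuum, which is the "informational capacity" the quote says digital computing leaves on the table.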


Missing information is a factor in "digital" music. An analog wave is sampled tens of thousands of times a second (44,100 times for CD-quality audio), and each sample is given a binary number. A digital sound system converts the binary code back into an analog wave. The high sampling rate renders the information "missing" from the analog-digital conversion nearly undetectable to the human ear. But this missing information may become an important factor in Artificial Intelligence as it advances to ever more complex abilities.
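The round trip described above can be sketched directly. The sketch below follows the CD-audio convention (44,100 samples per second, 16-bit samples); the tone frequency and the sampling instant are arbitrary illustrative choices. Each sample is rounded to the nearest of 65,536 binary codes, and the rounding is exactly the "missing" information.

```python
import math

# Sketch of analog-to-digital audio conversion: sample a continuous wave,
# round each sample to a binary code, then convert back. Constants follow
# the CD-audio convention; the 440 Hz tone is an illustrative choice.

SAMPLE_RATE = 44_100          # samples per second (CD audio)
BIT_DEPTH = 16                # bits per sample (CD audio)
LEVELS = 2 ** BIT_DEPTH       # 65,536 distinct binary codes

def analog_wave(t, freq=440.0):
    """A 'continuous' 440 Hz tone, evaluated at time t (seconds)."""
    return math.sin(2 * math.pi * freq * t)

def quantize(x):
    """Map a value in [-1, 1] to the nearest of LEVELS binary codes."""
    return round((x + 1) / 2 * (LEVELS - 1))

def dequantize(code):
    """Convert a binary code back to an approximate analog value."""
    return code / (LEVELS - 1) * 2 - 1

# The round trip loses a little information at every sample:
t = 0.000123
original = analog_wave(t)
restored = dequantize(quantize(original))
error = abs(original - restored)
print(error)   # tiny (at most half a quantization step), rarely zero
```

With 16 bits the error per sample is bounded by half a quantization step, which is why it is nearly inaudible; but summed over tens of thousands of samples a second, the discarded detail is real.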

Columbia University researchers are working on the merger of analog and digital computing on a single chip. An article on the CU website points out that "the discrete step-by-step methodology of digital computing was never a good fit for dynamic or continuous problems—modeling plasmas or running neural networks—or for robots and other systems that react in real time to real-world inputs. A better approach may be analog computing, which directly solves the ordinary differential equations at the heart of continuous problems."
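The "discrete step-by-step methodology" the Columbia article contrasts with analog computation can be made concrete with a minimal sketch. The equation dy/dt = -y and the step sizes below are illustrative assumptions, not anything from the Columbia work: a digital machine must chop the continuous problem into finite steps (forward Euler here), while an analog machine would integrate the differential equation directly in hardware.

```python
import math

# Minimal sketch of digital step-by-step ODE solving (forward Euler).
# Equation and step sizes are illustrative choices, not from the article.

def euler(f, y0, t_end, dt):
    """Integrate dy/dt = f(t, y) from t=0 to t_end in discrete steps of dt."""
    t, y = 0.0, y0
    while t < t_end:
        y += dt * f(t, y)   # one discrete step
        t += dt
    return y

# dy/dt = -y with y(0) = 1 has the exact continuous solution y(t) = exp(-t).
approx = euler(lambda t, y: -y, 1.0, 1.0, dt=0.001)
exact = math.exp(-1.0)
print(approx, exact)   # close, but only as close as the step size allows
```

Shrinking dt buys accuracy at the cost of more steps, and real-time systems may not have time for enough of them; an analog solver sidesteps the trade-off by letting a physical quantity evolve continuously according to the same equation.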

Freeman Dyson, professor of physics at the Institute for Advanced Study in Princeton, made a similar point in "Is Life Analog or Digital?" "Nobody yet knows how the human memory works," says Dyson. "It seems likely that memories are recorded in variations of the strengths of synapses connecting the billions of neurons in the brain with one another, but we do not know how the strengths of synapses are varied. It could well turn out that the processing of information in our brains is partly digital and partly analog. If we are partly analog, the downloading of a human consciousness into a digital computer may involve a certain loss of our finer feelings and qualities."


Carol Wilder, professor of Media Studies at The New School in New York, discussed the aesthetic dimension of analog and digital in her paper Being Analog. She wrote: "It has become apparent that analog/digital carry both precise meanings at the level of physiological, chemical, and electrical processes and broadly metaphorical meanings when applied to human communication and behavior. In order to explore the limits of a postmodern multiplicity of meanings, I engaged friends and colleagues in an exercise to expand upon the standard examples, the result of which is reported here as the dozens of pairs listed in the following table. (Which you can surely add to.) What struck me was how easy it was for people entirely unfamiliar with the analog/digital concepts to pick up on the gist of this list after being given only a few examples. And, yes, while the list itself is 'binary,' I was taken with how many stories of the digital age these pairs tell."

* * *