Intelligence is not Artificial

Why the Singularity is not Coming any Time Soon And Other Meditations on the Post-Human Condition and the Future of Intelligence

by piero scaruffi

(These are excerpts from my book "Intelligence is not Artificial")

Analog vs Digital

Most machines in the history of human civilization were and are analog machines, from the waterwheel to your car's engine. A lot of the marvel about computers comes from the fact that they are digital devices: once you digitize texts, sounds, images, films and so forth, the digital machine can perform, at incredible speed, operations that used to take armies of human workers or specialists armed with expensive specialized machines. Basically, digitizing something means reducing it to numbers, and therefore the mind-boggling speed of computers in performing calculations gets automatically transferred to other fields, such as managing texts, audio and video. The "editing" feature is, in fact, one of the great revolutions that came with the digital world. Previously, it was difficult and time-consuming to edit anything (text, audio, photos, video). Filing, editing and transmitting are operations that have been dramatically revolutionized by technological progress in digital machines and by the parallel process of digitizing everything, one process fueling the other.

Now that television broadcasts, rented movies, songs and books are produced and distributed in digital formats, i wonder if people of the future will even know what "analog" means. Analog is any physical property whose measurable values vary in a continuous range. Everything in nature is analog: the weight of boulders, the distance between cities, the color of cherries, etc. (At microscopic levels nature is not so analog, hence Quantum Theory, but that's another story). Digital is a physical property whose measurable values are limited to just a few. The digital devices of today can typically handle only two values: zero and one. Actually, i don't know of any digital device that is not binary. Hence, de facto, in our age "digital" and "binary" mean the same thing. Numbers other than zero and one can be represented by sequences of zeroes and ones (e.g. a computer internally turns 5 into 101). Texts, sounds and images are represented according to specific codes (such as ASCII, MP3 and MP4) that turn them into strings of zeroes and ones.
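To make the reduction to zeroes and ones concrete, here is a minimal Python sketch (my own illustration; the helper names are arbitrary) that turns a number and a short text into bit strings, using the same binary representation and ASCII code mentioned above.

    # Minimal sketch: reducing a number and a text to strings of zeroes and ones.

    def number_to_bits(n: int) -> str:
        """Binary representation of a non-negative integer, e.g. 5 -> '101'."""
        return format(n, "b")

    def text_to_bits(text: str) -> str:
        """Encode text as ASCII bytes, then spell out each byte as 8 bits."""
        return " ".join(format(byte, "08b") for byte in text.encode("ascii"))

    print(number_to_bits(5))    # prints 101, as in the example above
    print(text_to_bits("Hi"))   # prints 01001000 01101001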

The easiest way to visualize the difference between analog and digital is to think of the centuries-old bell-tower clock (with its two hands crawling among the twelve Roman numerals) and the digital clock (which simply displays the time as hours and minutes).

When we turn a property from analog to digital we enable computers to deal with it. Therefore you can now edit, copy and email a song (with simple commands) because it has been reduced to a music file (to a string of zeroes and ones).

Audiophiles still argue whether digital "sounds" the same as analog. I personally think that it does (at today's bit rates) but the stubborn audiophile has a point: whenever we digitize an item, something is lost. The digital clock that displays "12:45" does not possess the information of how many seconds remain until 12:46. Yesterday's analog clock contained that information in the exact position of the minute hand. That piece of information may have been useless (and obtainable only by someone equipped with a magnifying glass and a pocket calculator) but nonetheless the device had it. The music file is not an exact replica of the song: when the musicians performed it, they were producing an analog object. Once that analog object is turned into a digital file, an infinite number of details have been lost. The human ear is limited and therefore won't notice (except the aforementioned stubborn audiophiles). We don't mind because our senses can only experience a limited range of audio and visual frequencies. And we don't mind because amazing features become available with digital files, for example the ability to improve the colors of a photograph so we can pretend that it was a beautiful vacation when in fact it rained all the time.
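The audiophile's complaint can be illustrated with a toy Python sketch (my own simplification, not how MP3 or any real audio codec works): sample a continuous sine wave at a finite rate and round each sample to one of a few levels, and whatever lies between two samples or between two levels is lost for good.

    # Toy illustration of what digitizing discards: sampling plus quantization.
    # A rough sketch, not a description of any real audio format.
    import math

    def digitize(duration_s=0.01, sample_rate=8000, bits=4, freq_hz=440.0):
        levels = 2 ** bits                      # a 4-bit converter has only 16 levels
        samples = []
        for i in range(int(duration_s * sample_rate)):
            t = i / sample_rate                 # we only look at these discrete instants
            analog = math.sin(2 * math.pi * freq_hz * t)        # continuous value in [-1, 1]
            quantized = round((analog + 1) / 2 * (levels - 1))  # snap it to the nearest level
            samples.append(quantized)
        return samples

    print(digitize()[:10])  # the original continuous waveform cannot be recovered exactly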

When machines carry out human activities, they are "digitizing" those activities; and they are digitizing the "mental" processes that lie behind those activities. In fact, machines can manage those human activities only after humans digitized (turned into computer files) everything that those human activities require, for example maps of the territory.

Using digital electronic computers to mimic the brain is particularly tempting because it was discovered that neurons work like on/off switches. They "fire" when the cumulative signal that they receive from other neurons exceeds a certain threshold value, otherwise they don't. Binary logic, formalized in 1854 by the British mathematician George Boole in his book "An Investigation of the Laws of Thought", seems to lie at the very foundation of human thinking. In fact, as early as 1943, Warren McCulloch, in cooperation with Walter Pitts, described mathematically an "artificial" neuron that can only be in one of two possible states. A population of artificial binary neurons can then be connected in a very intricate network to mimic the way the brain works. When signals are sent into the network, they spread to its neurons according to the simple rule that any neuron receiving enough positive signals from other neurons sends a signal to other neurons. It gets better: McCulloch and Pitts proved that such a network of binary neurons is fully equivalent to a Universal Turing Machine.
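To make the McCulloch-Pitts idea concrete, here is a minimal Python sketch (my own simplification of the 1943 model): a binary neuron that fires only when the weighted sum of its inputs reaches a threshold, which is already enough to implement the basic operations of Boole's logic.

    # Minimal sketch of a McCulloch-Pitts style binary neuron (a simplification of the 1943 model).

    def mp_neuron(inputs, weights, threshold):
        """Fire (return 1) if the weighted sum of binary inputs reaches the threshold."""
        total = sum(i * w for i, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    # Boolean operations emerge from the choice of weights and threshold:
    AND = lambda a, b: mp_neuron([a, b], [1, 1], threshold=2)
    OR  = lambda a, b: mp_neuron([a, b], [1, 1], threshold=1)
    NOT = lambda a:    mp_neuron([a],    [-1],   threshold=0)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
    # Networks of such neurons, suitably wired together (and given memory), are what
    # McCulloch and Pitts argued to be as powerful as a Universal Turing Machine.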

Upon opening a panel titled "The Design of Machines to Simulate the Behavior of the Human Brain" during the Institute of Radio Engineers' Convention, held in New York in 1955, McCulloch confidently stated that "we need not ask, theoretically, whether machines can be built to do what brains can do" because, in creating our own brain, Nature already showed us that it is possible. The issue is not "if", it is only "how". (Of course, the McCulloch-Pitts theorem still fails because of Goedel's theorem, but that's a detail).

There is, however, a catch: McCulloch's binary neurons integrate their input signals at discrete intervals of time, rather than continuously as our brain's neurons do. Every computer has a central clock that sets the pace for its logic, whereas the brain relies on asynchronous signaling because there is no synchronizing central clock. If you get into the details of how the brain works, you find more "analog" processes at work, including inside the neuron itself (which is not just an on/off switch).
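The contrast can be hinted at with another toy Python sketch (my own rough model, not a faithful simulation of a biological neuron): a leaky integrate-and-fire neuron whose membrane potential is a continuously varying quantity, which a digital computer can only approximate by stepping a clock in small discrete increments.

    # Toy leaky integrate-and-fire neuron: the membrane potential is an analog quantity
    # that a digital simulation can only approximate with discrete time steps.
    # A rough sketch, not a faithful model of biological neurons.

    def simulate(input_current, dt=0.001, leak=0.1, threshold=1.0, steps=5000):
        potential = 0.0
        spike_times = []
        for step in range(steps):
            # Euler approximation of a continuous process: the smaller dt, the closer
            # we get to the analog behavior, but we never reach it exactly.
            potential += dt * (input_current - leak * potential)
            if potential >= threshold:
                spike_times.append(step * dt)   # the neuron "fires"
                potential = 0.0                 # and resets
        return spike_times

    print(simulate(input_current=0.5))  # the times (in seconds) at which the neuron fired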

One could argue that the brain is regulated by the body's internal clocks (which regulate every function, from your heartbeat to your vision) and therefore the brain behaves like a digital machine; and that everything is made of discrete objects all the way down to quarks and leptons, hence nothing in nature is truly analog. Even if you want to be picky and invoke Quantum Theory, the fact remains that a brain uses a lot more than zeroes and ones, whereas a computer can only deal with zeroes and ones. As tempting as it is to see the brain as a machine based on binary logic, the difference between the human brain and any computer system (no matter how complex the latter becomes) is that a computer is way more "digital" than a brain. We know so little about the brain that it is difficult to estimate how many of its processes involve a lot more than on/off switching, but a safe guess is that there are several hundred. Despite the illusion created by the McCulloch-Pitts neuron, a computer is a binary machine and a brain is not.

There might be a reason why a brain operates at 10-100 Hz whereas today's common microprocessors need to operate at 2-3 Gigahertz (billions of Hz), tens to hundreds of millions of times faster, to do a lot less; and why human brains consume about 20 watts and can do a lot more than a supercomputer that consumes millions of watts. Biological brains need to be low-power machines or they would not survive. There are obviously principles at work in a brain that have eluded computer scientists.
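The comparison rests on simple arithmetic; the short sketch below just spells out the ratios (the brain's figures are the commonly cited rough estimates, and the supercomputer's power draw is an assumed round number).

    # Back-of-the-envelope ratios behind the comparison above (rough, commonly cited figures).
    brain_rate_hz = 100            # upper end of the roughly 10-100 Hz range
    cpu_rate_hz = 3e9              # a common 3 GHz microprocessor clock
    brain_power_w = 20             # rough estimate for a human brain
    supercomputer_power_w = 5e6    # an assumed figure for a supercomputer drawing megawatts

    print(cpu_rate_hz / brain_rate_hz)            # 30 million (300 million against the 10 Hz end)
    print(supercomputer_power_w / brain_power_w)  # 250,000 times more power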

Carver Mead's "neuromorphic" approach to machine intelligence is not feasible for the simple reason that we don't know how the brain works. Modeled on the Human Genome Project (which successfully decoded the human genome in 2003), the USA launched the "Brain Initiative" in April 2013 with the goal of mapping every neuron and every synapse in the brain.

There are also government-funded projects to build an electronic model of the brain: Europe's Human Brain Project and the USA's Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SYNAPSE), sponsored by the same agency, DARPA, that originally sponsored the Arpanet/Internet. Both Karlheinz Meier in Germany and Giacomo Indiveri in Switzerland are toying with analog machines, whose signaling from one node to the others better mimics the "action potentials" that trigger the work of neurons in the human brain and requires much less power than the signaling employed in digital computers. SYNAPSE (2008) spawned two projects in California, one run by Narayan Srinivasa at Hughes Research Laboratories (HRL) and the other one run by Dharmendra Modha at IBM's Almaden Labs in Silicon Valley. The latter announced in 2012 that a supercomputer was able to simulate 100 trillion synapses from a monkey brain, and in 2014 unveiled its "neuromorphic" chip TrueNorth (not built according to the traditional John von Neumann architecture) that can simulate 1 million neurons and 256 million synapses. This represented the first building block to push computer science beyond the von Neumann architecture that has ruled since the early days of electronic computation. Interestingly, this chip (consuming only 70 milliwatts of power) was also one of the most power-efficient chips in the history of computing... just like the human brain.
