Ken Steiglitz discusses the transition from the analog world to the digital world, from a world in which quantities had a continuous range of values to a world in which quantities have a discrete range of values, ultimately only two values (zero and one).
One intriguing aspect of this techno-social revolution is that it mirrors (50 years later) what happened in physics: Newton's world was a world of continuous values, but Planck and Einstein discovered that energy and other quantities are exchanged in discrete amounts. Einstein in particular showed that light is made of discrete particles, just as a digital signal is made of a discrete series of zeroes and ones.
Even before them, J.J. Thomson had shown that matter contains discrete charged particles, the electrons.
The advantage of digital signals is that they get rid of the fundamental problem of analog signals: noise. When a signal is transmitted over a cable or the ether, it is always affected by noise; in fact any physical system, whether very small or very large, is affected by noise (an interesting detour in the book). Signals that carry only zeroes and ones largely escape the problem: the zeroes and the ones arrive corrupted by noise, but if a value can only be a zero or a one, the receiver merely has to decide which of the two it is closer to. The book introduces the reader to vacuum tubes and transistors, and to Moore's law, and points out that quantum mechanics poses a limit to how small transistors can become.

Joseph Fourier is the hero of many books that deal with signal processing (see especially Alvy Ray Smith's "A Biography of the Pixel", 2021). In his 1822 treatise he showed that any signal can be expressed as a sum of sinusoids of different frequencies, and this applies to any kind of signal, including audio and video signals. A century later, in 1928, Harry Nyquist, a Swedish-born researcher at Bell Labs, published a theorem that revealed a remarkable property of signals: one can capture all the information in a signal by "sampling" it at a rate at least twice the highest frequency present in the signal (an idea that Edmund Whittaker had already published in 1915). Once sampled, the signal is no longer analog but digital, and, again, nothing of the analog original has been lost. This was the foundation of digital signal processing: it explained how to convert analog signals into digital signals. Claude Shannon completed Nyquist's proof in 1949, showing that sampling at twice the highest frequency is not only necessary but also sufficient. Meanwhile, in 1945, Shannon had circulated (only internally) his study of information, which defined information as the removal of uncertainty; his "noisy channel coding theorem" showed that suitable coding can protect information transmitted over noisy channels.

Steiglitz then guides us through the early "computers": Jacquard's loom, Charles Babbage's Analytical Engine, Ada Lovelace's first program, Alan Turing's universal machine (and his key insight that the machine needs to keep a "state"), the Church-Turing thesis that the Turing machine can carry out any computation a digital computer can perform, and so on; basically an introduction to the theory of digital computers.

In 1971 Stephen Cook published a paper that introduced a question which remains unsolved to this day: does P equal NP? Some problems can never be solved by any algorithm; some can be solved by an algorithm only in exponential time (they are "intractable"); some problems (NP-problems) can be verified, though not necessarily solved, in polynomial time (i.e. a proposed solution can be checked "quickly"); and some problems (P-problems) can be solved in polynomial time (i.e. they can be solved "quickly"). P is a subset of NP. (Confusingly, the "N" in NP stands not for "non-polynomial" but for "non-deterministic".) The "NP-complete" problems are a family of NP problems such that, if any one of them had a polynomial-time solution, then all the others would have one too (they can be "reduced" to one another) and they would all be P-problems. Thousands of NP-complete problems are known; the traveling salesman problem, for example, can be proven to be NP-complete. Solve just one NP-complete problem in polynomial time and you have solved them all.
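To make the "verify quickly versus solve quickly" distinction concrete, here is a minimal Python sketch (not from Steiglitz's book; the five cities and their distances are made up) of the traveling salesman problem: checking that a proposed tour stays within a given length budget takes time proportional to the number of cities, whereas the naive way to find the best tour tries every possible ordering of the cities.

from itertools import permutations

# Toy distance matrix for 5 hypothetical cities (symmetric, made-up numbers).
DIST = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]

def tour_length(tour):
    # Total length of a closed tour that visits every city exactly once.
    return sum(DIST[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def verify(tour, budget):
    # Polynomial-time check: is this a valid tour whose length stays within the budget?
    return sorted(tour) == list(range(len(DIST))) and tour_length(tour) <= budget

def brute_force():
    # Exponential-time search: try all (n-1)! tours and keep the shortest one.
    other_cities = range(1, len(DIST))        # fix city 0 as the starting point
    return min(((0,) + p for p in permutations(other_cities)), key=tour_length)

best = brute_force()
print(best, tour_length(best))                # the optimal tour and its length
print(verify(best, budget=30))                # True: checking a proposed tour is fast

With five cities the brute-force search examines only 24 tours; with thirty cities it would have to examine more than 10^30 of them, which is why nobody can afford the search, while the verification step remains essentially instantaneous.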
But nobody has yet solved an NP-complete problem in polynomial time, and most scientists think that NP does not equal P. Steiglitz mentions that many analog (not digital) machines have been proposed for solving NP-complete problems in polynomial time, but so far they have all failed, so analog machines do not appear to be more powerful than digital ones. Here, however, Steiglitz introduces quantum computing, an idea that dates back to a 1982 paper by Richard Feynman. Feynman's original idea couldn't possibly work because he assumed that it was possible to build a quantum computer that was only "locally" connected, whereas Bell's theorem of 1964 proved that quantum mechanics is nonlocal (the property that Einstein refused to believe). In 1992 David Deutsch, a physicist at Oxford University, described a class of problems that can be solved more efficiently by quantum computation than by classical methods. In 1994 Peter Shor found one such problem that is particularly relevant: a quantum algorithm that finds the prime factors of an integer in polynomial time. The encryption algorithm RSA is based on the assumption that factoring large integers is computationally intractable; here is something that classical computers apparently cannot do efficiently but that a quantum computer could. So in the end quantum computers can do some things that classical computers (Turing machines) apparently cannot do efficiently, but that's very little. NP-complete problems seem to be intractable for any kind of computer, and Steiglitz argues that there must be some fundamental physical principle at work here that we haven't discovered yet.

The book ends with an interesting discussion of light as a medium of information transmission. Steiglitz explains why light (photons) traveling in glass fibers is better than electricity (electrons) traveling in copper wires at transmitting signals over long distances: optical fibers offer much more bandwidth, and photons do not interact strongly with other matter, which makes them immune to most noise. Electrons, on the other hand, are better suited for logic operations and memory precisely because they interact strongly with other matter.

Oddly enough, the book never mentions the telegraph, the first form of digital communication. And of course the alphabet itself, invented by the Phoenicians more than three thousand years ago, is an example of discrete signaling: whenever you write something down, you turn continuous speech into a discrete signal made of characters.
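As a footnote to the factoring point above, a toy Python sketch (with deliberately tiny, textbook-sized primes; real RSA keys use numbers hundreds of digits long) shows why RSA's security rests entirely on the difficulty of factoring: anyone who can factor the public modulus recovers the private key.

# Toy RSA with tiny primes; requires Python 3.8+ for pow(e, -1, phi).
p, q = 61, 53
n = p * q                        # public modulus (3233)
e = 17                           # public exponent
phi = (p - 1) * (q - 1)          # computable only if you know the factors p and q
d = pow(e, -1, phi)              # private exponent: modular inverse of e mod phi

message = 42
ciphertext = pow(message, e, n)  # encryption uses only the public key (n, e)
print(pow(ciphertext, d, n))     # 42: decryption requires the private exponent d

# An attacker who can factor n recovers p and q, hence phi, hence d:
factor = next(k for k in range(2, n) if n % k == 0)   # trial division: fine for 3233,
other = n // factor                                    # hopeless for a 600-digit modulus
d_cracked = pow(e, -1, (factor - 1) * (other - 1))
print(pow(ciphertext, d_cracked, n))                   # 42 again: factoring breaks RSA

Shor's algorithm would perform exactly that factoring step in polynomial time, which is why a large-scale quantum computer would force a migration to other encryption schemes.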