Norbert Wiener:
CYBERNETICS (John Wiley, 1948)

This is the book that launched a formal study of "intelligent" machines. Wiener recognized the importance of feedback for any meaningful behavior in the environment: a system that has to act on the environment must be able to continuously compare the action it has performed with the action it intended, and then infer the next action from the difference. Feedback is crucial for homeostasis, which is crucial for survival.
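
A minimal sketch of that feedback idea in present-day code (the controller, gain and noise model are illustrative assumptions, not anything from Wiener's text): the system repeatedly compares the intended state with the state it actually reached and derives its next action from the difference.

    # Negative feedback: steer a noisy system toward a target by acting on the
    # difference between the intended state and the observed state.
    # All names and constants are illustrative.
    import random

    def feedback_loop(target, steps=50, gain=0.5):
        state = 0.0
        for _ in range(steps):
            error = target - state                    # intended minus performed
            action = gain * error                     # next action inferred from the difference
            state += action + random.gauss(0, 0.05)   # act on a slightly noisy environment
        return state

    print(feedback_loop(10.0))   # ends up close to 10 despite the noise
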
Wiener emphasized that communication in nature is never perfect: every message carries some involuntary "noise", and in order to understand the communication the original message must be restored. This leads to a statistical theory of the amount of information. A theory of information turns out to be the dual of a theory of entropy, another statistical concept: if information is a measure of order, entropy is a measure of disorder.
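
A toy illustration of restoring a message statistically (the repetition code, the error rate and the bit-level framing are assumptions made for this example, not Wiener's model): each bit is transmitted several times, noise flips some copies, and the receiver recovers the original by majority vote.

    # Toy noisy channel: each bit is repeated, noise flips some copies at random,
    # and the receiver restores the original message statistically (majority vote).
    import random

    def transmit(bits, repeat=5, flip_prob=0.2):
        return [[b if random.random() > flip_prob else 1 - b for _ in range(repeat)]
                for b in bits]

    def restore(received):
        return [1 if sum(copies) * 2 > len(copies) else 0 for copies in received]

    message = [1, 0, 1, 1, 0]
    print(restore(transmit(message)))   # almost always equal to the original message
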
Wiener understood the essential unity of communication, control and statistical mechanics, whether the system in question is artificial or biological. This unified field became "cybernetics".

The Hungarian physicist Leo Szilard, trying to solve the paradox of "Maxwell's demon" (a thought experiment in which measurements cause a decrease of entropy), calculated the amount of entropy generated as the demon stores information in its memory ("On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings", 1929), thereby establishing a connection between information and entropy: information increases when entropy decreases, and vice versa. The tendency of systems to drift from the low-probability state of organization and individuality to the high-probability state of chaos and sameness could be interpreted as a decline in information.
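
Szilard's result is usually summarized as a minimum entropy cost of k·ln 2 for each bit the demon records, where k is Boltzmann's constant; a back-of-the-envelope computation of that bound (the one-bit framing is a simplification of his argument):

    # Minimum entropy generated when the demon stores one bit of information
    # (Szilard's bound): delta_S = k_B * ln(2).
    import math

    k_B = 1.380649e-23                  # Boltzmann's constant in J/K
    delta_S_per_bit = k_B * math.log(2)
    print(delta_S_per_bit)              # ~9.57e-24 J/K per bit stored
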

Wiener conceived of information as the opposite of entropy. To him the amount of information in a system was a measure of its degree of organization; the entropy of a system was, conversely, a measure of its degree of disorganization. The higher the entropy, the lower the information (technically, Wiener defined the amount of information as the negative of the quantity usually defined as entropy, so the two differ only in sign). A process that loses information is a process that gains entropy. Information is a reduction in uncertainty, i.e. in entropy: the quantity of information produced by a process equals the amount of entropy that has been reduced.
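
A small numeric illustration of "information as a reduction in uncertainty", using the standard Shannon entropy formula H = -sum p·log2(p) rather than anything specific to Wiener's text:

    # Information gained by an observation = entropy before it minus entropy after it.
    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    before = [0.25, 0.25, 0.25, 0.25]   # four equally likely outcomes: 2 bits of uncertainty
    after = [1.0, 0.0, 0.0, 0.0]        # the observation settles the question: 0 bits left
    print(entropy(before) - entropy(after))   # 2.0 bits of information produced
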

An unlikely, unusual message corresponds to a state of low entropy, because there are relatively few ways to compose such a message. Its information content, however, is very high, precisely because it is so unusual.
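
In the standard formalism this is the surprisal -log2(p) of a single message: the rarer the message, the more bits its arrival carries (the probabilities below are made up for illustration).

    # Surprisal (self-information) of a single message.
    import math

    def surprisal(p):
        return -math.log2(p)

    print(surprisal(0.5))     # 1.0 bit: a common, expected message
    print(surprisal(0.001))   # ~10 bits: a rare, unusual message
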

The second law of thermodynamics, one of the fundamental laws of the universe (and responsible, among other things, for our dying), states that an isolated system always tends to maximize its entropy, i.e. things decay. Since entropy is a measure of the randomness of the distribution of atoms, maximizing it means that the distribution becomes as homogeneous as possible. The more homogeneous a distribution of probabilities is, the less informative it is. Therefore entropy, a measure of disorder, is also a measure of the lack of information.
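
The link between homogeneity and lack of information can be made concrete by comparing a homogeneous distribution with a concentrated one (again using the standard Shannon formula as an illustration):

    # The more homogeneous a distribution, the higher its entropy and the less it tells us.
    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    homogeneous = [0.25, 0.25, 0.25, 0.25]    # probability spread out evenly
    concentrated = [0.97, 0.01, 0.01, 0.01]   # almost all probability in one state
    print(entropy(homogeneous))    # 2.0 bits: maximal entropy, least informative
    print(entropy(concentrated))   # ~0.24 bits: low entropy, highly ordered
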

Hence a theory of information turns out to be related to a theory of entropy: if information is taken to be a measure of order, then entropy, a measure of disorder, is indirectly a measure of the lack of information; if, conversely, information is taken to be a measure of unpredictability (of "chaos"), then it is basically entropy itself. Either way, there is a direct connection between the two.

The second edition, in 1961, added a chapter on self-reproducing machines and one on self-organizing systems.
