Book Reviews

Additions to the Bibliography on Mind and Consciousness

compiled by Piero Scaruffi


(Copyright © 2000 Piero Scaruffi | Legal restrictions )


Talbot Michael: THE HOLOGRAPHIC UNIVERSE (Harper, 1991)

Click here for the full review


Tarski Alfred: LOGIC, SEMANTICS, METAMATHEMATICS (Clarendon, 1956)

A collection of historical papers by Tarski, in particular "On the concept of truth", which advanced the correspondence theory of truth: a statement is true if it corresponds to reality. The goal of Tarski's semantics is to reduce all semantic concepts to physical concepts: semantic concepts are defined in terms of truth, truth is defined in terms of satisfaction, and satisfaction is defined in terms of physical concepts.
Tarski created the first model theory for quantified predicate logic.
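
The flavor of the construction can be seen in Convention T, Tarski's adequacy condition on any definition of truth (the schematic formulation below is mine, not a quotation from the papers):

```latex
% Convention T: for every sentence \varphi of the object language L, the
% truth definition must entail the corresponding biconditional, where the
% left-hand side names the sentence and the right-hand side translates it
% into the metalanguage.
\mathrm{True}_L(\text{``}\varphi\text{''}) \;\leftrightarrow\; \varphi
% e.g. ``snow is white'' is true iff snow is white. Truth is then reduced to
% satisfaction: a sentence is true iff it is satisfied by every assignment
% of objects to its variables.
```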


Taylor Charles: THE EXPLANATION OF BEHAVIOR (Routledge & Kegan, 1964)

Behavior is a function of the state of the system and of its environment; but what brings behavior about is the fact that it is required to achieve the system's goals.


Taylor, Timothy: THE ARTIFICIAL APE (MacMillan, 2010)


Click here for the full review


Teilhard de Chardin, Pierre: "The Phenomenon of Man" (1955)


Click here for the full review


Thagard Paul: MIND (MIT Press, 1996)

A clear and well-organized textbook on cognitive science.


Thelen Esther & Smith Linda: A DYNAMIC SYSTEMS APPROACH TO THE DEVELOPMENT OF COGNITION AND ACTION (MIT Press, 1994)


Click here for the full review


Thom Rene': SEMIOPHYSICS (Addison-Wesley, 1990)

The English translation of a book published in 1988 in France. Semiophysics is the physics of meaning, of significant form. Thom identifies the quantities that define what is relevant for meaning.
Thom rediscovers an ancient theory of Aristotle's, which bases mathematics on the concept of the continuum rather than on the generative properties of numbers, and shows that this approach better suits the biological domain.


Thom Rene': MATHEMATICAL MODELS OF MORPHOGENESIS (Horwood, 1983)

A collection of papers on catastrophe theory written between 1967 and 1981, including "A dynamic theory of morphogenesis", commonly considered the birth of catastrophe theory. Thom was interested in structural stability in topology (the stability of topological form) and was convinced of the possibility of finding general laws of the evolution of form regardless of the underlying substance, as D'Arcy Thompson had already argued at the beginning of the century. Existence is determined by essence.
Thom takes issue with general systems theory. A system is the content of a region of space-time, but, topologically speaking, this is not a set of objects.
Thom models the seat of the morphogenetic process as divided into domains of different attractors, separated by shock waves. Shock-wave surfaces are singularities called "catastrophes". A catastrophe is a state beyond which the system is destroyed in an irreversible manner. In a 4-dimensional world there are 7 types of elementary catastrophes. Elementary catastrophes include the "fold" (the destruction of an attractor, which is captured by a lesser potential), the "cusp" (the bifurcation of an attractor into two attractors), and so on. From these singularities, more and more complex catastrophes unfold, until the final catastrophe.
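
For concreteness, the two simplest elementary catastrophes have standard polynomial "unfolding" potentials (quoted here in textbook notation as an illustration, not necessarily the notation of these papers):

```latex
% Fold: one control parameter a; a pair of equilibria of V is created or
% annihilated as a crosses zero.
V_{\mathrm{fold}}(x) = x^{3} + a\,x
% Cusp: two control parameters a and b; a single attractor bifurcates into
% two competing attractors; the catastrophe set in the control plane is the
% cusp curve 8a^{3} + 27b^{2} = 0.
V_{\mathrm{cusp}}(x) = x^{4} + a\,x^{2} + b\,x
```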
Thom's immediate goal was embryology: he proves that the adult organism is the product of the unfolding of a dynamics already present in the egg.
All morphogenesis is due to a conflict between attractors. What catastrophe theory does is to "geometrize" the concept of "conflict".
Incidentally, catastrophe theory provides a mathematical justification for Waddington's "epigenetic landscape".
Applications to Physics, Linguistics and Biology are also reviewed.


Thom Rene': STRUCTURAL STABILITY AND MORPHOGENESIS (Benjamin, 1975)

Thom states his goal as explaining the "succession of form". Our universe presents us with forms (which we can perceive and name). A form is defined, first and foremost, by its stability: a form persists in space and time. Forms change. The history of the universe, insofar as we are concerned, is a ceaseless creation, destruction and transformation of form. Life itself is, ultimately, creation, growth and decay of form.
Every physical form is represented by a mathematical object called an "attractor" in a space of internal variables. If the attractor satisfies the mathematical property of being "structurally stable", then the physical form is the stable form of an object. Change of form, or morphogenesis, is due to the capture of the attractors of the old form by the attractors of the new form, a process called "catastrophe". All morphogenesis is due to the conflict between attractors.
Thom's basic tenet is that every system is associated with a "catastrophe set", the set of values that would cause an irreversible change in its form, a "morphogenesis". Thom's is a purely geometric theory of morphogenesis: his laws are independent of the substance, structure and internal forces of the system. Elementary catastrophes are "local accidents". The form of an object is due to the accumulation of many elementary catastrophes.
Local forms are defined by closed sets of points called attractors. Each attractor defines a "basin". Thom proves that in a 4-dimensional space there exist only 7 elementary types of catastrophe.
The difference between static and metabolic form is due to the nature of the attractor: static form is due to an attractor of the space of internal states. Static form is a solid. Metabolic form is smoke.
Thom relates catastrophe theory to Physics and to Information Theory, then applies it to biological morphogenesis. Thom thinks that the fundamental problem of biology is a topological problem: how form is built. The biochemistry of life should therefore be explained by morphogenesis, not the other way around. He goes on to propose a detailed model of the global evolution of a cell (division, mitosis, meiosis, etc.). Death is easily defined: the transformation of a metabolic field into a static field. But life would require an "infinite" number of local transformations in order to achieve the anabolic transformation from static to metabolic. Furthermore, once life occurs it is not clear why it ever stops: the underlying processes are reversible, therefore life should continue forever.


Thom Rene: APOLOGIE DU LOGOS (Hachette, 1990)

A huge collection of articles that span Thom's interests, from morphology to catastrophe theory. Thom is the founder of catastrophe theory. In 1973 he wrote the influential paper on semiotics "De l'icone au symbole", in which he showed that human sign behavior has nothing special that distinguishes it from animal sign behavior, or even from the behavior of inanimate matter.


Thompson D'Arcy: ON GROWTH AND FORM (Cambridge University Press, 1917)


Click here for the full review


Tipler Frank: THE PHYSICS OF IMMORTALITY (Doubleday, 1994)

Click here for the full review


Todes Samuel: BODY AND WORLD (MIT Press, 2001)

Click here for the full review


Tomasello, Michael: THE CULTURAL ORIGINS OF HUMAN COGNITION (Harvard University Press, 1999)

Click here for the full review


Toffoli Tommaso & Margolus Norman: CELLULAR AUTOMATA MACHINES (MIT Press, 1987)

The book contains an introduction to cellular automata ("discrete dynamical systems whose behavior is completely specified in terms of a local relation"). "Cellular automata are the computer scientist's counterpart to the physicist's concept of field". Space is represented by a uniform grid and time advances in discrete steps. Each cell of space contains bits of information. Laws of nature express what operation must be performed on each cell's bits of information, based on its neighbors' bits of information. Laws of nature are local and uniform.
Many chapters detail applications of cellular automata, particularly to Physics.
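
As a minimal sketch of this scheme (my own illustration in Python, not code from the book), here is a one-dimensional, two-state automaton in which every cell is updated by the same local rule applied to its own bit and its two neighbors' bits:

```python
# One-dimensional binary cellular automaton: a uniform row of cells, discrete
# time steps, and the same local rule applied everywhere.

def step(cells, rule):
    """Apply an elementary cellular-automaton rule (0-255, Wolfram numbering)
    once, with periodic boundary conditions."""
    n = len(cells)
    new = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # a number 0..7
        new.append((rule >> neighborhood) & 1)  # the rule's bit for that pattern
    return new

# Example: rule 110 started from a single live cell.
cells = [0] * 31
cells[15] = 1
for _ in range(15):
    print("".join(".#"[c] for c in cells))
    cells = step(cells, rule=110)
```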


Tononi, Giulio: "Phi, A Voyage from the Brain to the Soul" (Pantheon, 2012)

Click here for the full review


Touretzky David: THE MATHEMATICS OF INHERITANCE SYSTEMS (Morgan Kaufmann, 1986)

Touretzky's inheritance theory shows the similarities between logical proof (which is a tree of formulas, with the theorem at the root and the axioms as the leaves) and paths (sequences of nodes) that are explored during a search within a network.
Touretzky argues that there is a natural partial ordering of defaults in inheritance systems that is implicit in the hierarchical structure of the inheritance graph: the inferential distance, which determines subclass/superclass ordering (a class is a subclass of another class if there is an inheritance path from the former to the latter). Touretzky claims that default rules about subclasses should override default rules about the superclasses that contain them. Subclasses override superclasses.
The best path in a network is the one that minimizes inferential distance (as opposed to the shortest path method of traditional inheritance systems, i.e., the shortest proof is not always the best proof).
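
The subclass-overrides-superclass intuition can be sketched as follows (a toy illustration of mine, not Touretzky's formal system; in this tree-shaped example specificity and path length coincide, whereas inferential distance is designed precisely for the general graphs where they come apart):

```python
# Toy defeasible inheritance: default properties attach to classes, and a
# default on a subclass overrides the same default on any of its superclasses.

parents = {"penguin": ["bird"], "bird": ["animal"], "animal": []}   # is-a links
defaults = {"bird": {"flies": True}, "penguin": {"flies": False}}   # class defaults

def ancestors(cls):
    """All classes reachable upward from cls, including cls itself."""
    found, stack = set(), [cls]
    while stack:
        c = stack.pop()
        if c not in found:
            found.add(c)
            stack.extend(parents.get(c, []))
    return found

def lookup(cls, prop):
    """Among the ancestors of cls that carry prop, answer with the most
    specific one, i.e. the carrier that is a subclass of every other carrier."""
    carriers = [c for c in ancestors(cls) if prop in defaults.get(c, {})]
    for c in carriers:
        if all(other in ancestors(c) for other in carriers):
            return defaults[c][prop]
    return None   # no carrier, or carriers left unordered (ambiguous network)

print(lookup("penguin", "flies"))  # False: penguin's default overrides bird's
print(lookup("bird", "flies"))     # True
```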


Trefil James: ARE WE UNIQUE? A SCIENTIST EXPLORES THE UNPARALLELED INTELLIGENCE OF THE HUMAN MIND (Wiley, 1997)

A popular-science writer discusses the achievements of modern science while trying to prove the uniqueness of humans (versus animals and machines). His overview of cognitive science and related fields is more a collection of articles on popular buzzwords than an organized survey, and it misses the majority of today's important research while still recounting obsolete debates.


Trehub Arnold: THE COGNITIVE BRAIN (MIT Press, 1991)

Click here for the full review


Trivers, Robert: SOCIAL EVOLUTION (Benjamin/Cummings, 1985)

Click here for the full review


Tulving Endel & Craik Fergus: THE OXFORD HANDBOOK OF MEMORY (Oxford Univ Press, 2000)

Click here for the full review


Tulving Endel: ORGANIZATION OF MEMORY (Academic Press, 1972)

A collection of articles on memory. Tulving distinguishes between episodic memory (which receives and stores information about temporally dated episodes and temporal-spatial relations among them) and semantic memory (organized knowledge about the world). Episodic memory is a faithful record of a person's experience.

In a subsequent paper Tulving proposed to distinguish different memory systems based on the following characteristics: kinds of information they process, operations that can be performed, neural substrates that are affected, timing of appearance in phylogenetic and ontogenetic development, and format of representation. A memory system can therefore be defined in terms of its brain mechanisms, the information it processes and the principles of its operation.


Tulving Endel: ELEMENTS OF EPISODIC MEMORY (Oxford Univ Press, 1983)

Click here for the full review


Turbayne Colin Murray: THE MYTH OF METAPHOR (Yale Univ Press, 1962)

Turbayne treats metaphor not as a linguistic phenomenon, but as a philosophical one.
Descartes and Newton founded modern science on the basis of a metaphysics of mechanism. Turbayne presents a different metaphor: he treats events in nature as if they compose a language, and the world as a universal language.


Turchin Valentin: PHENOMENON OF SCIENCE (Columbia Univ Press, 1977)

The Russian physicist Turchin works out an evolutionary model of the universe, heavily influenced by cybernetics. The emergence of life, consciousness and culture is reduced to the formation of new systems out of more basic systems within a hierarchy of levels of cybernetic control.


Turing Alan Mathison: MORPHOGENESIS (North-Holland, 1992)

A collection of historical papers by Turing. In "The chemical basis of morphogenesis" (1952) he advanced the reaction-diffusion theory of pattern formation, based on the bifurcation properties of the solutions of differential equations.
Turing devised a model to generate stable patterns:
X catalyzes itself and diffuses slowly; X catalyzes Y, and Y diffuses quickly; Y inhibits X; Y may or may not catalyze or inhibit itself.
Some reactions might be able to create ordered spatial patterns from disordered ones. The function of genes is purely catalytic: they catalyze the production of new morphogens, which catalyze further morphogens until eventually form emerges.
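
In modern notation, a two-morphogen reaction-diffusion system of the kind Turing analyzed has the general form below (a standard textbook formulation, not a quotation from the 1952 paper):

```latex
% X is the activator, Y the inhibitor; f and g are the local reaction terms,
% and the inhibitor diffuses much faster than the activator (D_Y \gg D_X).
\frac{\partial X}{\partial t} = f(X,Y) + D_X \nabla^{2} X ,
\qquad
\frac{\partial Y}{\partial t} = g(X,Y) + D_Y \nabla^{2} Y
% Turing's observation: a spatially uniform steady state can be stable against
% uniform perturbations and yet unstable against perturbations of certain
% wavelengths, which grow into a stationary periodic pattern.
```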


Turing Alan: PURE MATHEMATICS (Elsevier Science, 1992)

A collection of historical papers by Turing.
In 1936 with his seminal paper "On computable numbers" Alan Turing defined computation as the formal manipulation of symbols by the application of formal rules.
A Turing machine is capable of performing all the operations needed to carry out a logical calculus: read the current symbol, process it, write a new symbol, examine the next symbol. Depending on the symbol it is reading and on the state it is in, the Turing machine decides whether to move forward, move backward, write a symbol, change state or stop. Turing's machine is an automatic formal system: a system that automatically manipulates an alphabet of symbols according to a finite set of rules.
The universal machine is a Turing machine capable of simulating all possible Turing machines. Its tape contains a sequence of symbols that describes the specific Turing machine to be simulated. For each computational procedure, the universal machine can simulate a machine that performs that procedure; the universal machine is therefore capable of computing any computable function.
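
A minimal simulator in the spirit of this description (an illustrative sketch of mine, not Turing's own formalism): a transition table maps the current state and the scanned symbol to a symbol to write, a head move, and the next state.

```python
# Minimal Turing machine simulator. The tape is a sparse dictionary from
# integer positions to symbols; unwritten cells read as the blank symbol.

def run(tape, transitions, state="start", blank="_", max_steps=1000):
    tape = dict(enumerate(tape))
    head = max(tape) if tape else 0   # convention here: start on the rightmost cell
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        new_symbol, move, state = transitions[(state, symbol)]
        tape[head] = new_symbol
        head += {"L": -1, "R": 1, "N": 0}[move]
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, blank) for i in cells).strip(blank)

# Example machine: binary increment, head starting on the least significant bit.
increment = {
    ("start", "1"): ("0", "L", "start"),  # propagate the carry leftward
    ("start", "0"): ("1", "N", "halt"),   # absorb the carry
    ("start", "_"): ("1", "N", "halt"),   # carry past the most significant bit
}

print(run("1011", increment))  # -> "1100"
```

A universal machine is then a machine of the same kind whose tape carries, alongside the input, an encoded description of the transition table of the machine to be simulated.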


Turing Alan: MECHANICAL INTELLIGENCE (Elsevier Science, 1992)

A collection of historical papers by Turing.
In "Computing machinery and intelligence" (1950) Turing proposed a famous test to verify whether a machine is intelligent or not: ask the same questions of a machine and a human being, without being told which one is which, and if you can't tell which one is which, then the machine is intelligent.


Turner Raymond: LOGICS FOR ARTIFICIAL INTELLIGENCE (Ellis Horwood, 1985)

A short, but clear, introduction to non-standard logics: modal logic, epistemic logic, multi-valued logics, intuitionistic logic, theory of types, non-monotonic reasoning, temporal logic and fuzzy logic.


Turner, Scott: THE EXTENDED ORGANISM (Harvard Univ Press, 2000)

Click here for the full review


Turner Scott: THE CREATIVE PROCESS (Lawrence Erlbaum, 1994)

A theory of creativity and a case-based computer prototype ("Minstrel") that generates stories. Art is viewed as a problem-solving activity, and an author as a problem solver who employs knowledge encoded in cases. Creativity is an integrated process of search and adaptation guided by creativity heuristics: it is an extension of problem solving, driven by the failure of problem solving, in which creative alternatives are created by using old knowledge in new ways.
The architecture employs four classes of goals: thematic goals (development of the story's theme, point or moral), consistency goals (plausibility constraints), drama goals (artistic quality) and presentation goals (effective style).


Turvey Michael: PERCEIVING, ACTING AND KNOWING (Lawrence Erlbaum, 1977)

A psychological theory of how cognition and action interact. An action can be performed in many different ways, i.e. the nervous system has to cope with an excess of degrees of freedom. It solves the problem through a hierarchical command structure: every level of the hierarchy adds detail to the overall goal of the action. Lower levels have a degree of autonomy; higher levels exert control over lower units by tuning the parameters that define the features of the lower units and by tuning the pathways connecting them.


Tversky Amos, Kahneman Daniel & Slovic Paul: JUDGMENT UNDER UNCERTAINTY (Cambridge University Press, 1982)

A collection of essays on heuristics and biases, as introduced by Tversky and Kahneman. The fundamental assumption is that people rely on a limited set of heuristic principles which greatly reduce the task of assessing probabilities: representativeness (the degree to which an event is representative of a class of events), availability (the ease with which past occurrences of an event can be brought to mind) and adjustment (the degree to which an initial approximate value is revised). Representativeness can be viewed as "connotative" distance; availability can be viewed as "associative" distance.
People employ heuristics to answer questions such as: what is the probability that an object belongs to a given class? that an event originates from a given process? that a process will generate a given event? Factors that should affect such judgments according to probability theory, such as prior probabilities of outcomes, sample size and predictability, are not reflected in the heuristics.
At the same time, deviations of subjective probability from objective probability are systematic. Experiments show that people predict by similarity (representativeness). Experiments also show that causal inferences have greater efficacy than diagnostic inferences.
Tversky criticizes probabilistic reasoning as a description of human thinking, since human judgment is subject to "framing effects". Tversky and Shafer offered a "constructivist" theory of probability in which probabilities describe an ideal situation that can still be related to the real situation.
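
A standard illustration of how a judgment driven by representativeness can neglect prior probabilities (the numbers are mine, not an experiment reported in the book): for a condition with a 1% base rate and a test with a 90% hit rate and a 10% false-alarm rate, Bayes' theorem gives

```latex
P(\text{disease}\mid{+})
= \frac{P({+}\mid\text{disease})\,P(\text{disease})}
       {P({+}\mid\text{disease})\,P(\text{disease})
        + P({+}\mid\neg\text{disease})\,P(\neg\text{disease})}
= \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.1 \times 0.99} \approx 0.08
```

whereas a judgment based on how representative a positive test is of the disease tends to land much closer to 0.9.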


Tye Michael: THE METAPHYSICS OF MIND (Cambridge University Press, 1989)

There are no mental events (beliefs or desires) and no mental objects (such as pains or images). Drawing on Sellars's "adverbial" theory of sensing, Tye develops his own "operator" theory, in which sensory adverbs are analyzed as predicate operators added to a standard predicate calculus.
Tye thinks that the phenomenal aspects of experience ("what it is like") are unrelated to their representational contents.


Tye Michael: THE IMAGERY DEBATE (MIT Press, 1991)

After a sloppy survey of mental-imagery theories over the centuries, Tye proposes a unified theory of mental imagery that embraces both the visual stance and the linguistic stance, and that tries to bridge Stephen Kosslyn's pictorialism and Zenon Pylyshyn's descriptionalism (the two main opposing schools of thought on what kind of representational structures images are). Tye believes that the experimental evidence supports a mixed theory of pictorialism and descriptionalism.

The main flaw of the book (besides misrepresenting some of the ancient thinkers) is that it neglects too much of modern experimental research and theoretical approaches to the field for a book whose title is "the imagery debate". Thus his attempt at unifying the two main schools ends up sounding a bit amateurish.


Tye Michael: TEN PROBLEMS OF CONSCIOUSNESS (MIT Press, 1995)

Click here for the full review


