Book Reviews

Additions to the Bibliography on Mind and Consciousness

compiled by Piero Scaruffi


(Copyright © 2000 Piero Scaruffi | Legal restrictions )

Waddington Conrad: THE EVOLUTION OF AN EVOLUTIONIST (Edinburgh University Press, 1975)

An accessible exposition of canalization, which Waddington discovered in the 1950s. Waddington was looking for an explanation for the apparent paradox that different genetic programs can produce the same organism (in most cases, far less than 50% of the genes of an individual are shared with other individuals of the same species). The individuals of a species differ in all sorts of ways, but somehow their genetic programs are tolerant of such differences and eventually yield the same species. The development of an individual is buffered against this genetic variation, or "canalized". Waddington imagined an "epigenetic landscape" created by the concurrent pressures of the environment and the genetic program. Development occurs as a traversal of this landscape. The landscape varies from individual to individual, but it always maintains its fundamental shape of a gently sloping surface that ends in the same valley. No matter how the landscape is traversed, the motion always ends in that valley.


By analyzing the processes of differentiation in time (histogenesis), in space (regionalization) and in shape (morphogenesis) during embryo development, Waddington argues that development must be genetically determined, like a ball rolling into progressively deepening valleys as time progresses. Once they start, developmental processes become more and more stable and more and more differentiated. The "epigenetic landscape" depicts the process of "canalization" (increasing differentiation of tissues and organs during embryogenesis).


A general introduction to the themes of artificial intelligence and cognitive science, from Turing's test to problem solving in production systems, from conceptual dependency systems to learning systems. Each historical system/project of artificial intelligence (BORIS, CYRUS, ACT, LEX, AM, BACON) is briefly described, together with its cognitive implications.

Waldrop Mitchell: MAN-MADE MINDS (Walker, 1987)

An accessible introduction to the ideas, the history and the systems of artificial intelligence.

Waldrop Mitchell: COMPLEXITY (Simon & Schuster, 1992)

Complexity is presented as a discipline that can unify the laws of physical, chemical, biological, social and economic phenomena through the simple principle that all things in nature are driven to organize themselves into patterns. The book, written in plain English, focuses on the Santa Fe Institute school of thought. Lots of biographies and a history of the field.

Walker, Evan: THE PHYSICS OF CONSCIOUSNESS (Perseus, 2000)

Click here for the full review


Ivan Wallin was the first biologist to propose that bacteria may represent the fundamental cause of the "origin of species" (Darwin's unsolved mystery) and that the creation of a species may occur via endosymbiosis.

Waltz David: SEMANTIC STRUCTURES (Lawrence Erlbaum, 1989)

A collection of articles on natural language processing. Michael Dyer discusses BORIS, a system for story understanding based on Schank's conceptual dependency. Wendy Lehnert discusses "plot units" for discourse analysis.

Way Ellen Cornell: KNOWLEDGE REPRESENTATION AND METAPHOR (Kluwer Academic, 1991)

Metaphor is the essence of our ability to represent the world, to assimilate new knowledge into the old. Metaphor is better suited than logic to represent knowledge.
Still, metaphor presents a number of obvious problems: how to determine its truth value (literally, metaphors are almost always false) and how to recognize an expression as a metaphor (metaphors have no consistent syntactic form).
Way claims that literal language is not context-free either: literal and figurative language are both context-dependent. The figurative cannot be reduced to the literal, because the literal is not primitive either. What determines literal or figurative speech is the speaker's intent to select a particular perspective on a type hierarchy, and how the concepts employed in the speech relate to their locations in that hierarchy.
The perspective intended by the speaker is revealed by the context, which is represented by a "mask" on the type hierarchy. If the perspective invoked by the context complies with the classification of natural kinds, speech is literal.
Sentences translate into conceptual graphs, and conceptual graphs relate the concepts of the sentence to a type hierarchy. The meaning of a concept is a partial function of its location in a type hierarchy.
The type hierarchy changes dynamically because of the continuous change in cultural and social conventions.
Way's formalism is based on Sowa's conceptual graphs, modified so that more perspectives ("masks") are possible. Way's model of metaphor is based on Black's interactionist model (metaphor involves a transfer of knowledge and actually creates similarity).

Webelhuth Gert: GOVERNMENT & BINDING THEORY (MIT Press, 1995)

A collection of articles by authoritative researchers who describe different approaches to equip Chomsky's universal grammar with constraints. The editor surveys progress made in the field since its invention.
The other articles provide an updated view on current research. Drawing from Fillmore's cases and Gruber's thematic relations, Edwin Williams discusses "theta theory" (the theory of thematic roles with respect to a predicate, or theta roles). James Huang examines the relationship between syntax (linguistic form) and semantics (logical form).

Weber Bruce, Depew David & Smith James: ENTROPY, INFORMATION AND EVOLUTION (MIT Press, 1988)

The thesis of this book is that biological phenomena are governed by laws that are purely physical. Evolutionary change results from the interplay of two elementary and independent processes: genetic variation and differential reproduction (natural selection).
A number of essays provide historical surveys of nonequilibrium thermodynamics applied to evolutionary and ecological topics.
By focusing on entropy, structure and information, the essays of this book shed some light on the relationship between cosmological evolution and biological evolution. Thanks to the advent of non-equilibrium thermodynamics, it is now possible to bridge thermodynamics and evolutionary biology. This step might prove as powerful as the synthetic theory of evolution, which merged Mendelian genetics (a theory of inheritance) with evolutionary biology (a theory of species).
Equilibrium is the state of maximum entropy: uniform temperature and maximum disorder. Entropy is a measure of disorder and, according to the second law of thermodynamics, it increases with time.
Steven Frautschi points out that there is a striking parallelism between the evolution of the expanding universe and the evolution of life on earth: because life on earth has a steady free energy source (the sun), it does not need to come to equilibrium and may even evolve away from it (as it did when it created more and more complex beings, such as ourselves); because the universe has a steady free energy source (the uniform expansion itself), it does not need to come to equilibrium and may even evolve away from it (as it did when it created more and more complex clumps of matter, such as galaxies). Biological evolution and universe evolution are consequences of nonequilibrium processes.
Dilip Kondepudi analyzes Louis Pasteur's discovery that living systems prefer molecules with a certain handedness (all proteins are made of L-amino acids and genetic material is made of D-sugars); in fact, this molecular asymmetry is the only difference between the chemistry of living and nonliving matter. By looking for the origins of biomolecular chirality (i.e., of chiral-symmetry breaking in chemical systems), he finds similarities with parity violation in weak interactions and posits a fundamental asymmetry of the universe.
Lionel Johnson thinks that emergent properties of biological systems reflect a response both to the physical environment in which the systems currently exist and to the changing environments in which they have existed over the course of evolutionary time. Emergent properties include: diversity increases over time (the number of species existing in any one period has increased over evolutionary time); diversity increases from the poles to the equator; the complexity of evolutionary lines increases over time; and the production/biomass ratio declines over time (this ratio measures the rate of energy flow through an ecosystem relative to the energy accumulated in the biomass, that is, the rate at which new material must be produced to replace what is lost through natural death, which is in turn a measure of the rate of energy dissipation and of entropy production). Johnson defines diversity in a fashion similar to Shannon and Weaver's definition of information, which is in turn similar to Boltzmann's definition of entropy.
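Johnson's information-style measure of diversity can be made concrete with a short sketch (a minimal illustration, not from the book): Shannon's index over species proportions, H = -Σ p_i ln p_i, which has the same functional form as Boltzmann's entropy (up to a constant).

```python
import math

def shannon_diversity(counts):
    """Shannon-Weaver diversity H = -sum(p_i * ln p_i), computed from
    raw species counts; higher H means a more even, more diverse community."""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)

# A perfectly even community of 4 species reaches the maximum ln(4);
# a community dominated by one species scores much lower.
even = shannon_diversity([25, 25, 25, 25])   # = ln 4 ≈ 1.386
skewed = shannon_diversity([97, 1, 1, 1])
```

The formal identity with entropy is what lets Johnson relate ecological diversity to thermodynamic quantities.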
Eric Schneider shows that the initial stages of ecological succession are involved in growth and maximization of free energy and structure (Lotka's power law) while later stages involve the development of complexity and efficiency, which in turn require minimization of entropy production.
Lionel Harrison suggests that increases of biological order can be explained in terms of kinetic theory as the result of diffusion and self-catalysis.
Depew and Weber survey the problems encountered by neo-Darwinism: the relation with theories of the origin of life, the complex structure of the genome, the punctuated pattern of the fossil record, etc.

Weinberg Steven: DREAMS OF A FINAL THEORY (Pantheon, 1993)

Weinberg, a theoretical physicist who was awarded the Nobel prize for the unification of the electromagnetic and weak forces, believes that a unified theory of all theories exists that would explain the behavior of all animate and inanimate systems in the universe. Such a "grand grand" unification theory should arise from today's theories of elementary particles, and from quantum theory in particular. Weinberg discusses at length the superstring theories as the first step towards such a unification process. Weinberg does not seem to consider the mind a system worthy of study, and therefore never mentions the discrepancies between today's Physics and the disciplines that study the mind. The reader is left with the feeling that, if such a grand-grand unification theory is possible, it is highly unlikely that a physicist will ever discover it, even by mistake. In his previous book, "The First Three Minutes", Weinberg stated: "The more the universe seems comprehensible, the more it seems pointless". I would suggest that he replace the word "it" with the word "Physics".

Weld Daniel & DeKleer Johan: QUALITATIVE REASONING ABOUT PHYSICAL SYSTEMS (Morgan Kaufmann, 1990)

A collection of seminal papers on qualitative reasoning, which follows Daniel Bobrow's book with the same title: a general survey of the state of the art by Ken Forbus, Pat Hayes' new updated "naive physics manifesto", Johan DeKleer's "A qualitative physics based on confluences", Ken Forbus' "Qualitative process theory" and Benjamin Kuipers' "Qualitative simulation". Each of the classical papers is revised and followed by an update that provides more details.
Also includes Brian Williams' "Temporal qualitative analysis" and James Allen's "Maintaining knowledge about temporal intervals", which provide techniques for reasoning about events taking place over time.
Boi Faltings introduces a graph of "places" that share important features. For example, places where parts touch each other are more relevant to the development of the world. Common sense perceives the world as connections between its parts.

Wellman Henry: THE CHILD'S THEORY OF MIND (MIT Press, 1990)

Human knowledge is organized around naive theories that encompass specific domains. Such theories provide constraints for daily actions. One such theory is the theory of the mind (of the mental world of thoughts, beliefs, fantasies, reasoning, etc). The book analyzes how children develop a commonsense understanding of the mind.


A study of "learnability" (the process by which a child learns a natural language when placed in the appropriate environment) in the context of Chomsky's theory (that the child has innate universal principles, or a "universal grammar", with open "parameters" that are set by experience).

Whorf Benjamin Lee: LANGUAGE, THOUGHT AND REALITY (MIT Press, 1956)

A collection of essays by Whorf.
All higher thinking is dependent upon language. Language influences thought because it contains a hidden metaphysics. The structure of the language influences the way its speakers understand the environment.
Whorf formulated the principle of linguistic determinism: grammatical and categorial patterns of language embody cultural models. Language contains an implicit classification of experience, and the language system as a whole contains an implicit world view. Every language is a culturally determined system of patterns that creates the categories by which individuals not only communicate but also think. Language therefore influences thinking.


Wicken thinks that the most general entities subject to natural selection are neither genes nor populations but information patterns of thermodynamic flows, such as ecosystems and socioeconomic systems. Natural selection is not an external force, but an internal process such that macromolecules are accrued in proportion to their usefulness for the efficiency of the global system.
Wicken distinguishes between order (a statistical concept referring to the regularity in a sequence) and organization (which involves spatio-temporal and functional relationships among parts). Thermodynamics can only account for the generation of structural complexity, but not for functional organization.
Wicken proposes a generalized Lotka law: for any evolving system strategies that focus resources into the system while stabilizing its energetic interconnections will be preferred. Such a process increases biomass/throughput ratios and decreases specific entropy production.
Wicken aims at bridging Darwin and Boltzmann by showing that the thermodynamic forces underlying the principles of variation and selection begin their operation in prebiotic evolution and lead to the emergence and development of individual, ecological and socioeconomic life. The prebiosphere is treated as a nonisolated closed system in which energy sources create steady thermodynamic cycles. Some of this energy is captured and dissipated through the formation of ever more complex chemical structures. Soon autocatalytic systems capable of reproduction appear. Living systems are but "informed autocatalytic systems".

Wiener Norbert: "The Human Use of Human Beings" (Avon, 1950)

Click here for the full review

Wiener Norbert: CYBERNETICS (John Wiley, 1948)

Click here for the full review

Wierzbicka Anna: SEMANTICS, CULTURE, AND COGNITION (Oxford University Press, 1992)

Language is not just a tool for communication, but a tool to express meaning. To what extent meaning is language-independent depends on the extent to which it is innate and the extent to which it is shaped by culture. Meaning can be transferred from one language to another to some degree, but not fully. There exists a broad variety of semantic differences among languages (even emotions seem to be cultural artefacts), but a few semantic primitives have been proposed. Such universal semantic primitives make up a semantic metalanguage that could be used to explicate all other concepts in all languages.
Wierzbicka therefore explores the languages of the world for the building blocks of emotions, moral concepts, names, etc.

Wierzbicka, Anna: UNDERSTANDING CULTURES THROUGH THEIR KEY WORDS (Oxford University Press, 1997)

The Polish linguist shows how language embeds and influences a culture. Each language uses key concepts that are at the core of the corresponding culture.

Wierzbicka Anna: THE SEMANTICS OF GRAMMAR (Benjamins, 1988)

Language is a tool to communicate meaning; semantics is the study of the meaning encoded in language; syntax is a piece of semantics. Corresponding to the three types of tools employed by language to convey meaning (words, grammatical constructions and illocutionary devices), linguistics can be divided into lexical semantics, grammatical semantics and illocutionary semantics. The division into syntax, semantics and pragmatics makes no sense because every element and aspect of language carries meaning. Meaning is an individual's interpretation of the world. It is subjective and depends on the social and cultural context. Therefore, semantics encompasses lexicon, grammar and illocutionary structure.
Grammatical semantics is divided into the semantics of syntax and the semantics of morphology. A metalanguage is defined to express the meaning of an expression.
Wierzbicka also argues that constructions peculiar to a language embody a view of the world specific to the culture of that language. Hence her case for an "ethno-syntax".

Wilensky Robert: PLANNING AND UNDERSTANDING (Addison Wesley, 1983)

A pragmatic essay on planning techniques applied to natural language understanding.

Wilczek, Frank: "The Lightness of Being" (Basic Books, 2008)

Click here for the full review


A collection of articles on techniques for natural language processing, including connectionist models, discourse theory and approaches to metaphor.
Wilks discusses his "preference semantics", which espouses a constraint-based approach. Natural language understanding comes from the integration of language constraints (syntactic and semantic) with context constraints. One type of semantic constraint is "preferences": similar to Schank's expectations, they restrict the selection of senses of lexical entities. In preference semantics each sense of a word is associated with a structured semantic formula. During parsing, formulas are bound together into templates, and syntax plays a minor role. On this view, a metaphor is a semantic deviance: a violation of preference restrictions within a context. Metaphors are intentionally ungrammatical.

Williams George: ADAPTATION AND NATURAL SELECTION (Princeton University Press, 1966)

Click here for the full review

Wills, Christopher & Bada, Jeffrey: THE SPARK OF LIFE (Perseus, 2000)

Click here for the full review

Wilson Edward Osborne: CONSILIENCE (Knopf, 1998)

Click here for the full review

Wilson Edward Osborne: "The Social Conquest of Earth" (Liveright, 2012)

Click here for the full review

Wilson Edward Osborne: SOCIOBIOLOGY (Belknap, 1975)

Click here for the full review

Wilson Edward Osborne: THE DIVERSITY OF LIFE (Harvard University Press, 1992)

Click here for the full review

Wilson Edward & Lumsden Charles: GENES, MIND AND CULTURE (Harvard Univ Press, 1981)

Click here for the full review

Wilson, Frank: THE HAND (Pantheon Books, 1998)

Click here for the full review

Winograd Terry: LANGUAGE AS A COGNITIVE PROCESS (Addison Wesley, 1983)

A textbook for natural language processing: grammars, parsing, transformations, ATNs, case grammar, lexical-functional grammar, generalized phrase-structure grammar. Techniques are detailed for computer implementation.

Winograd Terry: UNDERSTANDING NATURAL LANGUAGE (Academic Press, 1972)

A description and discussion of a natural language understanding program (SHRDLU) based on an integrated model of syntax, semantics and inference and applied to the blocks world.

Winograd Terry & Flores Fernando: UNDERSTANDING COMPUTERS AND COGNITION (Ablex, 1986)

Drawing from Heidegger's phenomenology and Maturana's cognitive biology, Winograd denies that intelligence can arise from processes of the kind found in production systems, i.e. from the systematic manipulation of representations. Intelligent systems act; they do not think. They think only when action does not yield the desired result: only then do they decompose the situation and try to infer action from knowledge.
In language the role of the listener is emphasized for the active generation of meaning. Language is ultimately based on social interactions, as proved by the speech act theory of Austin and Searle.
The book concludes that the program of Artificial Intelligence must be changed to view the computer merely as a tool to improve the life of humans.

Winson Jonathan: BRAIN AND PSYCHE (Anchor Press, 1985)

Click here for the full review

Winston Patrick: ARTIFICIAL INTELLIGENCE (Addison Wesley, 1993)

Third edition of one of the earliest textbooks on artificial intelligence.

Wittgenstein Ludwig: PHILOSOPHICAL INVESTIGATIONS (Macmillan, 1953)

One of the milestone books of modern philosophy, it contains a wealth of ideas.
Foremost is the theory of family resemblance. A category like "game" does not fit the classical idea of categories being closed by clear boundaries and defined by common properties of their members. What unites the category is family resemblance, plus sets of positive and negative examples; and boundaries may be extended at any time.
About language in general, Wittgenstein argues that to understand a word is to understand a language, and to understand a language is to master linguistic skills.
Wittgenstein systematically demolishes all pre-existing theories of meaning. In particular, he abandons Frege's notion of sense (and any intensionalist notion of sense).

Wolf, Fred Alan: MIND INTO MATTER (Moment Point, 2001)

Click here for the full review



Wolfram, Stephen: CELLULAR AUTOMATA AND COMPLEXITY (Addison-Wesley, 1994)

A collection of papers by Wolfram from 1982 to 1986. A number of studies present a general mathematical model for cellular automata viewed as discrete self-organizing dynamical systems. They can be organized into four classes, which behave respectively like limit points, limit cycles, chaotic attractors and universal computing machines. Their evolution is almost always irreversible. Entropies and Lyapunov exponents measure the information content and rate of information transmission in cellular automata.
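The elementary (one-dimensional, two-state, nearest-neighbor) automata Wolfram studied can be sketched in a few lines. This is a minimal illustration, not code from the book; rule 110, used here, is the classic class-4 example, later shown to be computation-universal.

```python
def step(cells, rule):
    """One synchronous update of an elementary cellular automaton.
    `rule` is the Wolfram rule number (0-255); each cell's next state is
    the bit of `rule` indexed by its (left, self, right) neighborhood,
    with wrap-around boundaries."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Rule 110 (class 4) run for a few steps from a single live cell
cells = [0] * 31
cells[15] = 1
for _ in range(10):
    cells = step(cells, 110)
```

Iterating `step` and classifying the long-run behavior (fixed point, cycle, chaos, or complex localized structures) is exactly the kind of experiment these papers systematize.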

Wood Mary McGee: CATEGORIAL GRAMMARS (Routledge, 1993)

A short, compact but very technical manual that summarizes the state of the art in categorial grammars.
Categorial grammars, which originated from the logic of Ajdukiewicz (1935) and the algebraic calculus of Joachim Lambek (1958), represent semantics directly in syntax. Categorial grammars represent a refinement of phrase-structure grammars as they assign an internal structure to category symbols. The set of categories is defined recursively: if X and Y are categories, then any function from X into Y is also a category.
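The recursive definition of categories can be made concrete with a small sketch (a hypothetical toy encoding, not Wood's own formalism, limited to forward application; the Lambek calculus adds directional slashes and further combination rules):

```python
# Categories are either primitive symbols ("N", "NP", "S") or pairs
# (X, Y) standing for the function category X/Y -- the recursive step.

def cat(result, arg):
    """Build the function category result/arg: it takes an `arg`
    on its right and yields a `result`."""
    return (result, arg)

def combine(functor, argument):
    """Forward application: X/Y applied to Y gives X, else no parse."""
    if isinstance(functor, tuple) and functor[1] == argument:
        return functor[0]
    return None

N, NP, S = "N", "NP", "S"
DET = cat(NP, N)            # a determiner: NP/N ("the")
VP = cat(S, NP)             # a verb phrase seeking its subject: S/NP
the_cat = combine(DET, N)   # "the" + "cat" -> NP
```

The point the book makes is visible even in this toy: the syntactic derivation (function application) is simultaneously the semantic composition.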
The book sketches the history of the field, from Bar-Hillel to Montague. The various types of categorial grammars, from Lambek calculus to more complex variants, are introduced.


This question-answering system employed the first computational model for natural-language semantic interpretation. It defined a procedural semantics and introduced the ATN grammar.

Wrangham, Richard: CATCHING FIRE (Basic, 2009)

Click here for the full review

Wright Larry: TELEOLOGICAL EXPLANATIONS (Univ of California Press, 1976)

This "etiological analysis of goals and functions" employs a slight variation of Charles Taylor's definition of behavior (a goal-directed function of the state of the system and the environment). Wright thinks that any feature of a species exists because it conferred an advantage under natural selection. Evolution is the fundamental criterion to determine the function of a property.

Wright Robert: "The Evolution of God" (Little Brown & Co, 2009)

Click here for the full review

Wright Robert: THE MORAL ANIMAL (Random House, 1994)

Click here for the full review
