
Book Reviews

Additions to the Bibliography on Mind and Consciousness

compiled by Piero Scaruffi


(Copyright © 2000 Piero Scaruffi | Legal restrictions )


Kafatos, Menas & Nadeau Robert: CONSCIOUS UNIVERSE (Springer Verlag, 1990)

The authors review the birth of Quantum Physics and Relativity Theory and focus on John Bell's theorem, which proved non-locality to be a feature of reality. They reject speculations about faster-than-light signals and argue that Bell's theorem highlights the holistic structure of our universe, in which all parts are connected at all times. They point out that Bohr applied his complementarity principle to psychology (thought and feeling are complementary in the same way that position and momentum are), and that complementarity emerges in Linguistics between signified and signifier, in Neurophysiology between the two brain hemispheres, in Biology between organic and inorganic matter, in Thermodynamics between reversible and irreversible processes... They conclude that the universe must be conscious. They try to reconcile modern science and religion.


Kaku, Michio: "Hyperspace" (Oxford University Press, 1994)

Click here for the full review


Kaku, Michio: "The Future of the Mind" (Doubleday, 2014)

Click here for the full review


Kandel Abraham: FUZZY MATHEMATICAL TECHNIQUES (Addison Wesley, 1986)

A very technical and very well organized introduction to the concepts and theorems of fuzzy logic: fuzzy sets, theory of possibility, fuzzy functions (integration and differentiation), multivalent logics, linguistic approximation and applications.


Kanerva Pentti: SPARSE DISTRIBUTED MEMORY (MIT Press, 1988)

The sparse distributed memory is a model of long-term memory in which situations are encoded by patterns of features and episodes are encoded by sequences of them. Any pattern in a sequence can be used to retrieve the entire sequence. Memories are stored based on features. The senses must extract the invariant features of objects to retrieve the corresponding memories. The motor system is also controlled by sequences of patterns in memory. A central site, the "focus", stores all the features that are needed to define the specific moment in time, to account for subjective experience. The model is capable of learning.
Most of the study is a computational analysis of the feasibility of a very large address space whose units of address decoding are linear threshold functions (neurons).
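
A minimal numpy sketch of the write/read mechanics described above, assuming binary vectors, randomly chosen hard locations and a Hamming-radius activation rule (all parameters are illustrative, not Kanerva's):

  import numpy as np

  rng = np.random.default_rng(0)
  N = 256       # address/word length in bits
  M = 2000      # number of hard locations (a sparse sample of the 2^N address space)
  R = 111       # Hamming radius within which a hard location is activated

  hard_addresses = rng.integers(0, 2, size=(M, N))   # fixed random addresses
  counters = np.zeros((M, N), dtype=int)             # one counter vector per location

  def activated(address):
      # locations whose hard address lies within Hamming distance R of the cue
      return np.count_nonzero(hard_addresses != address, axis=1) <= R

  def write(address, word):
      # increment counters for 1-bits, decrement for 0-bits, at all activated locations
      sel = activated(address)
      counters[sel] += np.where(word == 1, 1, -1)

  def read(address):
      # sum the counters of the activated locations and threshold at zero
      sel = activated(address)
      return (counters[sel].sum(axis=0) > 0).astype(int)

  # auto-associative use: a noisy cue retrieves the stored pattern
  pattern = rng.integers(0, 2, size=N)
  write(pattern, pattern)
  noisy = pattern.copy()
  noisy[rng.choice(N, size=20, replace=False)] ^= 1
  print(np.array_equal(read(noisy), pattern))   # True with high probability

Because each pattern is spread over many locations and each location holds contributions from many patterns, a noisy or partial cue still activates mostly the right locations, which is what makes the memory content-addressable.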


Kaplan David: THEMES FROM KAPLAN (Oxford Univ Press, 1989)

This book is a tribute to Kaplan by a number of thinkers (Castaneda, Church, Deutsch, etc), but also contains Kaplan's famous "Demonstratives" (1977).
Indexicals include the personal pronouns, the demonstrative pronouns, some adverbs ("here", "now", "tomorrow"), etc, i.e. words whose referent depends on the context of use (whose meaning provides a rule which determines the referent in terms of the context). The logic of demonstratives, based on first-order predicate logic, is a theory of word meaning, not speaker's meaning, based on linguistic rules shared by all linguistic users.
Indexicals are "directly referential", i.e. refer directly to individuals without the mediation of Fregean sense (unlike nonindexical definite descriptions, which denote their referent through their sense). Kaplan's indexicals are similar to Kripke's "rigid designators", expressions that designate the same thing in every possible world in which they exist and designate nothing elsewhere. Indexicals provide directly that the referent in every circumstance is fixed to be the actual referent. In Kaplan's case, though, the expression is the "device" of direct reference.
Kaplan distinguishes between the "character" of a linguistic expression (its grammatical meaning, i.e. what the hearer learns when she learns the meaning of that expression) and its "content" in a context (the proposition, the primary bearer of truth-values, the object of thought). Indexicals have a context-sensitive character, nonindexicals have a fixed character. Characters are functions that map contexts into contents.
The theory of direct reference for indexicals includes: the language system (to which meanings and characters belong), the contexts of uses (through which referents are assigned to expressions) and the circumstances of evaluation (at which truth-values are allocated to sentential referents).
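
Kaplan's slogan that characters are functions from contexts to contents (and contents functions from circumstances of evaluation to referents or truth-values) can be pictured with a toy sketch; the encoding and the context fields are invented for illustration and are not Kaplan's formal system:

  # A character maps a context of use to a content; a content maps a
  # circumstance of evaluation to a referent (hypothetical toy encoding).

  def character_I(context):
      # the character of "I": in any context, the content is fixed to that context's agent
      agent = context["agent"]
      return lambda circumstance: agent   # directly referential: same referent in every circumstance

  def character_here(context):
      place = context["place"]
      return lambda circumstance: place

  context = {"agent": "Alice", "place": "Paris", "time": "noon"}
  content_of_I = character_I(context)
  print(content_of_I({"world": "w1"}))    # 'Alice'
  print(content_of_I({"world": "w2"}))    # still 'Alice': the referent does not vary with the circumstance

The character of "I" is context-sensitive, but the content it delivers in a given context is a constant function of the circumstance of evaluation, which is one way to picture direct reference.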


Karmiloff-Smith Annette: BEYOND MODULARITY (MIT Press, 1992)

Click here for the full review


Kastner, Ruth: "The Transactional Interpretation of Quantum Mechanics" (Cambridge Univ Press, 2013)

Click here for the full review


Katz Jerrold: THE METAPHYSICS OF MEANING (MIT Press, 1990)

A critique of naturalism, particularly Wittgenstein's argument against intensionalist theories of meaning and Quine's argument for indeterminacy. By examining Wittgenstein's own critique of pre-existing theories of meaning, Katz salvages a theory of meaning (the "proto-theory") which postulates underlying sense structure (just like Chomsky's postulation of underlying syntactic structure) and constructs a decompositional semantics (i.e., provides a preliminary theory of decompositional sense structure).
Katz replaces Frege's referentially defined notion of sense with a notion defined in terms of sense properties and relations internal to the grammar of the language, thereby accomplishing a separation of sense structure and logical structure (a separation of grammatical meaning from reference and use).
Katz thinks that words' meaning can be decomposed in atoms of meaning that are universal for all languages.
This may well be the most detailed critique ever of Wittgenstein's thought.


Katz Jerrold: AN INTEGRATED THEORY OF LINGUISTIC DESCRIPTIONS (MIT Press, 1964)

Two components are necessary for a theory of semantics: a dictionary, which provides for every lexical item a phonological description, a syntactic classification ("grammatical marker", e.g. noun or verb) and a specification of its possible distinct senses ("semantic marker", e.g. light as in color and light as the opposite of heavy); and projection rules, which produce all valid interpretations of a sentence.
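
A hypothetical sketch of what such a dictionary entry might look like as a data structure (the markers below are invented for illustration, not Katz's notation):

  # A toy dictionary entry: a lexical item, its phonological description,
  # its grammatical marker, and its distinct senses tagged by semantic markers.
  dictionary = {
      "light": {
          "phonology": "/lait/",
          "grammatical_marker": "adjective",
          "senses": [
              {"semantic_markers": ["(Color)", "(Pale)"]},        # light as in color
              {"semantic_markers": ["(Weight)", "(Not-Heavy)"]},  # light as the opposite of heavy
          ],
      }
  }

Projection rules would then combine the senses of a sentence's constituents and discard combinations whose semantic markers clash, yielding the sentence's valid readings.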


Katz Jerrold: THE PHILOSOPHY OF LANGUAGE (Harper & Row, 1966)

According to Katz, a theory of language is a theory of linguistic universals (features that all languages have in common). Katz argues that the basic ontological categories are those semantic markers that are implied by other semantic markers but never imply other markers themselves.


Katz Jerrold: SEMANTIC THEORY (Harper & Row, 1972)

Two components are necessary for a theory of semantics: a dictionary, which provides for every lexical item a phonological description, a syntactic classification ("grammatical marker", e.g. noun or verb) and a specification of its possible distinct senses ("semantic marker", e.g. light as in color and light as the opposite of heavy); and projection rules, which produce all valid interpretations of a sentence.
"The logical form of a sentence is identical with its meaning as determined compositionally from the senses of its lexical items and the grammatical relations between its syntactic constituents."


Kaufmann Arnold & Gupta Madan: INTRODUCTION TO FUZZY ARITHMETICS (Van Nostrand Reinhold)

A technical (and one of the most rigorous) introduction to the properties of fuzzy numbers. A fuzzy number is viewed as an extension of the notion of interval of confidence, with one interval for each level of presumption. The addition of fuzzy numbers and random data yields hybrid numbers, which transform the measurement of objective data into the valuation of a subjective value without any loss of information. Definitions are provided for derivatives of functions of fuzzy numbers, fuzzy trigonometric functions, etc.
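
A minimal sketch of fuzzy-number addition using triangular fuzzy numbers, where each presumption level picks out an interval of confidence and arithmetic is done level by level (the representation is an illustrative simplification of the book's treatment):

  from dataclasses import dataclass

  @dataclass
  class TriangularFuzzyNumber:
      # (a, b, c): support [a, c], full presumption (level 1) at b
      a: float
      b: float
      c: float

      def alpha_cut(self, alpha):
          # interval of confidence at presumption level alpha in [0, 1]
          return (self.a + alpha * (self.b - self.a),
                  self.c - alpha * (self.c - self.b))

      def __add__(self, other):
          # addition works level by level on the intervals of confidence,
          # which for triangular numbers reduces to adding the three parameters
          return TriangularFuzzyNumber(self.a + other.a, self.b + other.b, self.c + other.c)

  x = TriangularFuzzyNumber(1, 2, 3)      # "about 2"
  y = TriangularFuzzyNumber(4, 6, 7)      # "about 6"
  print(x + y)                            # TriangularFuzzyNumber(a=5, b=8, c=10)
  print((x + y).alpha_cut(0.5))           # (6.5, 9.0)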


Kauffman Stuart: THE ORIGINS OF ORDER (Oxford University Press, 1993)

Click here for the full review


Kauffman Stuart: AT HOME IN THE UNIVERSE (Oxford Univ Press, 1995)

Click here for the full review


Kauffman Stuart: REINVENTING THE SACRED (Basic, 2008)

Click here for the full review


Kaye Jonathan: PHONOLOGY (Lawrence Erlbaum, 1989)

A cognitive approach to phonology. Besides reviewing the history of the field and the recent developments (syllable structure, tones and nonlinear phonology, harmony, parametrized systems), Kaye advances his own theory that the function of phonological processes is to help process language, in a fashion similar to punctuation, by providing information about domain boundaries. A theory of markedness is also sketched to explain the fact that certain features condition other features.


Kearns Michael & Vazirani Umesh: INTRODUCTION TO COMPUTATIONAL LEARNING THEORY (MIT Press, 1994)

A very technical survey of the main issues of learning theory, built around Valiant's "probably approximately correct" model (1984), which defines learning in terms of the predictive power of the hypothesis output by the learning algorithm. Notions such as the Vapnik-Chervonenkis dimension, a measure of the sample complexity of learning, and various extensions to Valiant's model are presented.
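
The flavor of the model can be conveyed by the textbook sample-size bound for a consistent learner over a finite hypothesis class, m >= (1/epsilon)(ln|H| + ln(1/delta)); a small sketch (the example hypothesis class is illustrative):

  import math

  def pac_sample_size(hypothesis_space_size, epsilon, delta):
      # samples sufficient for a consistent learner over a finite class H to output,
      # with probability at least 1 - delta, a hypothesis with error at most epsilon
      return math.ceil((math.log(hypothesis_space_size) + math.log(1.0 / delta)) / epsilon)

  # e.g. boolean conjunctions over n = 10 variables: |H| = 3^10
  # (each variable appears positive, negated, or not at all)
  print(pac_sample_size(3 ** 10, epsilon=0.1, delta=0.05))   # 140 examples suffice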


Keenan, Julian: THE FACE IN THE MIRROR (2003)

Click here for the full review


Keil Frank: SEMANTIC AND CONCEPTUAL DEVELOPMENT (Harvard Univ Press, 1979)

Following Fred Sommers, Keil develops a formal theory of the innate constraints that guide and limit the acquisition of ontological knowledge (knowledge about the basic categories of the world). Two terms are of the same type if all predicates that span one of them also span the other; and two predicates are of the same type if they span exactly the same set of terms. The "M constraint" forbids partial overlap: the sets of terms spanned by any two predicates must be either disjoint or nested one inside the other. Ontological knowledge is therefore organized in a rigid hierarchical fashion.
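
A small illustrative check of the M constraint, assuming predicates are given simply as the sets of terms they span (predicates and terms are invented for illustration):

  def satisfies_m_constraint(predicate_spans):
      # True if every pair of spans is either disjoint or nested -- no partial overlap
      spans = list(predicate_spans.values())
      for i in range(len(spans)):
          for j in range(i + 1, len(spans)):
              a, b = spans[i], spans[j]
              if (a & b) and not (a <= b or b <= a):
                  return False
      return True

  spans = {
      "is heavy": {"rock", "chair", "dog", "flower"},
      "is alive": {"dog", "flower"},      # nested inside "is heavy": allowed
      "is sorry": {"dog"},                # nested inside "is alive": allowed
  }
  print(satisfies_m_constraint(spans))    # True: the spans form a hierarchy

  spans["is interesting"] = {"dog", "lecture"}   # cross-cutting predicate: partial overlap
  print(satisfies_m_constraint(spans))           # False: violates the M constraint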


Keil Frank: CONCEPTS, KINDS AND COGNITIVE DEVELOPMENT (Cambridge University Press, 1989)

Concepts are always related to other concepts. No concept can be understood in isolation from all other concepts. Concepts are not simple sets of features. Concepts embody "systematic sets of causal beliefs" about the world and contain implicit explanations about the world. Concepts are embedded in theories about the world, and they can only be understood in the context of such theories.
In contrast with stage-based developmental theories, Keil argues for the continuity of cognition across development. Continuity is enforced by native constraints on developmental directions.
Perceptual procedures through which objects are categorized are not part of the categories: an animal is a skunk if its mother is a skunk regardless of what it looks like.
Keil refines Quine's ideas. Natural kinds are not defined by a set of features or by a prototype: they derive their concept from the causal structure that underlies them and explains their superficial features. They are defined by a "causal homeostatic system", which tends to stability over time in order to maximize categorizing. Nominal kinds (e.g., "odd numbers") and artifacts (e.g., "cars") are similarly defined by the theories they are embedded in, although such theories are qualitatively different. There is a continuum between pure nominal kinds and pure natural kinds with increasing well-definedness as we move towards natural kinds. What develops over time is the awareness of the network of causal relations and mechanisms that are responsible for a natural kind's essential properties. The theory explaining a natural kind gets refined over the years.


Keller, Evelyn: THE CENTURY OF THE GENE (Harvard Univ Press, 2000)

A philosophical discussion on how genes have been "over-rated". Genes need many other entities to perform their job. Knowing only the genes will not explain any of life's mysteries.


Kelso Scott & Mandell Arnold: DYNAMIC PATTERNS IN COMPLEX SYSTEMS (World Scientific, 1988)

Proceedings of a 1988 conference on self-organizing systems.
Hermann Haken discusses the dualism between pattern recognition and pattern formation.
Kelso shows that the brain exhibits processes of self-organization that obey nonlinear dynamics (multistability, abrupt phase transitions, crises and intermittency). Human behavior is therefore also subject to nonlinear dynamics.


Kelso Scott: DYNAMIC PATTERNS (MIT Press, 1995)

Kelso believes that all levels of behavior, from neural processes to mind, are governed by laws of self-organization. He explains human behavior from phenomena of multistability, phase transitions, etc.


Kessel Frank: SELF AND CONSCIOUSNESS (Lawrence Erlbaum, 1993)

A collection of essays on the subject, with contributions by Dennett, Neisser and Gazzaniga.


Kim Jaegwon: MIND IN A PHYSICAL WORLD (MIT Press, 1998)

Click here for the full review


Kim Jaegwon: SUPERVENIENCE AND MIND (Cambridge University Press, 1993)

A collection of philosophical essays, particularly on supervenience.
The world has a structure: the existence of an object and its properties depend on, or are determined by, the existence and the properties of other objects. With Hume, "causation is the cement of the universe". Supervenience is a type of relation between objects that occurs between their properties: if two individuals are alike in all their physical properties, then they must be alike also in their nonphysical properties, i.e. the set of valuational (nonphysical) properties supervenes on the set of nonvaluational (physical) ones.
"Supervenience" theory assumes that objects with the same physical properties also exhibit the same mental properties. A causal relation between two states can be explained both in mental terms and in physical terms. The mental and the physical interact only to guarantee consistence. The mental supervenes on the physical, just like the macroscopic properties of objects supervene on their microscopic structures.
In general, supervenience is a relation between two sets of properties over a single domain (e.g., mental and physical properties over the domain of organisms). Weak supervenience requires only that, within any one possible world, individuals indiscernible with respect to the base (physical) properties be indiscernible with respect to the supervening (mental) properties. Strong supervenience extends the requirement across possible worlds: any two individuals, in any worlds, that share the same physical properties must share the same mental properties. Global supervenience is the claim that worlds indiscernible with respect to their physical properties are also indiscernible with respect to their mental properties.
Kim is a physicalist (the world is a physical world governed by physical laws) and a mental realist (mentality is a real feature of the world and has the power to cause events of the world). His goal is to understand how the mind can "cause" anything in the physical world.


Kirkham Richard: THEORIES OF TRUTH (MIT Press, 1992)

A philosophical (and probably unique) introduction to a variety of modern theories of truth: Charles Peirce's pragmaticism, William James' instrumentalism, Brand Blanshard's coherence theory (truth as a fully coherent set of beliefs), Russell's congruence theory and theory of types, Austin's correlation theory, Tarski's correspondence theory. Theories of justification (how to identify the properties of true statements by reference to which the truth of a statement can be judged) are treated as separate from theories of truth, as are theories of speech acts. The systems of Davidson, Dummett, Kripke and Prior are reviewed and criticized.


Kirschner, Mark and Gerhart, John: THE PLAUSIBILITY OF LIFE (2005)

Click here for the full review


Kitchener Robert: PIAGET'S THEORY OF KNOWLEDGE (Yale University Press, 1986)

One of the best introductions to genetic epistemology.


Kittay Eva: METAPHOR (Clarendon Press, 1987)

Click here for the full review


Klahr David: PRODUCTION SYSTEM MODELS OF LEARNING AND DEVELOPMENT (MIT Press, 1987)

A set of articles that provide an overview of production systems from the perspective of cognitive psychology and in the context of working computer programs. Includes Pat Langley's "A general theory of discrimination learning" (the PRISM project) and Paul Rosenbloom's "Learning by chunking" (the XAPS project).


Kleene Stephen: INTRODUCTION TO METAMATHEMATICS (North-Holland, 1964)

Kleene's three-valued logic was conceived to accommodate undecided mathematical statements. The third truth value signals a state of partial ignorance. In Kleene's "weak" connectives, the undecided value is assigned to any well-formed formula that has at least one undecided component; in his better-known "strong" connectives, a decided component can settle the value (a false conjunct makes a conjunction false, a true disjunct makes a disjunction true).
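
A minimal sketch of the two families of connectives over the values T, F and U (undecided):

  # Strong connectives: conjunction is the minimum and disjunction the maximum
  # under the ordering F < U < T; negation swaps T and F and leaves U fixed.
  ORDER = {"F": 0, "U": 1, "T": 2}

  def k3_not(a):
      return {"T": "F", "F": "T", "U": "U"}[a]

  def strong_and(a, b):
      return min(a, b, key=ORDER.get)

  def strong_or(a, b):
      return max(a, b, key=ORDER.get)

  def weak_and(a, b):
      # weak connective: any undecided component makes the whole formula undecided
      return "U" if "U" in (a, b) else strong_and(a, b)

  print(strong_and("F", "U"))  # F -- the false conjunct decides it
  print(strong_or("T", "U"))   # T -- the true disjunct decides it
  print(weak_and("F", "U"))    # U -- the undecided component propagates
  print(k3_not("U"))           # U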


Koch Christof: "Consciousness" (MIT Press, 2012)

Click here for the full review


Koch Christof: THE QUEST FOR CONSCIOUSNESS (Roberts, 2003)

Click here for the full review


Kodratoff Yves: INTRODUCTION TO MACHINE LEARNING (Morgan Kaufmann, 1988)

A technical, Prolog-oriented textbook on machine learning that starts with the theoretical foundations of production systems, deals with truth maintenance and then surveys a number of learning methods: Mitchell's version spaces, explanation-based (deductive) learning, analogical learning, clustering.


Klopf Harry: THE HEDONISTIC NEURON (Hemisphere, 1982)

Organisms actively seek stimulation. If homeostasis is the seeking of a steady-state condition, "heterostasis" is the seeking of maximum stimulation. All parts of the brain are independently seeking positive stimulation (or "pleasure") and avoiding negative stimulation (or "pain"). All parts are goal-driven in that, when responding to a given stimulus leads to "pleasure", the brain part will respond more frequently to that stimulus in the future; and vice versa.
In his neural model cognition and emotion coexist and complement each other. Emotion provides the sense of what organisms need. Cognition provides the means for achieving those needs.


Koestler Arthur: THE GHOST IN THE MACHINE (Henry Regnery, 1967)

Click here for the full review


Kohonen Teuvo: ASSOCIATIVE MEMORY (Springer Verlag, 1977)

The retrieval of information in memory occurs via associations. An associative memory is a system from which a set of information can be recalled by using any of its members. An adaptive associative network is viewed as a reasonable model for biological memory. Kohonen also argues for the biological plausibility of holographic associative memories. For each model a thorough mathematical treatment is provided.
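
A minimal numpy sketch of the correlation-matrix (outer-product) associative memory of the kind Kohonen analyzes, assuming nearly orthogonal random keys so that crosstalk between stored pairs stays small:

  import numpy as np

  rng = np.random.default_rng(1)
  n_key, n_value, n_pairs = 64, 32, 5

  # random bipolar keys are nearly orthogonal in high dimensions
  keys = rng.choice([-1.0, 1.0], size=(n_pairs, n_key))
  values = rng.choice([-1.0, 1.0], size=(n_pairs, n_value))

  # store: sum of outer products (a Hebbian-style correlation matrix)
  W = sum(np.outer(v, k) for k, v in zip(keys, values))

  # recall: multiply a (possibly corrupted) key by W and take the sign
  cue = keys[2].copy()
  cue[:6] *= -1                                # corrupt part of the cue
  recalled = np.sign(W @ cue)
  print(np.array_equal(recalled, values[2]))   # True with high probability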


Kohonen Teuvo: SELF-ORGANIZATION AND ASSOCIATIVE MEMORY (Springer Verlag, 1984)

A formal study of memory from a systems-theory viewpoint.
Kohonen builds a psychologically plausible model of how the brain represents the world topographically, with nearby units responding similarly. His model is therefore capable of self-organizing into regions.
Kohonen's connectionist architecture, inspired by Malsburg's studies on the self-organization of cells in the cerebral cortex, is able to perform unsupervised training, i.e. it learns categories by itself.
Instead of using Hebbian learning, Kohonen assumes that the overall synaptic resources of a cell are approximately constant and that what changes is the relative efficacy of its synapses. A neural network has learned a new concept when the weights of its connections converge towards a stable configuration. This model exhibits mathematical properties that set it apart: the layering of neurons plays a specific role (the wider the intermediate layer, the faster but the more approximate the process of categorization).
A variant of Hebb's law yields competitive behavior.
Kohonen also reviews classical learning systems (Adaline, Perceptron) and holographic memories.
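
A minimal numpy sketch of a generic self-organizing map training loop in this spirit (parameters and decay schedules are illustrative, not Kohonen's exact formulation):

  import numpy as np

  rng = np.random.default_rng(2)
  grid = 10                      # 10x10 map of units
  dim = 3                        # input dimensionality (e.g. RGB colors)
  weights = rng.random((grid, grid, dim))
  rows, cols = np.indices((grid, grid))

  def train(data, epochs=20, lr0=0.5, sigma0=3.0):
      for t in range(epochs):
          lr = lr0 * np.exp(-t / epochs)          # decaying learning rate
          sigma = sigma0 * np.exp(-t / epochs)    # shrinking neighborhood radius
          for x in data:
              # best-matching unit: the unit whose weight vector is closest to the input
              d = np.linalg.norm(weights - x, axis=2)
              bi, bj = np.unravel_index(np.argmin(d), d.shape)
              # neighborhood function: units near the winner on the grid are pulled toward x too
              g = np.exp(-((rows - bi) ** 2 + (cols - bj) ** 2) / (2 * sigma ** 2))
              weights[...] += lr * g[..., None] * (x - weights)

  train(rng.random((200, dim)))

The neighborhood function is what produces the topographic ordering: units that are close on the grid are updated together, so nearby units end up responding to similar inputs.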


Kohonen Teuvo: SELF-ORGANIZING MAPS (Springer Verlag, 1995)

The Adaptive-Subspace Self Organizing Map (ASSOM) is an algorithm for neural networks that combines Learning Subspace Method (LSM), the first supervised competitive-learning algorithm ever, and Self Organizing Map (SOM), another algorithm invented by Kohonen, that maps patterns close to each other in the input space onto contiguous locations in the output space (topology preserving). The new algorithm is capable of detecting invariant features.


Kolmogorov Andrei: SELECTED WORKS (Reidel, 1998)

Selected papers by the inventor of the discipline of algorithmic complexity. Also see Li, Ming.


Kolodner Janet & Riesbeck Christopher: EXPERIENCE, MEMORY, AND REASONING (Lawrence Erlbaum, 1986)

An introduction to computational theories of memory that are derived from the conceptual dependency theory. Each article is written by an expert in the field. Schank writes about explanation-based learning. Lebowitz describes his RESEARCHER project. Lytinen discusses his word-based parsing technique. Riesbeck introduces his direct memory access parsing system.


Kolodner Janet: CASE-BASED REASONING (Morgan Kaufmann, 1993)

A monumental summary of the discipline of case-based systems that also attempts to lay logical foundations for the field. Emphasis is placed on the views of learning as a by-product of reasoning, and reasoning as remembering; on the essential task of adapting old solutions to solve new problems (old cases to explain new situations). Schank's cognitive model of dynamic memory (MOPs and the like) is introduced at length. Some of the historical systems (CHEF, CYRUS, etc) are discussed. The book provides detailed techniques for storing, indexing, retrieving, matching and using cases.
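
A toy retrieve-and-adapt sketch of the case-based reasoning cycle (the feature scheme and adaptation rule are invented for illustration, not Kolodner's systems):

  # Each case pairs a problem description (feature dict) with a stored solution.
  case_library = [
      {"problem": {"cuisine": "italian", "guests": 4, "vegetarian": False},
       "solution": "lasagna for 4"},
      {"problem": {"cuisine": "italian", "guests": 2, "vegetarian": True},
       "solution": "mushroom risotto for 2"},
      {"problem": {"cuisine": "mexican", "guests": 6, "vegetarian": False},
       "solution": "beef tacos for 6"},
  ]

  def similarity(p, q):
      # count of matching features: a crude nearest-neighbor index
      return sum(1 for k in p if p.get(k) == q.get(k))

  def retrieve(problem):
      return max(case_library, key=lambda c: similarity(problem, c["problem"]))

  def adapt(case, problem):
      # toy adaptation: reuse the dish, re-scale it to the new number of guests
      dish = case["solution"].rsplit(" for ", 1)[0]
      return f"{dish} for {problem['guests']}"

  new_problem = {"cuisine": "italian", "guests": 8, "vegetarian": True}
  print(adapt(retrieve(new_problem), new_problem))   # 'mushroom risotto for 8'
  # Storing the new (problem, solution) pair back in the library is the
  # "learning as a by-product of reasoning" step.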


Kolodner Janet: RETRIEVAL AND ORGANIZATIONAL STRATEGIES IN CONCEPTUAL MEMORY (Lawrence Erlbaum, 1984)

A description of the CYRUS system, which was based on Schank's conceptual dependency theory.


Kosko Bart: NEURAL NETWORKS AND FUZZY SYSTEMS (Prentice Hall, 1992)

Click here for the full review


Kosko Bart: FUZZY THINKING (Hyperion, 1993)

Fuzziness is pervasive in nature ("everything is a matter of degree"), while science does not admit fuzziness.
Even probability theory still assumes that properties are crisp. And probability (according to Kosko's "subsethood" theorem) can be interpreted as a measure of how much the whole (the space of all events) is contained in the part (the event). Kosko shows how logical paradoxes such as Russell's can be interpreted as "half truths" in the context of fuzzy logic. Heisenberg's uncertainty principle (the more a quantity is accurately determined, the less accurately a conjugate quantity can be determined, which holds for position and momentum, time and energy) can be reduced to the Cauchy-Schwarz inequality (which is related to the Pythagorean theorem, which is in turn related to the subsethood theorem).
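
The subsethood measure itself is simple to state for finite fuzzy sets: S(A,B) = sum(min(a_i, b_i)) / sum(a_i), the degree to which A is contained in B; Kosko reads P(A) as the degree to which the whole space is contained in the part A. A minimal sketch:

  def subsethood(a, b):
      # degree S(A, B) to which fuzzy set A is contained in fuzzy set B,
      # for membership vectors a and b over a finite universe
      return sum(min(x, y) for x, y in zip(a, b)) / sum(a)

  A = [0.8, 0.4, 0.0, 0.6]
  B = [1.0, 0.5, 0.2, 0.3]
  print(subsethood(A, B))   # about 0.83: A is largely contained in B
  print(subsethood(B, B))   # 1.0: every set is fully contained in itself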
Applications such as fuzzy associative memories, adaptive fuzzy systems and fuzzy cognitive maps are discussed at length.
Kosko even discusses why the universe exists (because otherwise the fuzzy entropy theorem would exhibit a singularity) and speculates that the universe is information and maybe God himself is information.
Too much autobiography and too many references to eastern religion try to make the book more accessible but probably merely detract from the subject.


Kosslyn Stephen: IMAGE AND MIND (Harvard University Press, 1980)

"Mental imagery" is seeing something in the absence of any sensory signal, such as the perception of a memory. Kosslyn analyzes what is seen when in the brain there is no such image, and why we need mental imagery at all.
Based on numerous psychological experiments, Kosslin maintains that mental imagery is pictorial in character, i.e. that mental imagery involves scanning an internal picture-like entity. Mental images can be inspected and classified using pretty much the same processes used to inspect and classify visual perceptions.
To explain the structure of mental imagery Kosslyn puts forth a representational theory of the mind of a "depictive" type, as opposed to Fodor's propositional theory and related to Johnson-Laird's models. Kosslyn thinks that the mind can build visual representations, which are coded in parts of the brain, and which reflect what they represent. Such representations can be inspected by the mind and transformed (rotated, enlarged, reduced).
There exist two levels of visual representation: a "geometric" level, which allows one to mentally manipulate images, and an "algebraic" one, which allows one to "speak" about those images.
Kosslyn thinks that mental imagery achieves two goals: retrieve properties of objects and predict what would happen if the body or the objects should move in a given way. Reasoning on shapes and dimensions is far faster when we employ mental images rather than concepts.


Kosslyn Stephen: GHOSTS IN THE MIND'S MACHINE (W. Norton, 1983)

An introduction to Kosslyn's theory of mental imagery oriented towards a computer implementation.


Kosslyn Stephen & Koenig Olivier: WET MIND (Free Press, 1992)

An overview of cognitive neuroscience, i.e. of psychological studies based on the principle that "the mind is what the brain does", i.e. theories that describe mental events by means of brain activities.
Chapters on neural computation, vision, language, movement, memory.


Kosslyn Stephen: IMAGE AND BRAIN (MIT Press, 1994)

This book revises and expands the contents and conclusions of "Image and Mind".
Kosslyn's proposal for the resolution of the imagery debate is an interdisciplinary theory of high-level vision in which perception and representation are inextricably linked. Visual perception (visual object identification) and visual mental imagery share common mechanisms.
Visual processing is decomposed into a number of subsystems, each a neural network: the visual buffer (located in the occipital lobe), the attention window (which selects a pattern of activity in the visual buffer), two cortical visual systems, the ventral system (inferior temporal lobe, which encodes object properties) and the dorsal system (posterior parietal lobe, which encodes spatial properties), associative memory (which integrates the two classes of properties), the information-lookup subsystem (dorsolateral prefrontal cortex, which accesses information about the most relevant object in associative memory), and attention-shifting subsystems (frontal, parietal and subcortical areas, which direct the attention window to the appropriate location). The subsystems may overlap and exchange feedback. More detailed analyses of the visual recognition process identify more specialized subsystems. The model is therefore gradually extended to take into account the full taxonomy of visual abilities.
Mental imagery shares most of this processing architecture with high-level visual perception.
During the course of the development of the theory, a wealth of psychological and neurophysiological findings is provided.


Kotre John: WHITE GLOVES (Norton, 1996)

Click here for the full review


Koza John: GENETIC PROGRAMMING (MIT Press, 1992)

One of the seminal books on "genetic" programming by means of natural selection. The solution to a problem is found by genetically breeding populations of computer programs. A computer is therefore enabled to solve problems without being explicitly programmed to solve them. The process of finding a solution to a problem is turned into the process of searching the space of computer programs for a highly fit individual computer program to solve such a problem.
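
A minimal sketch of the idea: expression trees bred by selection, crossover and mutation toward a target function (a generic toy genetic-programming loop, not Koza's LISP-based system; all parameters are illustrative):

  import random, operator

  random.seed(0)
  FUNCS = {"+": operator.add, "-": operator.sub, "*": operator.mul}
  TERMINALS = ["x", 1.0]
  CASES = [i / 2.0 for i in range(-6, 7)]
  target = lambda x: x * x + x               # function the programs should rediscover

  def random_tree(depth=3):
      # grow a random expression tree over the function and terminal sets
      if depth == 0 or random.random() < 0.3:
          return random.choice(TERMINALS)
      op = random.choice(list(FUNCS))
      return (op, random_tree(depth - 1), random_tree(depth - 1))

  def evaluate(tree, x):
      if tree == "x":
          return x
      if isinstance(tree, float):
          return tree
      op, left, right = tree
      return FUNCS[op](evaluate(left, x), evaluate(right, x))

  def error(tree):
      # fitness: total absolute error over the fitness cases (lower is better)
      return sum(abs(evaluate(tree, x) - target(x)) for x in CASES)

  def paths(tree, path=()):
      # paths to every subtree; used to pick crossover and mutation points
      yield path
      if isinstance(tree, tuple):
          yield from paths(tree[1], path + (1,))
          yield from paths(tree[2], path + (2,))

  def subtree(tree, path):
      for i in path:
          tree = tree[i]
      return tree

  def replace(tree, path, new):
      if not path:
          return new
      node = list(tree)
      node[path[0]] = replace(tree[path[0]], path[1:], new)
      return tuple(node)

  def crossover(a, b):
      # swap a random subtree of b into a random point of a
      return replace(a, random.choice(list(paths(a))),
                     subtree(b, random.choice(list(paths(b)))))

  def mutate(tree):
      # replace a random subtree with a freshly grown one
      return replace(tree, random.choice(list(paths(tree))), random_tree(2))

  def tournament(pop, k=5):
      return min(random.sample(pop, k), key=error)

  population = [random_tree() for _ in range(200)]
  for generation in range(30):
      best = min(population, key=error)
      if error(best) < 1e-9:
          break
      children = [best]                      # elitism: keep the best program
      while len(children) < len(population):
          child = crossover(tournament(population), tournament(population))
          if random.random() < 0.2:
              child = mutate(child)
          children.append(child)
      population = children

  best = min(population, key=error)
  print(best, error(best))                   # best evolved expression and its error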


Koza John: GENETIC PROGRAMMING II (MIT Press, 1994)

Focuses on automatic function definition for the decomposition of complex problems.


Kripke Saul: NAMING AND NECESSITY (Harvard University Press, 1980)

Click here for the full review


Kuipers Benjamin: QUALITATIVE REASONING (MIT Press, 1994)

A unified theory of qualitative reasoning.
Qualitative reasoning is viewed as a set of methods for representing and reasoning with incomplete knowledge about physical systems. A qualitative description of a system allows for common sense reasoning that overcomes the limitations of rigorous logic. Qualitative descriptions capture the essential aspects of structure, function and behavior, at the expense of others. Since most phenomena that matter to ordinary people depend only on those essential aspects, qualitative descriptions are enough for moving about in the world.
Kuipers presents his QSIM algorithm and representation for qualitative simulation. His model deals with partial knowledge of quantities (through landmark values and fuzzy values) and of change (by using discrete state graphs and qualitative differential equations). A qualitative differential equation is a quadruple of variables, quantity spaces (one for each variable), constraints (that apply to the variables) and transitions (rules to define the domain boundaries).
The framework prescribes a number of constraint propagation techniques, including higher-order derivatives and global dynamics. First of all, it is necessary to build a model that includes all the elements needed for simulating the system (closed-world assumption). Then the model can be simulated. The ontological problem is solved by drawing from various techniques (Forbus' qualitative process theory, Sussman's device modeling approach, DeKleer's "no function in structure").
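
The flavor of qualitative arithmetic can be shown with the standard sign algebra over which such constraints are checked (a generic sketch, not the QSIM implementation): qualitative values are signs plus an "unknown", and combining a positive and a negative influence yields an ambiguous result on which a qualitative simulator must branch.

  # Qualitative values: '+' (positive/increasing), '0', '-', and '?' (ambiguous/unknown).
  def q_add(a, b):
      if "?" in (a, b):
          return "?"
      if a == "0":
          return b
      if b == "0":
          return a
      return a if a == b else "?"     # '+' plus '-' is ambiguous: the simulator branches here

  def q_mul(a, b):
      if "?" in (a, b):
          return "?"
      if "0" in (a, b):
          return "0"
      return "+" if a == b else "-"

  # e.g. the constraint d(level)/dt = inflow - outflow, checked qualitatively:
  inflow, outflow = "+", "+"
  print(q_add(inflow, q_mul("-", outflow)))   # '?': the level may rise, fall, or stay constant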


Kulas Jack, Fetzer James & Rankin Terry: PHILOSOPHY, LANGUAGE AND ARTIFICIAL INTELLIGENCE (Kluwer, 1988)

A collection of historical articles on semantics, including Davidson's "Truth and meaning" (1967), Grice's "Utterer's meaning" (1968), Hintikka's "Semantics for propositional attitudes" (1969), Montague's "The proper treatment of quantification in ordinary English" (1973), Gazdar's "Phrase-structure grammar" (1982), Stalnaker's "Possible worlds and situations".
Kulas provides a historical introduction to the field, starting with Aristotle.


Kuppers Bernd-Olaf: INFORMATION AND THE ORIGIN OF LIFE (MIT Press, 1990)

Click here for the full review


Kurzweil, Ray: "The Age of Intelligent Machines" (1990)

Click here for the full review


Kurzweil, Ray: "The Age of Spiritual Machines" (1999)

Click here for the full review


Kurzweil, Ray: "The Singularity is Near" (2005)

Click here for the full review


