Matthew Ginsberg:
READINGS IN NONMONOTONIC LOGIC (Morgan Kaufmann, 1987)



(Copyright © 2000 Piero Scaruffi)


Ginsberg introduces the problems that led to the development of nonmonotonic logics, first and foremost default property inheritance (from "Tweety is a bird" one concludes, by default, that "Tweety can fly"). He then classifies the formal approaches to nonmonotonic inference: proof-based approaches (Raymond Reiter's default logic), modal approaches (Drew McDermott's nonmonotonic logic, Robert Moore's autoepistemic logic) and minimization approaches (John McCarthy's circumscription).
Reiter's "A logic for default reasoning" (1980) builds on the "closed-world assumption" (what cannot be shown true is assumed false), i.e. negation as failure to derive: if a ground atomic formula cannot be proved from the premises, assume its negation. It then defines default logic around inference rules of the form "if A is true and it is consistent to assume B, then assume B" (in other words, if the premise holds, conclude the consequent unless doing so contradicts what is known).
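The interplay between negation as failure and a Reiter-style default can be sketched in a few lines of Python. This is an illustrative toy, not code from the book: the predicates `bird`, `penguin`, `flies` and `not_flies` encode the review's Tweety example, and the "consistency check" of the default is approximated by testing that the blocking fact is not derivable.

```python
# Toy default reasoning: a default applies unless its conclusion's
# negation is derivable from the strict rules (negation as failure).

facts = {("bird", "tweety"), ("bird", "opus"), ("penguin", "opus")}
rules = [
    # strict rule: penguins cannot fly
    (("penguin",), ("not_flies",)),
]
# default rule: bird(x) : flies(x) / flies(x), blocked by not_flies(x)
defaults = [(("bird",), "flies", "not_flies")]

def closure(facts, rules):
    """Forward-chain the strict rules to a fixpoint."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequents in rules:
            for _, x in list(derived):
                if all((a, x) in derived for a in antecedents):
                    for c in consequents:
                        if (c, x) not in derived:
                            derived.add((c, x))
                            changed = True
    return derived

def apply_defaults(facts, rules, defaults):
    """Conclude B for x by default unless B's negation is derivable."""
    known = closure(facts, rules)
    for antecedents, conclusion, blocker in defaults:
        for _, x in list(known):
            if all((a, x) in known for a in antecedents):
                if (blocker, x) not in known:      # consistency check
                    known.add((conclusion, x))
    return known

beliefs = apply_defaults(facts, rules, defaults)
print(("flies", "tweety") in beliefs)  # True: the default applies
print(("flies", "opus") in beliefs)    # False: blocked by the penguin rule
```

Note the nonmonotonicity: adding the fact `("penguin", "tweety")` would retract the conclusion `("flies", "tweety")` on the next run.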
McDermott's modal formulation (1980) is based on a coherence operator: "P is coherent with what is known" if P cannot be proven false from what is known.
Moore's "Semantical Considerations on Nonmonotonic Logic" (1985) removed some of the problems with McDermott's logic with his "autoepistemic logic", based on the notion of belief (related to McDermott's coherence), which extends Hintikka's epistemic modal logic with introspection: the logic models the beliefs of an agent reflecting upon its own beliefs. Moore also provides a possible-world semantics for his logic.
In 1987 Kurt Konolige proved that autoepistemic logic and default logic are formally equivalent.
Yoav Shoham (1987) argues that all approaches to nonmonotonic reasoning can be reduced to a partial ordering on the set of all models for a theory.
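Shoham's reduction can be sketched directly: nonmonotonic entailment becomes truth in all models that are minimal under a preference order. The encoding below is an assumption for illustration (models as sets of "abnormal" objects, the order as strict set inclusion), not Shoham's own formalism.

```python
# Preferential entailment: a formula follows nonmonotonically if it
# holds in every model that no other model strictly improves upon.

def preferred(models, better):
    """Models for which no strictly better model exists."""
    return [m for m in models if not any(better(o, m) for o in models)]

def pref_entails(models, better, prop):
    """True if prop holds in all preferred models."""
    return all(prop(m) for m in preferred(models, better))

# models as sets of abnormal objects; fewer abnormalities are better
models = [frozenset(), frozenset({"ab1"}), frozenset({"ab2"})]
better = lambda a, b: a < b            # strict subset = strictly better
flies = lambda m: "ab1" not in m       # flies unless ab1 holds
print(pref_entails(models, better, flies))  # True: only {} is preferred
```

Different nonmonotonic logics then correspond to different choices of the `better` relation over the same set of classical models.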
Ginsberg argues that a variety of approaches to nonmonotonic reasoning can be unified by resorting to multi-valued logics.
Jon Doyle's "Truth Maintenance System" (1979) was the first effective computational framework for default reasoning. It consists of a problem solver that draws inferences and a system that records those inferences together with their "justifications": it maintains beliefs and the justifications for those beliefs. Whenever a contradiction is discovered, it keeps the database consistent through dependency-directed backtracking: it identifies the assumptions responsible for the contradiction and adds justifications that remove it.
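The core bookkeeping of a justification-based TMS can be sketched as follows. This is a minimal illustration, not Doyle's implementation, and it omits dependency-directed backtracking: it only shows how beliefs are labeled IN or OUT from premises and justifications, and how retracting a premise automatically retracts everything that depended on it.

```python
# Minimal justification-based TMS: a belief is IN if some recorded
# justification has all of its antecedents IN; premises are always IN.

class TMS:
    def __init__(self):
        self.justifications = {}   # node -> list of antecedent tuples
        self.premises = set()

    def add_premise(self, node):
        self.premises.add(node)

    def retract_premise(self, node):
        self.premises.discard(node)

    def justify(self, node, antecedents):
        self.justifications.setdefault(node, []).append(tuple(antecedents))

    def labels(self):
        """Recompute the IN set from premises and justifications."""
        in_set = set(self.premises)
        changed = True
        while changed:
            changed = False
            for node, justs in self.justifications.items():
                if node not in in_set and any(
                    all(a in in_set for a in j) for j in justs
                ):
                    in_set.add(node)
                    changed = True
        return in_set

tms = TMS()
tms.add_premise("bird(tweety)")
tms.justify("flies(tweety)", ["bird(tweety)"])
print("flies(tweety)" in tms.labels())  # True
tms.retract_premise("bird(tweety)")
print("flies(tweety)" in tms.labels())  # False: its justification is lost
```

The point of recording justifications rather than just conclusions is visible in the last two lines: no inference has to be redone by the problem solver when a premise changes.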
Johan de Kleer (1986) improved the concept with his "assumption-based" TMS (ATMS), which labels each derived proposition with the sets of assumptions needed to derive it (its "contexts").
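The label computation at the heart of the ATMS can be sketched by a small fixpoint. This is an illustrative reconstruction under assumed encodings (justifications as antecedent/consequent pairs, environments as frozensets of assumptions), not de Kleer's algorithm as published.

```python
# ATMS-style labeling: each node's label is the set of minimal
# environments (sets of assumptions) from which it can be derived.

from itertools import product

def minimize(envs):
    """Keep only minimal environments (drop supersets of another)."""
    envs = set(envs)
    return {e for e in envs if not any(o < e for o in envs)}

def atms_labels(assumptions, justifications):
    """justifications: list of (antecedents, consequent) pairs."""
    label = {a: {frozenset([a])} for a in assumptions}
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in justifications:
            if not all(a in label for a in antecedents):
                continue
            # combine one environment per antecedent
            new = {
                frozenset().union(*combo)
                for combo in product(*(label[a] for a in antecedents))
            }
            old = label.get(consequent, set())
            merged = minimize(old | new)
            if merged != old:
                label[consequent] = merged
                changed = True
    return label

labels = atms_labels(
    assumptions={"A", "B", "C"},
    justifications=[(["A", "B"], "p"), (["C"], "p"), (["p"], "q")],
)
print(labels["q"])  # the two minimal contexts: {A, B} and {C}
```

Because every derivation carries its assumptions, switching contexts is free: no backtracking is needed to ask "does q hold if only C is assumed?".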
In 1986 Steve Hanks and Drew McDermott introduced the "temporal projection problem" (which arises when trying to infer which facts hold after a sequence of events has occurred) and showed that the standard nonmonotonic approaches cannot deal with it: the logic should select not the minimal models, but the chronologically minimal models.
Shoham's "Chronological ignorance" (1986) formalizes the idea of chronological minimization by temporally ordering the conflicting extensions that underlie it and preferring the later ones (those in which abnormality occurs as late as possible).
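Chronological minimization admits a very compact sketch. Under the assumed encoding below (a model as a tuple of booleans, `ab[t]` meaning "something abnormal happens at time t"), preferring abnormality as late as possible is exactly lexicographic minimization of the abnormality vector.

```python
# Chronological minimization: at the first instant where two models
# differ, prefer the one without the abnormality, i.e. postpone
# abnormality as long as possible.

def chronologically_minimal(models):
    # Lexicographic order on boolean tuples does exactly this:
    # False (normal) sorts before True (abnormal) at the earliest
    # point of difference.
    return min(models)

# Two competing extensions of a Yale-shooting-like theory:
#   model 1: the abnormality happens early (t=1)
#   model 2: the abnormality happens late (t=2)
models = [
    (False, True, False),   # abnormality at t=1
    (False, False, True),   # abnormality at t=2
]
print(chronologically_minimal(models))  # (False, False, True)
```

Both models are minimal in the ordinary sense (one abnormality each), which is precisely the ambiguity Hanks and McDermott exposed; the chronological order breaks the tie in favor of the later abnormality.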
By employing possible worlds, Ginsberg's "Reasoning about action" (1987) solves the frame, ramification and qualification problems and circumvents the temporal projection problem. As in Richard Fikes' STRIPS, a single model of the world is updated when actions are performed: the nearest world to the current one in which the consequences of the action hold is constructed by finding proofs of the negation of the action's explicit consequences and removing one premise of each proof from the current world.
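The nearest-world construction can be approximated with a brute-force sketch. This is not Ginsberg's implementation: here a world is a plain set of facts, consistency is checked against an assumed list of forbidden fact combinations rather than by theorem proving, and "nearest" simply means dropping as few current facts as possible.

```python
# Possible-worlds update: find a maximal subset of the current world
# that is consistent with the action's consequences, then add them.

from itertools import combinations

def consistent(world, constraints):
    """A world is consistent if it contains no forbidden combination."""
    return not any(bad <= world for bad in constraints)

def nearest_world(world, consequences, constraints):
    """Return a closest world containing `consequences`:
    retain as many current facts as possible."""
    world = frozenset(world)
    for keep in range(len(world), -1, -1):        # largest subsets first
        for subset in combinations(world, keep):
            candidate = frozenset(subset) | consequences
            if consistent(candidate, constraints):
                return candidate
    return consequences

world = {"on(block,table)", "clear(block)"}
# a block cannot be on the table and on the box at the same time
constraints = [frozenset({"on(block,table)", "on(block,box)"})]
new = nearest_world(world, frozenset({"on(block,box)"}), constraints)
print(sorted(new))  # on(block,table) is retracted, clear(block) survives
```

The frame problem is handled implicitly: `clear(block)` persists not because a frame axiom says so, but because nothing forced its removal.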
The book contains McCarthy's "Some philosophical problems from the standpoint of Artificial Intelligence" (1969), "Epistemological problems of Artificial Intelligence" (1977), and "Circumscription" (1980).
The first article introduced the situation calculus and the frame problem; the second identifies the qualification problem; the third details his theory of circumscription.
According to McCarthy, a knowledge representation must satisfy three fundamental requirements: ontological (it must allow one to describe the relevant facts), epistemological (to express the relevant knowledge) and heuristic (to perform the relevant inferences). Artificial Intelligence can then be defined as the discipline that studies what can be represented in a formal manner (epistemology) and computed in an efficient manner (heuristics). The language of logic satisfies those requirements: it allows us to express everything we know and to compute with what is expressed. Each body of knowledge is in effect a mathematical theory.
McCarthy's situation calculus represents temporally limited events through "situations" (snapshots of the world at a given time), by associating with each moment in time a situation of the world (the set of facts that are then true). A state is expressed by a formula of first-order predicate logic relating the objects in that state; an action (or event) is a function from states to states. An interval of time is a sequence of situations, a "chronicle" of the world; the history of the world is a partially ordered sequence of states and actions. The property of states is permanence, the property of actions is change. Causal relations between situations can then be computed.
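The "actions as functions from states to states" idea can be sketched concretely. This is an assumed encoding for illustration (STRIPS-style add/delete sets over propositional facts), not McCarthy's first-order axiomatization: `result(a, s)` is the situation obtained by doing action `a` in situation `s`.

```python
# Situations as immutable fact sets; actions as state-to-state functions.

def result(action, situation):
    """The situation obtained by performing `action` in `situation`."""
    adds, deletes = action
    return (situation - deletes) | adds

# two actions over simple propositional facts
pickup = (frozenset({"holding(block)"}),
          frozenset({"on(block,table)", "handempty"}))
putdown = (frozenset({"on(block,table)", "handempty"}),
           frozenset({"holding(block)"}))

s0 = frozenset({"on(block,table)", "handempty"})
s1 = result(pickup, s0)
s2 = result(putdown, s1)
print(s1 == frozenset({"holding(block)"}))  # True
print(s2 == s0)                             # True: a chronicle back to s0
```

The sequence s0, s1, s2 is a small "chronicle" in the sense above; in McCarthy's actual calculus the same structure is expressed by nested terms like result(putdown, result(pickup, s0)) inside first-order formulas.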
McCarthy's "frame problem" is that one cannot explicitly represent everything that does not change in the universe as a result of an action, as that is an infinite set of facts. Complementary problems are the "ramification problem" (infinitely many things do change, because one can describe the consequences of an action in greater and greater detail) and the "qualification problem" (the number of preconditions to an action is also potentially infinite).
Circumscription deals with default inference by minimizing abnormality: an axiom stating which objects are abnormal is added to the theory of what is known (predicate circumscription). The only objects that have a certain property are those that can be shown, from what is known of the world, to have it (or: the only individuals for which the property holds are those for which it must hold). This definition involves a second-order quantifier. It is analogous to Frege's method of forming the second-order definition of a set of axioms: such a definition allows both the derivation of the original recursive axioms and an induction scheme stating that nothing else satisfies those axioms.
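Over a finite domain, the effect of circumscribing an abnormality predicate can be computed by brute force, which makes the "only those that must be abnormal" reading concrete. This is an illustrative finite-model sketch, not McCarthy's second-order schema; the theory below simply says that every penguin is abnormal with respect to flying.

```python
# Finite-domain circumscription: among all interpretations of `ab`
# satisfying the theory, keep only those with a minimal extension of
# `ab`; what holds in every minimal model is what circumscription entails.

from itertools import combinations

domain = ["tweety", "opus"]
bird = {"tweety", "opus"}
penguin = {"opus"}

def satisfies(ab):
    """Theory: every penguin is abnormal (with respect to flying)."""
    return penguin <= ab

# all candidate extensions of ab over the domain
candidates = [frozenset(c) for r in range(len(domain) + 1)
              for c in combinations(domain, r)]
models = [ab for ab in candidates if satisfies(ab)]
# circumscribe ab: keep only the models with a minimal extension
minimal = [m for m in models if not any(o < m for o in models)]

# in every minimal model, the birds that fly are exactly the non-abnormal ones
for m in minimal:
    print({x for x in bird if x not in m})  # {'tweety'}
```

Both {opus} and {tweety, opus} satisfy the theory, but only the first is minimal, so circumscription concludes that Tweety flies and Opus does not; nothing is abnormal beyond what the axioms force.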