# Against Nonmonotonic Logic (DRAFT)

Michael J. O'Donnell¹

8 April, 1993

### Abstract:

The monotonicity property of inference in conventional formal systems of logic (Γ ⊆ Δ and Γ ⊢ P implies Δ ⊢ P) appears to prohibit certain crucial steps in practical intuitive reasoning. For example, reasoning by default (using assertions that are assumed to hold until explicitly contradicted) appears to be impossible in monotonic systems. Many crucial steps in practical reasoning certainly require nonmonotonic behaviors in inference procedures. But, I argue that it is highly misleading to describe these nonmonotonic behaviors in terms of nonmonotonic inference relations. Rather, inference procedures should use metatheoretic operators, with conventionally monotonic inference relations, to support nonmonotonic behavior.

# Defeasible Reasoning Requires Nonmonotonicity

Most formal systems for logical reasoning are essentially definitions of a relation ⊢, called logical inference. When Γ is a set of propositions expressed in the language used by such a formal system, and P is another such proposition, Γ ⊢ P means that the formal system allows P to be inferred from hypotheses in Γ. In the vast majority of formal systems studied by logicians, the relation ⊢ is required to be monotonic in its left argument.

Definition 1   A logical inference relation ⊢ is monotonic if and only if, for all sets of propositions Γ and Δ, and for all propositions P, if Γ ⊆ Δ and Γ ⊢ P, then Δ ⊢ P.

Certain practical considerations appear to argue in favor of formal systems with nonmonotonic logical inference relations. I will sketch these practical considerations, but argue nonetheless that they are best accommodated by logical inference relations that are conventionally monotonic.

Practical reasoning often involves defeasible [#!defeasible!#] (retractable) steps. In particular, there are often default assumptions [#!default!#] about typical cases, which are accepted until specific reasons to reject them are found. The example of concluding that a given bird flies until learning that the bird is a penguin has achieved chestnut status.

Example 1 (Penguins [#!penguins!#])   Consider a formal language with formulae expressing at least the following six propositions:

bird(George), penguin(George), flies(George), ¬flies(George), ∀x (penguin(x) → bird(x)), ∀x (penguin(x) → ¬flies(x))

Given knowledge of the set of three propositions

Γ = { bird(George), ∀x (penguin(x) → bird(x)), ∀x (penguin(x) → ¬flies(x)) }

it seems quite rational to suppose that George is in fact a typical bird, and conclude that flies(George). That relation between a set of propositions and another proposition is often expressed as

Γ ⊢ flies(George)

But, given the additional knowledge that penguin(George), it seems irrational to conclude that flies(George). Letting

Δ = Γ ∪ { penguin(George) }

this relation is expressed as

Δ ⊬ flies(George)

But

Γ ⊆ Δ

so the appropriate formal system of logic in which to perform practical reasoning about birds, penguins, and ability to fly seems to be nonmonotonic.
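The nonmonotonic behavior in the example can be made concrete with a small sketch (my own illustration, not part of the paper; the string encoding of propositions is informal):

```python
# A toy default-reasoning step for the penguin example. The conclusion
# flies(George) is drawn by default from knowledge of bird(George),
# unless the knowledge at hand explicitly marks George as a penguin.

def concludes_flies(knowledge):
    """Return True when the default 'birds fly' licenses flies(George)."""
    return "bird(George)" in knowledge and "penguin(George)" not in knowledge

gamma = {"bird(George)",
         "all penguins are birds",
         "no penguin flies"}
delta = gamma | {"penguin(George)"}      # a superset of gamma

print(concludes_flies(gamma))   # True:  gamma supports flies(George)
print(concludes_flies(delta))   # False: the superset delta does not,
                                # so the behavior is nonmonotonic
```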

The desire to have a reasoning procedure that assumes typical conditions until finding evidence of a special case seems to prohibit monotonicity in the penguin example above. But, a closer analysis reveals more than one formal site for the nonmonotonicity of the procedure.

# Two Formal Sites for Nonmonotonic Behavior

I believe that the penguin example, and many similar examples, provide a strong case for nonmonotonic behavior in a reasoning procedure. That is, there are certainly cases in practice where the discovery of new knowledge requires a retraction of old assertions. But, the explication of this nonmonotonic behavior can be attached to at least two different sites in a formal system of reasoning: the inference relation ⊢, and the formulae themselves. I will argue for the latter choice, but first I will reject some particular forms for doing so, in spite of their initial appeal.

In principle, nonmonotonic behavior of procedures using a monotonic inference relation may be explained by weakening the conclusions of inference. For example, instead of Γ ⊢ flies(George), we might have

Γ ⊢ typical(George) → flies(George)

or

Γ ⊢ ¬penguin(George) → flies(George)

Nonmonotonic behavior is produced by introducing a large or even infinite set of propositions, such as typical(George) or ¬penguin(George), as default assumptions, then removing them as soon as counterevidence appears. The extra clauses in conclusions serve as switches to enable or disable the real conclusions, such as flies(George), by respectively including or excluding the default assumptions.

The basic idea of weakening conclusions to maintain monotonicity may be refined [#!weakening!#], but I believe that most reasoners rightly reject this complication of the form of final conclusions. Since the normal mode of assertion of a proposition P in everyday life implicitly means something like, "As well as can be determined, P holds," it seems inappropriate to introduce explicit formal disclaimers into every proposition. The best known formal systems for mathematical reasoning (the Classical First-Order Predicate Calculus heads the list, but the same holds for higher-order classical logics, set theories, and formalizations of intuitionistic or constructive reasoning) all treat the formal assertion of a proposition as meaning something like, "P is absolutely true, with no possibility of error." But, nowhere in popular usage, not even in mathematical journals, is that formal notion actually practiced. If it were, Russell's antinomy would have been a catastrophe rather than a stimulus to further research. So, I reject the weakening of conclusions by explicit formal disclaimers, since it forces a detailed accounting for assumptions that are made uniformly and universally in the practice of language, and thereby obscures the natural structure of practical assertions of propositions.

Rather than explaining nonmonotonic behavior by weakening the statements of the conclusions of inference, we need to strengthen the hypotheses. An obvious possibility is to add the same sorts of propositions that are used to weaken the conclusions, yielding

Γ ∪ { typical(George) } ⊢ flies(George)

or the more transparent

Γ ∪ { ¬penguin(George) } ⊢ flies(George)

The trouble is that assertions of typicality, such as typical(George), do not represent direct intuitive judgements, but rather are themselves the results of sophisticated epistemological reasoning. Instead of begging the question of how one decides to assume typical(George), we should reduce that judgement to concepts that can be referred more directly to observations and intuition. The use of ¬penguin(George) avoids the explicit reference to typicality, but even more obviously begs the question of how we establish the hypothesis that ¬penguin(George), which is essentially as hard as the final conclusion flies(George).

Another way of augmenting the hypotheses is to use a formal operator Only, where for propositions P and sets of propositions Γ, Only(P, Γ) means that a reasonably diligent search for known propositions relevant to P yields only Γ. Then, the inference that flies(George) is represented by

Γ ∪ { Only(flies(George), Γ) } ⊢ flies(George)

while

Δ ∪ { Only(flies(George), Δ) } ⊬ flies(George)

This approach captures the idea that the information in Γ alone does not allow us to conclude rationally that flies(George); rather, the fact that a diligent search for knowledge bearing on flies(George) came up with nothing more than Γ is required to support the conclusion. Only(flies(George), Γ) is unusual in that it asserts something about the very inference procedure that is considering propositions built up with it, so the Only operator is metatheoretic and reflective [#!reflection!#], in some ways analogous to the negation as failure used in certain implementations of Prolog [#!negation-as-failure!#].

Only can trigger nonmonotonic behavior, since the addition of new information to the knowledge base, or even the discovery of more information that was already in it, can change Only(P, Γ) from true to false. In the penguin example, once we discover the proposition penguin(George) among our knowledge, the metatheoretic proposition Only(flies(George), Γ) becomes false, and the conclusion flies(George) can no longer be drawn, because the hypothesis required to infer it does not hold. Only(flies(George), Δ) becomes true at this point, but it does not support the inference that flies(George).
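A rough sketch of how such a metatheoretic operator (called `only` here) might be evaluated follows. This is entirely my own construction: the search procedure and its crude relevance criterion ("mentions the same individual as the search key") are stand-ins, not the paper's proposal.

```python
# only(p, gamma) is evaluated metatheoretically: it holds exactly when a
# simulated diligent search for propositions relevant to p returns the
# set gamma and nothing more.

def diligent_search(key, knowledge_base):
    """Return all stored propositions mentioning the key's individual."""
    subject = key[key.index("(") + 1 : key.index(")")]
    return {p for p in knowledge_base if subject in p}

def only(p, gamma, knowledge_base):
    """Metatheoretic check: the search for p yields exactly gamma."""
    return diligent_search(p, knowledge_base) == set(gamma)

kb = {"bird(George)", "no penguin flies"}
print(only("flies(George)", {"bird(George)"}, kb))   # True

kb.add("penguin(George)")                            # new knowledge arrives
print(only("flies(George)", {"bird(George)"}, kb))   # False: only() flipped
```

Note that nothing in the object-level hypotheses changed; the flip from true to false happens entirely at the metatheoretic level, which is exactly where the nonmonotonic behavior is being located.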

Further refinement of these ideas may show how to avoid the annoying repetition of Γ both outside and inside the Only operator in Γ ∪ { Only(P, Γ) }, or may reveal that search keys have a different type than propositions, with some useful relations determining which search keys are relevant to inferring which propositions. One natural operator to consider is the consistency operator Con [#!consistent!#]. When Γ is a set of propositions, Con(Γ) asserts that Γ is internally consistent (noncontradictory). Using Con, we might approach the penguin problem by allowing

Γ ∪ { Con(Γ ∪ { flies(George) }) } ⊢ flies(George)

In a reasonable system,

Con(Γ ∪ { flies(George) })

holds, but

Con(Δ ∪ { flies(George) })

may not hold. In spite of the attractive conceptual simplicity of reasoning using Con, the huge cost of testing consistency and the lack of apparent flexibility for dealing with conflicting defaults suggest that we need an operator that refers somehow to the results of the limited searches for knowledge carried out by practical inference procedures, rather than the more abstract notion of consistency in a logical system. On the other hand, some sort of resource-limited test of consistency might be the right basis for introducing nonmonotonicity. Much further thought is required to determine whether expressions describing resource limitations need to be included as arguments to the operator, or whether they may be fixed or quantified out in a standard way.

In the next two sections, I argue that the use of a nonmonotonic logical inference relation is highly misleading, and not the appropriate way to support nonmonotonic behavior in a procedure for reasoning. Rather, I argue that nonmonotonic behavior should be supported by including, in the hypotheses of a logical inference, an explicit formal representation of the metatheoretic fact that certain propositions contain all of the knowledge relevant to a given problem discovered by a reasonably diligent search. I use the form Only(P, Γ), where P is a proposition and Γ is a finite set of propositions, to represent such metatheoretic assertions. I do not believe that this form is exactly the right one in practice, but it has enough of the essential elements to serve for the current comparative argument.

# Two Interpretations of the Logical Inference Relation

The crux of my argument depends on choosing between two interpretations of the logical inference relation denoted by ⊢, one of which allows the relation to be nonmonotonic, and the other of which demands monotonicity. Roughly, Γ ⊢ P means that, given knowledge of the propositions in Γ, it is rational to infer P as well. Two different ways of making this rough idea more precise yield radically different results for practical reasoning.

Definition 2   Let P be a proposition, and Γ be a set of propositions.

Γ ⊢_l P means that, whenever an agent knows each of the propositions in Γ, the agent may rationally conclude that P holds as well (the subscript l stands mnemonically for "local").

Γ ⊢_g P means that, whenever an agent knows each of the propositions in Γ, and the agent knows that Γ includes at least all the results of some reasonably diligent search for all available information relevant to judging the correctness of P, then the agent may rationally conclude that P holds as well (the subscript g stands for "global").

The only distinction between ⊢_l and ⊢_g relevant to my discussion is in the different modes of asserting Γ--as merely some set of known propositions in one case, or a set containing all known relevant propositions in the other case. The use of propositions rather than formulae, and sets rather than some other structures of propositions or formulae, makes no difference to the present discussion. By "knowledge" I mean reasonably reliable information that is relatively easily accessible to an agent. In particular, I do not assume that all knowledge is true. Some readers may wish to substitute "rational and informed belief" for "knowledge."

Since knowing each of the formulae in a set Δ implies knowing each of the formulae in every subset Γ ⊆ Δ, ⊢_l must be monotonic. ⊢_g is typically nonmonotonic, since as we make the set Γ larger, the information that Γ contains all available relevant knowledge becomes weaker, and therefore supports fewer inferences. In principle, logicians may choose to study any interesting relation between propositions and sets of propositions that they please. But, by focusing attention on an infelicitously chosen relation, and referring to it as "logical inference," we may mislead those who are trying to apply formalisms to practical problems. An intuitively well grounded focus of attention may clarify thinking. In the next section, I analyze the concrete operations performed by a reasoning agent, and argue that ⊢_l is a useful relation for studying such agents, while ⊢_g is not.

# The Case for Monotonic Logical Inference, with Nonmonotonic Procedures

The choice of an inference relation such as ⊢_l or ⊢_g as the basis for studying practical reasoning should depend on the following properties of a typical useful reasoning agent.

1. The agent has access to a huge set of propositions representing knowledge about the world, through some combination of queries to an internal database, observations of the world, and receipt of communications from other agents.
2. Access to available knowledge takes time, and consumes valuable resources, in a manner that is generally monotonic in some reasonable measure of the size of the set of propositions accessed.
3. The agent must do essentially all of its reasoning from small subsets of propositions selected from the available knowledge.
4. The agent must often interleave inference and access to knowledge--that is, an inference from knowledge accessed so far will often be required in order to determine how to search for further useful knowledge.
From these properties of a reasoning agent, it follows that the individual steps by which inferred propositions are added to current knowledge are of the form
• from some subset Γ of current knowledge, find a proposition P such that Γ ⊢_l P, and add P to current knowledge.
Property 4 requires ⊢_l here, instead of ⊢_g, since some inferences must be made before a reasonably diligent search is completed. ⊢_g essentially forces a reasoner into a two-phase procedure of search followed by inference.
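The step form above can be sketched as a loop (a toy of my own devising; the rule table and the elided search step are placeholders, not anything proposed in the text):

```python
# Repeated local inference: each round draws the conclusions licensed by
# the monotonic relation |-_l, modelled here by simple premises -> conclusion
# rules over the current knowledge. New conclusions could steer the next
# (elided) search for further knowledge, as required by property 4.

RULES = {
    ("bird(George)",): "warm_blooded(George)",
    ("warm_blooded(George)", "bird(George)"): "has_feathers(George)",
}

def infer_l(gamma):
    """One round of local inference; monotonic in gamma."""
    return {c for premises, c in RULES.items() if set(premises) <= gamma}

def reason(initial, goal, max_rounds=10):
    current = set(initial)
    for _ in range(max_rounds):
        new = infer_l(current) - current
        if not new:                    # nothing new can be inferred locally
            break
        current |= new
    return goal in current

print(reason({"bird(George)"}, "has_feathers(George)"))  # True
```

The point of the sketch is only that each step consults a small working set and a local relation; no step waits for a completed diligent search.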

In principle, propositions might be stratified hierarchically, so that whenever a judgement about the truth of P is required for a reasonably diligent search for all information relevant to Q, then no judgement about Q is required for a reasonably diligent search for all information relevant to P. Then ⊢_g might conceivably work, if we require Γ to include all the propositions found by the latest search. That is, the two-phase procedure using ⊢_g may be repeated to simulate interleaving of search and inference. But, experience doesn't suggest such a natural stratification, and it seems pointlessly expensive to require completion of a reasonably diligent search in the large number of cases where the inference does not require it semantically.

Certainly, ⊢_g is an interesting relation, and it is normally a stronger inference tool (not a logically stronger relation), in the sense that there are normally Γ and P such that Γ ⊢_g P, but Γ ⊬_l P, while the reverse will not happen with conventional modes of assertion. So, in those cases where an appropriate search has been completed, and all of its results included in Γ, an agent should be allowed to add P such that Γ ⊢_g P. But, the agent must notice the fact that a diligent search is complete (this is necessary even if all inference steps use ⊢_g, since the agent must decide when to stop gathering members of Γ, and start applying ⊢_g), so it seems natural to encode the act of noticing completion of the search as a metatheoretical proposition added to current knowledge. This leads to the encoding of

Γ ⊢_g P

in a form such as

Γ ∪ { Only(P, Γ) } ⊢_l P

Certainly, the syntactic form for representing the proposition Only(P, Γ) can be refined to be compatible with some useful data structure for noticing search completion, so this encoding need not reduce the efficiency of a reasoning procedure originally based directly on ⊢_g.

So, it appears that the information carried by ⊢_g is essential in the design of practical reasoning procedures, and that the information in ⊢_g may be encoded into ⊢_l by adding appropriate metatheoretic operators, which construct assertions that certain sets of propositions contain all of the results of certain searches, to our formal language. On the other hand, I doubt that ⊢_l can be defined efficiently and naturally from ⊢_g. An obvious candidate definition is

• Γ ⊢_l P if and only if Γ ∪ Δ ⊢_g P for all sets of propositions Δ.
A naive application of this definition leads to an outrageously expensive implementation of ⊢_l. It seems unlikely that there is an efficient general implementation. Furthermore, it is not even clear that the definition is extensionally correct in practically interesting formal systems of logic. The only if part probably holds in most reasonable systems. Monotonicity of ⊢_l requires that Γ ∪ Δ ⊢_l P whenever Γ ⊢_l P. When Γ ∪ Δ ⊢_l P we normally expect that Γ ∪ Δ ⊢_g P as well, since in the second case the hypothesis carries the information given directly by propositions in Γ ∪ Δ as well as the information that Γ ∪ Δ contains all results of some appropriate search. But, the if part seems to depend in a delicate way on the range of quantification of the extended sets of propositions Δ, which varies depending on the generality of the formal language in use.

I do not claim that the particular syntactic form of Only(P, Γ) is the right one for practical reasoning. But, this form illustrates some of the crucial properties of the right solution.

1. Inferences using default reasoning must have metatheoretic hypotheses, since the basis for using a default assumption is the inability to discover a counterargument, rather than some direct observation about the world outside of a reasoning agent.
2. Some sort of search parameter must be involved in such metatheoretic hypotheses, since all practical searches are guided by some description of a goal, and the results of search depend critically on this description. The P in Only(P, Γ) is a very naive presentation of this search parameter.
3. Some sort of description of the results of search must be involved in such metatheoretic hypotheses, since it is the absence of a counterindication in those results that enables the default assumption. The Γ in Only(P, Γ) is a very naive presentation of the results of the search.
4. Although the extensional contents of the knowledge base available to a reasoning agent determine the truth or falsehood of such metatheoretic hypotheses, the contents of that knowledge base must not be mentioned explicitly in the formal presentation of the hypotheses, since it is too big. The whole point of a formal logical language is to represent the information actually manipulated by a reasoning agent, which should normally be much smaller than the world about which the agent is reasoning. For comparative discussion of the results of reasoning from different knowledge bases, some description of the knowledge base may be treated as an additional parameter to the logical inference relation ⊢, but it must be understood that the reasoning agent has no direct access to it, only to the formal propositions that result from searching it. In particular, useful definitions of ⊢ must not depend on qualities of the knowledge base other than those derived by the search procedures used by the reasoning agent. If several different knowledge bases are available simultaneously to a single agent, the name of a knowledge base to be searched may be modelled as part of the search parameter mentioned in item 2, but the contents of the knowledge base must not be an explicit parameter.

# The Hard Problem Remains

Of course, arguments of the sort advanced above do not solve any hard technical problems. The hard problem to solve in automating practical reasoning is the problem of reacting to updates (changes in knowledge) [#!update-problem!#]. That problem is essentially the same at a technical level, whether we use a monotonic inference relation ⊢_l or a nonmonotonic ⊢_g. In conventional monotonic systems, without metatheoretic operators, the addition of new knowledge leads only to the addition of new inferences. The hard part of the update problem is how to handle retraction of previously accepted knowledge. There is no clear consensus even on the proper form in which to express retractions.

Both nonmonotonic inference relations, such as ⊢_g, and metatheoretic operators, such as Only, seem to add a complication to the problem of changing knowledge, since they both lead to cases where added knowledge requires conclusions to be retracted. I conjecture, though, that once the basic problem of managing retraction of knowledge is solved, the extra interaction of additions and retractions introduced by nonmonotonicity will become much clearer. I propose that, by representing nonmonotonic reasoning behavior through metatheoretic operators, such as Only, with the monotonic inference relation ⊢_l, we can improve the flexibility and transparency with which future breakthroughs in the update problem are applied to a wide range of interesting domains of reasoning.

While the four properties of useful reasoning agents listed in the previous section support the use of formal systems that are conventional to the extent of being monotonic, they by no means support the particular choices of classical or intuitionistic reasoning. In fact, an additional property of reasoning agents argues strongly against such choices:

5. Some of the knowledge provided to the agent is wrong, and possibly even contradictory.

Both classical and intuitionistic logic trivialize in the presence of contradictory information--everything follows logically from any contradiction, and the mere fact that Γ ⊢ P classically or intuitionistically does not make it safe for a reasoning agent to conclude P given knowledge of each proposition in Γ. Even in the absence of contradiction, classical and intuitionistic logic seem to be too sensitive to errors, due to their conceptual foundation on the assertion of absolute truth rather than of rational belief based on fallible information, although I can find no rigorous discussion of this sensitivity in the literature.

An obvious fix is to have the agent test the consistency of Γ before applying logical inference. Unfortunately, consistency checking ranges from intractably expensive to fundamentally impossible, so the time and resources for consistency checking would dominate those for real reasoning steps. Rather, property 5 requires the use of paraconsistent formal systems [#!paraconsistent!#], such as relevance logics [#!relevance!#], instead of classical or intuitionistic formal systems. Further research is required to illuminate the choice of a particular paraconsistent formal system for a particular reasoning agent. The right paraconsistent formal system can reduce the harm done by errors and contradictions in the hypotheses used by a reasoning agent, allowing useful reasoning to go on during the typically long time between the introduction of an error or contradiction and its detection. But, such systems still do not address the hard problem of retracting information when an error is finally detected.


#### Footnotes

¹ Supported in part by NSF grant no. CCR-9016905
