www.annualreviews.org/aronline

KNOWLEDGE REPRESENTATION AND REASONING¹

Hector J. Levesque²
All parties to the debate agree that a central goal of research is that computers must somehow come to "know" a good deal of what every human being knows about the world and about the organisms, natural or artificial, that inhabit it. This body of knowledge--indefinite, no doubt, in its boundaries--goes by the name "common sense." The problem we face is how to impart such knowledge to a robot. That is, how do we design a robot with a reasoning capacity sufficiently powerful and fruitful that when provided with some subbody of this knowledge, the robot will be able to generate enough of the rest to intelligently adapt to and exploit its environment? (68, p. 37)
Despite the apparent simplicity of this goal, the research area of knowledge representation (KR) has a long, complex, and as yet nonconvergent history.³

¹This review draws heavily from work done with Ron Brachman. I also thank Jim des Rivières, Bart Selman, and John Tsotsos for helping with an earlier version of the document. Financial support was received from the Natural Sciences and Engineering Research Council of Canada.

²Fellow of the Canadian Institute for Advanced Research.

³Research in KR originated with a single paper written by John McCarthy in 1958, and republished as (89).
Implicit in this hypothesis⁴ are two major properties that the structures forming a KB must satisfy:

• For the structures to represent knowledge, it must be possible to interpret them propositionally, that is, as expressions in a language with a truth theory. We should be able to point to one of them and say what the world would have to be like for it to be true.

• The system should act the way it does because of the presence of these structures. Clearly, the hypothesis would not be satisfied in a system where the KB was completely ignored (like comments in a program, for example).

The key point is that an account of cognitive activity in terms of computational operations over propositionally interpreted structures imposes constraints on how a KBS can be realized. First of all, it rules out data structure operations that do not respect (in some sense) the propositional interpretation of a KB (e.g. reversing the words of a sentence). Secondly, because of the causal role of a KB, it rules out operations that are not computationally manageable. In other words, the operations on a KB need to be semantically coherent without demanding more than what any computer can be expected to do. To better understand these constraints, we need to examine what it means to operate on structures in a way that respects their semantic interpretation.
⁹A useful introduction to ATP is (86). An overview of the research in the area can be found in (142).

¹⁰See (108, 139) for a fuller discussion of resource-limited processing. Many of these ideas were eventually incorporated into the KRL (11) representation language.
However, from the point of view of KR, both of these are only pseudo-solutions. Clearly, the first one alone does not help us guarantee anything about an inferential service. The second one, on the other hand, might allow us to guarantee an answer within certain time bounds but would make it very hard for us to specify what that answer would be. If we think of the KR service as reasoning according to a certain logic, then the logic being followed is immensely complicated (compared to that of FOL) when resource limitations are present. Indeed, the whole notion of the KR system calculating what is implicit in the KB (which was our original goal) would have to be replaced by some other notion that went beyond the truth theory of the representation language to include the inferential power of a particular theorem-proving program. In a nutshell, we can guarantee getting an answer, but not the one we wanted.
3. ∃x Cousin(bill,x) ∧ Male(x).

Sentence 3 says that Bill has at least one male cousin, but it does not say who that cousin is.

4. ∀x Friend(george,x) ⊃ ∃y Child(x,y).

Sentence 4 says that all of George's friends have children, without saying who those friends or their children are, or even if there are any.

The main feature of these examples is that FOL is used not to capture complex details about the domain but to avoid having to represent details that may not be known. The expressive power of FOL determines not so much what can be said but what can be left unsaid.
For a system that has to be able to acquire arbitrary knowledge in a piecemeal fashion, there may be no alternative to full logical reasoning with a language as expressive as FOL. But we may be able to get by with much less. In what follows, we examine KR research by considering three distinct ways of reducing the computational demands on a general KR reasoning service. So the view of KR we are considering here is certainly based on logic, but not necessarily classical logic. Rather than dealing with a very expressive language (i.e. FOL), and inferences that are logically complete and sound, we are concerned with inexpressive languages and with logics that are classically incomplete and unsound. First, we will look at some special-purpose KR languages.
3. SPECIAL-PURPOSE LANGUAGES
with back substitution) and calculate directly whether or not P is true of these values. In general, k equations can be solved in roughly k³ operations, and assuming a unique solution, P can be checked in a linear number of operations.¹³ Specialized techniques like this are not possible in general, and sets of linear equations are only a small subset of what can be expressed in the full language of first-order arithmetic. But it is nonetheless a very rich and useful subset. So it makes sense to build systems whose input language is restricted to lie in this subset.
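The two phases described above, cubic-time equation solving and linear-time checking of a quantifier-free P, can be sketched as follows (the system of equations and the predicate P are invented for illustration):

```python
# Sketch: solve a small system of linear equations by Gaussian
# elimination (roughly k^3 operations), then evaluate a quantifier-free
# predicate P directly against the solved values (linear time).

def solve(a, b):
    """Solve a*x = b for x, assuming a unique solution exists."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]  # augmented matrix
    for col in range(n):
        # Partial pivoting for numerical stability.
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    # Back substitution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# Example system:  x + y = 3,  x - y = 1   (solution x=2, y=1)
x, y = solve([[1, 1], [1, -1]], [3, 1])

# A quantifier-free P, e.g. "x > y and x + y < 10", checked directly.
assert abs(x - 2) < 1e-9 and abs(y - 1) < 1e-9
assert x > y and x + y < 10
```

Once the values are in hand, checking P never requires search; that is what makes the specialized service predictable in a way general theorem proving is not.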
Much of the work in KR has involved inventing new KR formalisms and embedding these within KR systems. In retrospect, a good proportion of this research can be seen as the search for useful compromises between expressive power, on the one hand, and tractability of reasoning, on the other (80). In fact, with the possible exception of nonmonotonic facilities (see Section 5 below), these formalisms can almost inevitably be understood as subsets of classical first-order logic.
3.1 Databases
The most obvious restriction to the form of a KB is what might be called database form. The idea is to restrict a KB to contain only the kind of information that can be represented in a simple standard database. Consider, for example, a very trivial database that deals with university courses. If we had to characterize in FOL the information contained in the database, we might use a collection of function-free atomic sentences like

Course(csc248)   Dept(csc373, ComputerScience)   Enrollment(psy400, 42)
...
Course(mat100)   Dept(his100, History)   ...
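A minimal sketch of this database form, with the KB as a set of function-free ground atoms and atomic queries answered by simple lookup (the helper `holds` is hypothetical, not from any actual system):

```python
# Sketch: a "database form" KB as a set of function-free atomic
# sentences, following the university-courses example in the text.

kb = {
    ("Course", "csc248"),
    ("Course", "mat100"),
    ("Dept", "csc373", "ComputerScience"),
    ("Dept", "his100", "History"),
    ("Enrollment", "psy400", 42),
}

def holds(*atom):
    """A ground atomic query is answered by lookup, not deduction."""
    return atom in kb

assert holds("Course", "csc248")
assert not holds("Course", "phy100")   # absent from the stored data
```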
¹³This will be true of any P that can be evaluated directly once the values of its variables are known. So, for example, any P that does not use quantifiers has this property.
other than those mentioned in the list of sentences). But a database system could answer the question successfully by interpreting it as (something like)

How many tuples in the COURSE relation have ComputerScience in their Dept field?

This is a question not about the world being modelled but about the data itself. To be able to reinterpret it as the intuitive question about courses and departments (rather than as one about tuples and fields), we need to account for additional information taking us beyond the stored data itself. In particular, we need FOL sentences of the form

¹⁴This is one form of what has been called the closed world assumption (118).
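The tuple-counting reading of the question can be sketched as follows (the relation layout and field names are invented for the example; treating the stored tuples as all there is reflects the closed world assumption):

```python
# Sketch: answering "How many courses does Computer Science offer?" by
# counting tuples in a COURSE-like relation, i.e. by assuming the
# stored data is complete (one form of the closed world assumption).

course_relation = [
    {"Course": "csc248", "Dept": "ComputerScience"},
    {"Course": "csc373", "Dept": "ComputerScience"},
    {"Course": "his100", "Dept": "History"},
]

n = sum(1 for t in course_relation if t["Dept"] == "ComputerScience")
assert n == 2   # correct only if no unmentioned courses exist
```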
itself play the role of more general reasoning procedures (much the way arithmetic can replace reasoning with Peano's axioms). The disadvantage of an analogue, however, should also be clear: There are certain kinds of facts about the domain that it cannot leave unsaid.¹⁵ In this sense, an analogue representation can be viewed as a special case of a propositional one, where the information it contains is relatively complete.

we know exactly who the mother of Bill is, but only after having executed the program.
In some sense, the logic program form does not provide any computational advantage to an FOL reasoning system, since determining if a ground atomic sentence is implied by a collection of Horn sentences (containing function symbols) is undecidable.¹⁸ On the other hand, the form is much more manageable than in the general case, since the necessary inference can be split very nicely into two components: a retrieval component that extracts (atomic) facts from a database by pattern-matching, and a search component that tries to use the nonatomic Horn sentences to complete the inference. In actual systems like PROLOG and PLANNER, moreover, the search component is (partially) under user control, giving the user the ability to incorporate domain-specific control knowledge. The only purely automatic inference is the retrieval component.
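The split into a retrieval component and a search component might be sketched as follows. This is an illustrative toy, not PROLOG's or PLANNER's actual machinery; the facts and the rule are invented, and for simplicity it assumes rule variables do not clash with query variables:

```python
# Sketch: Horn-clause inference split into a retrieval component
# (pattern-matching against stored atomic facts) and a search component
# (backward chaining through the nonatomic Horn sentences).
# Variables are strings beginning with "?".

facts = {("parent", "mary", "bill"), ("female", "mary")}
rules = [
    # mother(x,y) <- parent(x,y), female(x)
    (("mother", "?x", "?y"), [("parent", "?x", "?y"), ("female", "?x")]),
]

def unify(pat, fact, env):
    """Match pat against a tuple, extending env; None on failure."""
    if len(pat) != len(fact):
        return None
    env = dict(env)
    for p, f in zip(pat, fact):
        if isinstance(p, str) and p.startswith("?"):
            if env.get(p, f) != f:
                return None
            env[p] = f
        elif p != f:
            return None
    return env

def subst(atom, env):
    return tuple(env.get(t, t) for t in atom)

def prove(goal, env=None):
    """Yield bindings under which goal follows from facts and rules."""
    env = env or {}
    g = subst(goal, env)
    # Retrieval component: purely automatic lookup by pattern-matching.
    for fact in facts:
        e = unify(g, fact, env)
        if e is not None:
            yield e
    # Search component: backward-chain through the Horn rules.
    # (A real system would rename rule variables apart; omitted here.)
    for head, body in rules:
        e = unify(head, g, {})
        if e is not None:
            envs = [e]
            for subgoal in body:
                envs = [e2 for e1 in envs for e2 in prove(subgoal, e1)]
            for e2 in envs:
                yield {**env, **e2}

assert list(prove(("mother", "mary", "bill")))      # provable via the rule
assert not list(prove(("mother", "bill", "mary")))  # not provable
```

Note how the retrieval half never searches: it is the cheap, automatic part, while the rule-chaining half is where control (and, in PROLOG, the programmer) enters.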
This suggests a different way of looking at the inferential service provided by a KR system (without even taking into account the logical form of the KB). Instead of automatically performing the full deduction necessary to answer questions, a KR system could manage a limited form of inference and leave to the rest of the knowledge-based system (or to the user) the responsibility of intelligently completing it. As suggested in (50), the idea is to take the "muscle" out of the automatic component and leave the difficult part of reasoning as a problem that the overall system can (meta-)reason about and plan to solve (54).¹⁹ It is clear that one of the major attractions of PROLOG is its programmability, and its potential for integrating procedural and declarative concerns,²⁰ rather than its power as a KR language.
instance, the elephant node can have a color link to the value gray, but anything below elephant (such as albino-elephant) can be linked to a different color value. To infer the color of an individual only requires searching up the taxonomy for a value, and stopping when the first one is found, preempting any higher values.²⁵ The trouble with this type of reasoning is that, with only a procedural account like the one above, it is very easy to lose the import of exactly what is being represented (17).
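The search-up-the-taxonomy procedure can be sketched directly (the node names extend the elephant example; `clyde` is an invented individual):

```python
# Sketch: default inheritance by searching up the taxonomy and
# stopping at the first value found, so lower values preempt higher
# ones, as in the elephant / albino-elephant example.

parent = {
    "clyde": "albino-elephant",
    "albino-elephant": "elephant",
    "elephant": "mammal",
}
local_color = {"elephant": "gray", "albino-elephant": "white"}

def color(node):
    while node is not None:
        if node in local_color:
            return local_color[node]   # first value found preempts higher ones
        node = parent.get(node)
    return None

assert color("clyde") == "white"      # albino-elephant overrides gray
assert color("elephant") == "gray"
```

The procedure is fast, but, as the text notes, it is exactly this kind of purely procedural account that makes it hard to say what the links actually assert.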
3.4 Frames

The final form we consider--frame descriptions--is mainly an elaboration of the semantic network form.²⁶ The emphasis, in this case, is on the structure of the types themselves (usually called frames) in terms of their attributes (now called slots). Typically, the kind of detail associated with the slots of a frame includes

1. values, stating exactly what the attribute of an instance should be. Alternatively, the value may be just a default, in which case an individual inherits the value, provided it does not override it.

2. restrictions, stating what constraints must be satisfied by attribute values. These can be value restrictions, specified by a type that attribute values should be instances of, or number restrictions, specified in terms of a minimum and a maximum number of attribute values.

3. attached procedures, providing procedural advice on how the attribute should be used. An if-needed procedure explains how to calculate attribute values if none have been specified; an if-added procedure explains what should be done when a new value is discovered.
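A minimal sketch of a frame with these three kinds of slot detail; the `Slot` and `Frame` classes are invented for illustration and are not a reconstruction of KRL or any actual frame system:

```python
# Sketch of a frame whose slots carry a default value, a number
# restriction, and an if-needed attached procedure.

class Slot:
    def __init__(self, default=None, min_n=0, max_n=None, if_needed=None):
        self.default, self.min_n, self.max_n = default, min_n, max_n
        self.if_needed = if_needed

class Frame:
    def __init__(self, slots):
        self.slots, self.values = slots, {}

    def get(self, name):
        if name in self.values:               # explicitly stored value
            return self.values[name]
        slot = self.slots[name]
        if slot.if_needed:                    # attached if-needed procedure
            return slot.if_needed(self)
        return slot.default                   # inherited default

    def put(self, name, vals):
        slot = self.slots[name]
        n = len(vals)
        if n < slot.min_n or (slot.max_n is not None and n > slot.max_n):
            raise ValueError("number restriction violated")
        self.values[name] = vals

student = Frame({
    "dept": Slot(default=["computer-science"]),
    "enrolled-course": Slot(min_n=3),         # at least 3 values
    "age": Slot(if_needed=lambda f: ["unknown"]),
})

assert student.get("dept") == ["computer-science"]   # default inherited
assert student.get("age") == ["unknown"]             # computed if-needed
student.put("enrolled-course", ["c1", "c2", "c3"])   # satisfies min of 3
```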
Like semantic networks, frame languages tend to take liberties with logical form, and the developers of these languages have been notoriously lax in characterizing their truth theories (17, 59). Restricting ourselves to a noncontroversial subset of a frame language, we might have descriptions like

(Student
  with a dept is computer-science and
  with ≥ 3 enrolled-course is a
    (Graduate-Course with a dept is an Engineering-Department)).
²⁵For more complex examples of reasoning based directly on the form of semantic networks, see (121) and (140).

²⁶The theory of frames in KR was first presented in (99), although many of the ideas were already "in the air". KRL (11-13) and KL-ONE (14, 23) are perhaps the most representative languages based on these ideas.

This is intended to be a structured type that describes Computer Science students taking at least three graduate courses in departments within Engineering. If this type had a name (say A), we could express the type in FOL by a "meaning postulate" of the form
∀x A(x) ≡ [Student(x) ∧ dept(x, computer-science) ∧
  ∃y₁y₂y₃ (y₁ ≠ y₂ ∧ y₁ ≠ y₃ ∧ y₂ ≠ y₃ ∧
    enrolled-course(x,y₁) ∧ Graduate-Course(y₁) ∧
    ∃z(dept(y₁,z) ∧ Engineering-Department(z)) ∧
    enrolled-course(x,y₂) ∧ Graduate-Course(y₂) ∧
    ∃z(dept(y₂,z) ∧ Engineering-Department(z)) ∧
    enrolled-course(x,y₃) ∧ Graduate-Course(y₃) ∧
    ∃z(dept(y₃,z) ∧ Engineering-Department(z)))].
Similarly, it should be clear how to state equally clumsily²⁷ in FOL that an individual is an instance of this type.
One interesting property of these structured types is that we do not have to explicitly assert that one of them is below another in the taxonomy. The descriptions themselves implicitly define a taxonomy of subsumption, where type A is subsumed by type B if, by virtue of the form of A and B, every instance of A must be an instance of B. For example, without any world knowledge at all, we can determine that the type

(Person with every male friend is a Doctor)

subsumes

(Person with every friend is a (Doctor with a specialty is surgery)).

Analytic relationships like subsumption are useful properties of structured types that are not available in a semantic network where all of the types are essentially atomic (20). In the KRYPTON KR language (18, 19), a first-order KB is used to represent facts about the world, but subsumption information is also available. The reason for this is that while subsumption can be defined in terms of logical implication,²⁸ there can be very good special-purpose "description matching" algorithms for calculating these relationships (21). Again, because the logical form is sufficiently constrained, the required inference can be much more tractable.
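A sketch of subsumption computed by direct description matching rather than general theorem proving; the representation (attributes with a minimum count and a value type) is a drastic simplification of KRYPTON-style descriptions, and all names are invented:

```python
# Sketch: a structured type as {attribute: (min_count, value_type)},
# with an atomic type hierarchy supplying "is-a" facts. Subsumption is
# checked structurally, restriction by restriction.

atomic_isa = {"Surgeon": "Doctor", "Doctor": "Person"}

def isa(a, b):
    while a is not None:
        if a == b:
            return True
        a = atomic_isa.get(a)
    return False

def subsumes(b, a):
    """Type b subsumes type a iff every restriction in b is matched by
    an at-least-as-strong restriction in a."""
    return all(
        attr in a and a[attr][0] >= n and isa(a[attr][1], t)
        for attr, (n, t) in b.items()
    )

general  = {"enrolled-course": (1, "Course")}
specific = {"enrolled-course": (3, "Course"), "advisor": (1, "Surgeon")}

assert subsumes(general, specific)     # every instance of specific is one of general
assert not subsumes(specific, general)
```

The point of the sketch is that no proofs are constructed: the tractability comes from the constrained logical form of the descriptions themselves.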
²⁷These sentences are especially awkward in FOL because of the number restrictions. For example, the sentence "There are a hundred billion stars in the Milky Way Galaxy" would be translated into an FOL sentence with on the order of 10²² conjuncts.

²⁸Specifically, type A is subsumed by type B iff the meaning postulates for A and B logically imply ∀x[A(x) ⊃ B(x)].
³¹Another reason for preferring this approach is that it quite naturally generalizes to deal with beliefs about other beliefs (by using a B operator within the scope of another). This allows facts about self-knowledge to be expressed, which has applications discussed in Section 5.

³²See (67) for an introduction to possible worlds and the modal logics based on them.
believes that α is true, the set of worlds will be all those where α is true: some, for example, where β is true, others, where β is false. However, because valid sentences will also be true in all of these possible worlds, the agent is thought of as believing them just as if they were among his actual beliefs. In terms of the possible worlds, there is no way to distinguish α from a valid sentence.
One solution is to make this notion of what an agent thinks the world is like be more relevant to what he actually believes. This can be done by replacing the possible worlds by a different kind of semantic entity that does not necessarily deal with the truth of all sentences. In particular, sentences not relevant to what an agent actually believes (including some valid ones) need not get a truth value in one of these partial possible worlds [which we call situations, following (7)]. In fact, we can think of possible worlds as those limiting cases where every sentence has a truth value. Indeed, the concept of a possible world being compatible with a situation is intuitively clear: Every sentence whose truth is supported by the situation should be true in that possible world, and every sentence whose falsity is supported should be false. Also, we can allow for situations that have no compatible possible worlds. These are situations that support both the truth and falsity of some sentence. Although they can never be real, such impossible situations can be imagined and are very useful, since they allow an agent to have an incoherent picture of the world.³³

The "trick," then, that underlies the models of belief is to identify belief with a set of situations rather than with a set of possible worlds. This has the following effect (roughly): Not all valid sentences have to be believed, since situations can fail to support them by being partial. Also, because of impossible situations, beliefs need not be closed under logical consequence. For example, a situation can support both α and (¬α ∨ β), without supporting β, by supporting both α and its negation.
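The situation-based account can be sketched with Belnap's four values, representing each atom's status as an independent pair of support bits; the sentence syntax here is invented for the example:

```python
# Sketch: a situation assigns each atom a pair (supports-true,
# supports-false), giving four possible values. Support for compound
# sentences follows the usual relevance-logic clauses. Sentences are
# atoms (strings), ("not", s), or ("or", s1, s2).

def supports_true(sit, s):
    if isinstance(s, str):
        return sit.get(s, (False, False))[0]
    if s[0] == "not":
        return supports_false(sit, s[1])
    if s[0] == "or":
        return supports_true(sit, s[1]) or supports_true(sit, s[2])

def supports_false(sit, s):
    if isinstance(s, str):
        return sit.get(s, (False, False))[1]
    if s[0] == "not":
        return supports_true(sit, s[1])
    if s[0] == "or":
        return supports_false(sit, s[1]) and supports_false(sit, s[2])

# An impossible situation: it supports both the truth and the falsity
# of "a", and is silent about "b".
sit = {"a": (True, True)}

assert supports_true(sit, "a")
assert supports_true(sit, ("or", ("not", "a"), "b"))  # via not-a, not via b
assert not supports_true(sit, "b")                    # b is not believed
```

This is exactly the failure of closure described above: the situation supports α and (¬α ∨ β) without supporting β.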
Although there were certainly antecedents in the philosophical literature (e.g. 114), the first proposal for using this kind of logic in a computer system was by Belnap (8). He sought to use relevance logic (4), which embodies these situations (in terms of four truth values), as a means of dealing with the partiality and potential inconsistency of information presented to a machine. The relation of logical consequence in this type of relevance logic (called tautological entailment) is a strict subset of the relation of logical consequence in classical logic. Levesque (81) showed how to use these ideas to establish a formal model of belief parallel to the one based on possible worlds. More importantly, Levesque described a KR service based on this weaker notion of implication and proved that the required inference was indeed more
³³Although they are used here, impossible situations are not strictly necessary to model inconsistent beliefs, as is discussed later.
³⁵Indeed, in "visualizing" a described situation, we are filling in a large number of visually significant details that need not be implied by anything in the original description nor by statistically relevant defaults. So, for example, it is hard to think about Ronald Reagan standing beside Margaret Thatcher without thinking of them from a specific point of view with one of them on the left and the other on the right.

³⁶Interestingly enough, Shapiro and Martins argue that relevance logic is the appropriate framework for maintaining a network of assumptions, quite independently of its role in limiting reasoning (as discussed in Section 4.3).
that are members of all such extensions, even though such a belief set need not itself be an extension. The nonmonotonicity arises here from the fact that as the base set grows, fewer Mα sentences will be added to an extension.

The third formalism is called circumscription and is due to John McCarthy (92). The idea is to take any finite set of first-order sentences as the base and "circumscribe" a predicate in it by adding to the base an infinite collection of sentences (as specified by a circumscription schema). The intent of these sentences is to make the extended theory state that the predicate is as small as possible, given the base set. The way this is done for a predicate P is to include in the collection a sentence for each open first-order formula φ with the same number of arguments as P. What this sentence says is that if the base sentences with P replaced by φ are true and φ is a subset of P, then φ and P are equal. The net effect is to prohibit any property (specifiable by an open formula) from simultaneously satisfying the base sentences and being a proper subset of the circumscribed predicate. In other words, the extended theory makes the circumscribed predicate minimal. This is nonmonotonic, because any change to the base set (additive or not) will lead to a different extended theory, since every sentence in that theory uses the base set directly.
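Writing A(P) for the conjunction of the base sentences and A(φ) for the result of replacing P by φ throughout, the sentence added for each open formula φ can be written as

```latex
\bigl[\, A(\phi) \;\land\; \forall \bar{x}\,\bigl(\phi(\bar{x}) \supset P(\bar{x})\bigr) \,\bigr]
  \;\supset\; \forall \bar{x}\,\bigl(P(\bar{x}) \supset \phi(\bar{x})\bigr)
```

so that any φ satisfying the base sentences and contained in P must coincide with P, which is what makes P minimal.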
Since the publication of these three formalisms, there has been extensive research on their properties. It is generally agreed that the first two have serious drawbacks. In Reiter's model, domain-dependent knowledge has to be encoded in inference rules, more or less ruling out a semantic account for the logic (103); McDermott & Doyle's proposal appears to be based on a notion of consistency that is inappropriate for the job but is best understood in terms of a logic of knowledge or belief (103). Circumscription remains the most successful of the three and certainly the one that has received the most attention (e.g. 41, 84, 119). It does have expressive drawbacks, however, and new versions of it are under active investigation (e.g. 93, 98).
Gaining in popularity is an attempt to understand nonmonotonicity in terms of explicit knowledge and belief. Thus, in (56, 69, 77, 103) a general form of nonmonotonicity is explained in terms of the inherent nonmonotonicity of self-knowledge. The idea is that certain kinds of assumptions are based on a lack of other beliefs (31). For example, one might be willing to believe that there is no city in New Zealand larger than Los Angeles based on a belief that one would know if there were such a city. So belief in the lack of a certain belief is sufficient to make the assumption. But this is a nonmonotonic process, since beliefs about what one does not believe must surely be revised as new beliefs are acquired. On the other hand, it is not clear that all
nonmonotonic reasoning behaves this way. For example, the fact that birds generally fly seems to have nothing to do with what is or is not believed at any given point.

At this stage of research, the computational relevance of this work remains somewhat questionable. For the three formalisms described above, it appears (counter-intuitively) that reasoning with assumptions and defaults is even more difficult than reasoning without them.³⁷ From a computational standpoint, the formalizations seem to be moving in the wrong direction. Not that systems have been unable to take computational advantage of defaults. As discussed in Section 3.3, the use of defaults goes back to Quillian and the very early semantic networks (112), and it remains a major feature of inheritance hierarchies (16, 44, 132, 133). Defaults were also a major part of the frame concept (99) as can be seen in representation languages like KRL (11). But it has often been much easier to construct systems that reason in certain (nonmonotonic) ways than to justify the correctness of that reasoning. Indeed, systems whose behavior seemed appropriate at first glance were later shown to exhibit reasoning anomalies (42, 45). So it remains to be seen how the computational promise of nonmonotonic reasoning can be correctly realized.
6. CONCLUSION
³⁷In fact, in the first two cases, applying an assumption depends on that assumption being consistent with some theory of the world. But for sufficiently expressive languages like that of FOL, the consistent sentences are not even recursively enumerable.
Literature Cited
60. Hayes, P. 1985. Naive physics I: ontology for liquids. See Hobbs & Moore 1985, pp. 71-107
61. Hayes, P. 1985. The second naive physics manifesto. See Hobbs & Moore 1985, pp. 1-36
62. Hewitt, C. 1969. PLANNER: a language for proving theorems in robots. In Proc. Int. Jt. Conf. Artif. Intell., Washington, DC, pp. 295-301
63. Hewitt, C. 1972. Description and theoretical analysis (using schemata) of PLANNER, a language for proving theorems and manipulating models in a robot. Tech. Rep. TR-258, AI Lab., MIT, Cambridge, Mass.
64. Hintikka, J. 1962. Knowledge and Belief: An Introduction to the Logic of the Two Notions. Ithaca, NY: Cornell Univ. Press
65. Hintikka, J. 1975. Impossible possible worlds vindicated. J. Philos. 4:475-84
66. Hobbs, J., Moore, R., eds. 1985. Formal Theories of the Commonsense World. Norwood, NJ: Ablex
67. Hughes, G., Cresswell, M. 1968. An Introduction to Modal Logic. London: Methuen
68. Israel, D. 1983. The role of logic in knowledge representation. IEEE Comput. 16(10):37-42
69. Konolige, K. 1982. Circumscriptive ignorance. In Proc. Natl. Conf. Am. Assoc. Artif. Intell., Pittsburgh, Pa., pp. 202-4
70. Konolige, K. 1983. A deductive model of belief. In Proc. Int. Jt. Conf. Artif. Intell., Karlsruhe, FRG, pp. 377-81
71. Konolige, K. 1984. A deduction model of belief and its logics. PhD thesis. Dep. Comput. Sci., Stanford Univ., Palo Alto, Calif.
72. Konolige, K. 1985. A computational theory of belief introspection. In Proc. Int. Jt. Conf. Artif. Intell., Los Angeles, pp. 502-8
73. Kowalski, R. 1974. Predicate logic as a programming language. In IFIP Congress, Stockholm, pp. 569-74
74. Kowalski, R. 1979. Logic for Problem Solving. Amsterdam: Elsevier North-Holland
75. Kuipers, B. 1979. On representing commonsense knowledge. See Ref. 47, pp. 393-408
76. Lakemeyer, G. 1986. Steps towards a first order logic of implicit and explicit belief. In Theoretical Aspects of Reasoning about Knowledge: Proc. 1986 Conf., ed. J. Halpern, pp. 325-40. Los Altos, Calif: Morgan Kaufmann
77. Levesque, H. 1981. A formal treatment of incomplete knowledge bases. PhD thesis. Dep. Comput. Sci., Univ. Toronto, Ontario
78. Levesque, H. 1983. The logic of incomplete knowledge bases. See Ref. 105, pp. 165-86
79. Levesque, H. 1984. Foundations of a functional approach to knowledge representation. Artif. Intell. 23(2):155-212
80. Levesque, H. 1984. A fundamental tradeoff in knowledge representation and reasoning. In Proc. Bienn. Conf. Can. Soc. Comput. Stud. Intell., London, Ontario, pp. 141-52
81. Levesque, H. 1984. A logic of implicit and explicit belief. In Proc. Natl. Conf. Am. Assoc. Artif. Intell., Austin, Texas, pp. 198-202
82. Levesque, H. 1986. Making believers out of computers. Artif. Intell. In press
83. Lewis, H. 1978. Complexity of solvable cases of the decision problem for the predicate calculus. In Proc. 19th IEEE Symp. Found. Comput. Sci., pp. 35-47
84. Lifschitz, V. 1985. Closed-world databases and circumscription. Artif. Intell. 27(2):229-36
85. Loganantharaj, R. 1985. Theoretical and implementational aspects of parallel link resolution in connection graphs. PhD thesis. Dep. Comput. Sci., Colo. State Univ., Fort Collins
86. Loveland, D. 1978. Automated Theorem Proving: A Logical Basis. New York: North-Holland
87. Martins, J., Shapiro, S. 1983. Reasoning in multiple belief spaces. In Proc. Int. Jt. Conf. Artif. Intell., Karlsruhe, FRG, pp. 370-73
88. McAllester, D. 1980. The Use of Equality in Deduction and Knowledge Representation. MS thesis. AI Lab., Mass. Inst. Technol., Cambridge
89. McCarthy, J. 1968. Programs with common sense. See Ref. 98a, pp. 403-18
90. McCarthy, J. 1977. Epistemological problems in artificial intelligence. In Proc. Int. Jt. Conf. Artif. Intell., Cambridge, Mass., pp. 1038-44
91. McCarthy, J. 1979. First order theories of individual concepts and propositions. In Machine Intelligence, ed. J. Hayes, D. Michie, L. Mikulich, 9:129-47. Chichester, England: Ellis Horwood
92. McCarthy, J. 1980. Circumscription--a form of non-monotonic reasoning. Artif. Intell. 13(1,2):27-39
93. McCarthy, J. 1984. Applications of circumscription to formalizing commonsense knowledge. In The Non-Monotonic Reasoning Workshop, New Paltz, NY, pp. 295-324
94. McCarthy, J., Hayes, P. 1969. Some philosophical problems from the standpoint of artificial intelligence. In Machine Intelligence, ed. B. Meltzer, D. Michie, 4:463-502. Edinburgh: Edinburgh Univ. Press
95. McDermott, D. 1982. A temporal logic for reasoning about processes and plans. Cognitive Sci. 6(2):101-55
96. McDermott, D., Doyle, J. 1980. Non-monotonic logic I. Artif. Intell. 13(1,2):41-72
97. Mendelson, E. 1964. Introduction to Mathematical Logic. New York: Van Nostrand Reinhold
98. Minker, J., Perlis, D. 1984. Protected circumscription. In Workshop on Non-Monotonic Reasoning, New Paltz, NY, pp. 337-43
98a. Minsky, M., ed. 1968. Semantic Information Processing. Cambridge, Mass: MIT Press
99. Minsky, M. 1981. A framework for representing knowledge. In Mind Design, ed. J. Haugeland, pp. 95-128. Cambridge, Mass: MIT Press
100. Moore, R. 1977. Reasoning about knowledge and action. In Proc. Int. Jt. Conf. Artif. Intell., Cambridge, Mass., pp. 223-27
101. Moore, R. 1980. Reasoning about knowledge and action. Tech. Note 191, Artif. Intell. Cent., SRI Int., Menlo Park, Calif.
102. Moore, R. 1982. The role of logic in knowledge representation and commonsense reasoning. In Proc. Natl. Conf. Am. Assoc. Artif. Intell., Pittsburgh, Pa., pp. 428-33
103. Moore, R. 1983. Semantical considerations on nonmonotonic logic. In Proc. Int. Jt. Conf. Artif. Intell., Karlsruhe, FRG, pp. 272-79
104. Moore, R., Hendrix, G. 1979. Computational models of belief and the semantics of belief sentences. Tech. Note 187, Artif. Intell. Cent., SRI Int., Menlo Park, Calif.
105. Mylopoulos, J., Levesque, H. 1983. An overview of knowledge representation. In On Conceptual Modelling: Perspectives from Artificial Intelligence, Databases, and Programming Languages, ed. M. Brodie, J. Mylopoulos, J. Schmidt, pp. 3-17. New York: Springer-Verlag
106. Nelson, G., Oppen, D. 1979. Simplification by cooperating decision procedures. ACM Trans. Program. Lang. Syst. 1(2):245-57
107. Nilsson, N. 1980. Principles of Artificial Intelligence. Palo Alto, Calif: Tioga
108. Norman, D., Bobrow, D. 1975. On data limited and resource limited processing. Cognitive Psychol. 7:44-64
109. Norman, D., Rumelhart, D., eds. 1975. Explorations in Cognition. San Francisco: W. H. Freeman
110. Patel-Schneider, P. 1985. A decidable first-order logic for knowledge representation. In Proc. Int. Jt. Conf. Artif. Intell., Los Angeles, pp. 455-58
111. Pentland, A., Fischler, M. 1983. A more rational view of logic. AI Mag. 4(4):15-18
112. Quillian, M. 1967. Word concepts: a theory and simulation of some basic semantic capabilities. Behav. Sci. 12:410-30
113. Quillian, M. 1968. Semantic memory. See Ref. 98a, pp. 227-70
114. Rantala, V. 1982. Impossible world semantics and logical omniscience. Acta Philos. Fenn. 35:106-15
115. Raphael, B. 1971. The frame problem in problem solving systems. In Artificial Intelligence and Heuristic Programming. New York: American Elsevier
116. Reiter, R. 1978. On closed world databases. See Ref. 51a, pp. 55-76
117. Reiter, R. 1978. On reasoning by default. In Proc. Conf. Theor. Issues Nat. Lang. Process., Univ. Ill., Urbana-Champaign
118. Reiter, R. 1980. A logic for default reasoning. Artif. Intell. 13(1,2):81-132
119. Reiter, R. 1982. Circumscription implies predicate completion (sometimes). In Proc. Natl. Conf. Am. Assoc. Artif. Intell., Pittsburgh, Pa., pp. 418-20
120. Rich, C. 1980. Knowledge representation languages and predicate calculus: how to have your cake and eat it too. In Proc. Natl. Conf. Am. Assoc. Artif. Intell., Pittsburgh, Pa., pp. 193-96
121. Rieger, C. 1976. An organization of knowledge for problem solving and language comprehension. Artif. Intell. 7(2):89-127
122. Rosch, E., Mervis, C. 1975. Family resemblances: studies in the internal structure of categories. Cognitive Psychol. 7:573-605
123. Rosenschein, S. 1986. Formal theories of knowledge in AI and robotics. New Generation Comput. 3(4): In press
124. Schank, R. 1973. Identification of conceptualizations underlying natural language. In Computer Models of Thought and Language, ed. R. Schank, K. Colby, pp. 187-247. San Francisco: W. H. Freeman
125. Schank, R. 1975. Conceptual Informa-