Ch07 Knowledge Repr Modified

The document discusses different approaches to knowledge representation including semantic networks, conceptual graphs, case frames, and conceptual dependency theory. It describes how each approach represents concepts, relationships between concepts, and meanings of words. Examples are provided to illustrate each approach.


7 Knowledge Representation

7.0 Issues in Knowledge Representation
7.1 A Brief History of AI Representational Systems
7.2 Conceptual Graphs: A Network Language
7.3 Alternatives to Explicit Representation
7.4 Agent Based and Distributed Problem Solving
7.5 Epilogue and References
7.6 Exercises

1
Chapter Objectives

• Learn different formalisms for Knowledge Representation (KR)
• Learn about representing concepts in a canonical form
• Compare KR formalisms to predicate calculus
• The agent model: transforms percepts and the results of its own actions into an internal representation

2
“Shortcomings” of logic
• Emphasis on truth-preserving operations rather than the nature of human reasoning (or natural language understanding)
• If-then relationships do not always reflect how humans would see them:

∀X (cardinal(X) ⇒ red(X))
∀X (¬red(X) ⇒ ¬cardinal(X))

• Associations between concepts are not always clear:
snow: cold, white, snowman, slippery, ice, drift, blizzard
• Note, however, that the issue here is clarity or ease of understanding rather than expressiveness.
3
Network representation of properties of
snow and ice

4
Semantic network developed by Collins
and Quillian (Harmon and King 1985)

5
Meanings of words (concepts)

The plant did not seem to be in good shape.

Bill had been away for several days and nobody watered it.

OR

The workers had been on strike for several days and regular maintenance was not carried out.

6
Three planes representing three
definitions of the word “plant” (Quillian 1967)

7
Intersection path between “cry” and
“comfort” (Quillian 1967)

8
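Quillian's intersection search spreads activation outward from two concepts until the waves of activation meet. A minimal sketch follows; the toy network and its associations are purely illustrative, not Quillian's actual data:

```python
from collections import deque

# A toy semantic network as an adjacency map: concept -> associated concepts.
network = {
    "cry": ["sad", "tears", "sound"],
    "comfort": ["console", "sad", "ease"],
    "sad": ["emotion"],
    "tears": ["water"],
    "sound": ["noise"],
    "console": ["help"],
    "ease": ["relax"],
    "emotion": [], "water": [], "noise": [], "help": [], "relax": [],
}

def intersection_search(net, a, b):
    """Breadth-first spreading activation from both concepts;
    return the first concept reached from both sides (the 'intersection')."""
    seen_a, seen_b = {a}, {b}
    frontier_a, frontier_b = deque([a]), deque([b])
    while frontier_a or frontier_b:
        for seen, frontier, other_seen in ((seen_a, frontier_a, seen_b),
                                           (seen_b, frontier_b, seen_a)):
            if not frontier:
                continue
            node = frontier.popleft()
            for nbr in net.get(node, []):
                if nbr in other_seen:
                    return nbr          # reached from both sides
                if nbr not in seen:
                    seen.add(nbr)
                    frontier.append(nbr)
    return None

print(intersection_search(network, "cry", "comfort"))   # sad
```

The path through the intersection node ("cry" is associated with "sad", and so is "comfort") is what connects the meanings of the two words.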
“Case” oriented representation schemes

• Focus on the case structure of English verbs
• Case relationships include:
  agent, object, instrument, location, time
• Two approaches:
 case frames: a sentence is represented as a verb node, with various case links to nodes representing other participants in the action
 conceptual dependency theory: the situation is classified as one of the standard action types; actions have conceptual cases (e.g., actor, object)
9
Case frame representation of
“Sarah fixed the chair with glue.”

10
Conceptual Dependency Theory

• Developed by Schank, starting in 1968
• Tried to get as far away from language as possible, embracing canonical form and proposing an interlingua
• Borrowed:
 from Colby and Abelson, the terminology that sentences reflect conceptualizations, which combine concepts
 from case theory, the idea of cases, but assigned these to underlying concepts rather than to linguistic units (e.g., verbs)
 from the dependency grammar of David Hays, the idea of dependency

11
Basic idea

• Consider the following story:

Mary went to the playroom when she heard Lily crying.
Lily said, “Mom, John hit me.”
Mary turned to John, “You should be gentle to your little sister.”
“I’m sorry mom, it was an accident, I should not have kicked the ball towards her,” John replied.

• What are the facts we know after reading this?

12
Basic idea (cont’d)

The story:
  Mary went to the playroom when she heard Lily crying.
  Lily said, “Mom, John hit me.”
  Mary turned to John, “You should be gentle to your little sister.”
  “I’m sorry mom, it was an accident, I should not have kicked the ball towards her,” John replied.

The facts:
  Mary’s location changed.
  Lily was sad; she was crying.
  John hit Lily (with an unknown object).
  John is Lily’s brother.
  John is taller (bigger) than Lily.
  John kicked a ball; the ball hit Lily.

13
“John hit the cat.”

• First, classify the situation as of type Action.
• Actions have conceptual cases; e.g., all actions require:
 Act (the particular type of action)
 Actor (the responsible party)
 Object (the thing acted upon)

ACT: [apply a force], or PROPEL
ACTOR: john
OBJECT: cat

john ⇔ *PROPEL* —o→ cat

14
Conceptual dependency theory

Four primitive conceptualizations:


• ACTs actions
• PPs objects (picture producers)
• AAs modifiers of actions (action aiders)
• PAs modifiers of objects (picture aiders)

15
Conceptual dependency theory (cont’d)
Primitive acts:
• ATRANS transfer a relationship (give)
• PTRANS transfer of physical location of an object (go)
• PROPEL apply physical force to an object (push)
• MOVE move body part by owner (kick)
• GRASP grab an object by an actor (grasp)
• INGEST ingest an object by an animal (eat)
• EXPEL expel from an animal’s body (cry)
• MTRANS transfer mental information (tell)
• MBUILD mentally make new information (decide)
• CONC conceptualize or think about an idea (think)
• SPEAK produce sound (say)
• ATTEND focus sense organ (listen)
16
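As a rough sketch (the class and field names below are our own, not Schank's notation), the primitive acts and an action-type conceptualization might be modeled as:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Act(Enum):
    """Schank's primitive acts."""
    ATRANS = "transfer a relationship"
    PTRANS = "transfer physical location of an object"
    PROPEL = "apply physical force to an object"
    MOVE = "move body part by owner"
    GRASP = "grab an object by an actor"
    INGEST = "ingest an object by an animal"
    EXPEL = "expel from an animal's body"
    MTRANS = "transfer mental information"
    MBUILD = "mentally make new information"
    CONC = "conceptualize or think about an idea"
    SPEAK = "produce sound"
    ATTEND = "focus sense organ"

@dataclass
class Conceptualization:
    """An action-type conceptualization with its conceptual cases."""
    act: Act
    actor: str
    obj: Optional[str] = None
    recipient: Optional[str] = None
    donor: Optional[str] = None

# Two surface sentences with the same meaning, e.g. "John gave the book
# to Pat" and "Pat received the book from John", decompose to the same
# underlying form:
give = Conceptualization(Act.ATRANS, actor="john", obj="book",
                         recipient="pat", donor="john")
receive = Conceptualization(Act.ATRANS, actor="john", obj="book",
                            recipient="pat", donor="john")
print(give == receive)   # True: one canonical representation
```

Because the dataclass compares by value, the canonical-form claim is directly testable: equivalent sentences produce equal objects.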
Basic conceptual dependencies

17
Examples with the basic conceptual
dependencies

18
Examples with the basic conceptual
dependencies (cont’d)

19
CD is a decompositional approach

“John took the book from Pat.”

John ⇔ *ATRANS* —o→ book —R→ (to: John, from: Pat)

The above form also represents:

“Pat received the book from John.”

The representation analyzes surface forms into an underlying structure, in an attempt to capture common meaning elements.

20
CD is a decompositional approach

“John gave the book to Pat.”

John ⇔ *ATRANS* —o→ book —R→ (to: Pat, from: John)

Note that only the donor and recipient have changed.

21
Ontology

• Situations were divided into several types:
 Actions
 States
 State changes
 Causals

• There wasn’t much of an attempt to classify objects

22
“John ate the egg.”

23
“John prevented Mary from giving a
book to Bill”

24
Representing Picture Aiders (PAs) or states

thing <> state-type (state-value)

• “The ball is red” ball <> color (red)


• “John is 6 feet tall” john <> height (6 feet)
• “John is tall” john <> height (>average)
• “John is taller than Jane”
john <> height (X)
jane <> height (Y)
X>Y

25
More PA examples

• “John is angry.”      john ⇔ anger (5)
• “John is furious.”    john ⇔ anger (7)
• “John is irritated.”  john ⇔ anger (2)
• “John is ill.”        john ⇔ health (-3)
• “John is dead.”       john ⇔ health (-10)

Many states are viewed as points on scales.

26
Scales

• There should be lots of scales


 The numbers themselves were not meant to be taken
seriously
 But that lots of different terms differ only in how they
refer to scales was

• An interesting question is which semantic


objects are there to describe locations on a
scale?
For instance, modifiers such as “very”,
“extremely” might have an interpretation as
“toward the end of a scale.”

27
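The scale idea can be sketched in code. The lexicon entries and modifier shifts below are illustrative inventions (the slides stress that the numbers themselves are not to be taken seriously):

```python
# Lexical items mapped to (scale, position-on-scale) pairs.
LEXICON = {
    "irritated": ("anger", 2),
    "angry":     ("anger", 5),
    "furious":   ("anger", 7),
    "ill":       ("health", -3),
    "dead":      ("health", -10),
}

# A modifier like "very" or "extremely" read as "toward the end of the scale".
MODIFIERS = {"very": 2, "extremely": 4}

def state_of(word, modifier=None, scale_max=10):
    """Interpret a word (plus optional modifier) as a point on a scale."""
    scale, value = LEXICON[word]
    if modifier:
        shift = MODIFIERS[modifier]
        # Push the value further from zero, clamped to the scale ends.
        value = value + shift * (1 if value >= 0 else -1)
        value = max(-scale_max, min(scale_max, value))
    return (scale, value)

print(state_of("angry"))              # ('anger', 5)
print(state_of("angry", "very"))      # ('anger', 7)
print(state_of("ill", "extremely"))   # ('health', -7)
```

Note how "very angry" and "furious" land on the same point of the anger scale, which is exactly the observation the slide makes about terms differing only in how they refer to scales.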
Scales (cont’d)

• What is “John grew an inch.”?
• This is supposed to be a state change: somewhat like an action, but with no responsible agent posited.

         height (X+1)
John ⇐Ξ
         height (X)

28
Variations on the story of the poor cat

“John applied a force to the cat by moving some object to come in contact with the cat.”

John ⇔ *PROPEL* —o→ cat
   ↑ i
John ⇔ *PTRANS* —o→ [ ] —→ loc(cat)

The arrow labeled ‘i’ denotes the instrumental case.

29
Variations on the cat story (cont’d)

“John kicked the cat.”

John ⇔ *PROPEL* —o→ cat
   ↑ i
John ⇔ *PTRANS* —o→ foot —→ loc(cat)

kick = hit with one’s foot

30
Variations on the cat story (cont’d)

“John hit the cat.”

John ⇔ *PROPEL* —o→ cat
   ↓ result

cat ⇐ health (-2)

Hitting was detrimental to the cat’s health.

31
Causals

“John hurt Jane.”

John ⇔ DO —o→ Jane
   ↓ result

         pain (> X)
Jane ⇐
         pain (X)

John did something to cause Jane to become hurt.

32
Causals (cont’d)

“John hurt Jane by hitting her.”

John ⇔ PROPEL —o→ Jane
   ↓ result

         pain (> X)
Jane ⇐
         pain (X)

John hit Jane to cause Jane to become hurt.

33
How about?

“John killed Jane.”


“John frightened Jane.”
“John likes ice cream.”

34
“John killed Jane.”

John <> *DO*


<

Health(-10)
Jane <
Health(> -10)

35
“John frightened Jane.”

John <> *DO*


<

Fear (> X)
Jane <
Fear (X)

36
“John likes ice cream.”

John ⇔ *INGEST* —o→ IceCream
   ↓ result

         joy (> X)
John ⇐
         joy (X)

37
Comments on CD theory

• An ambitious attempt to represent information in a language-independent way:
 a formal theory of natural language semantics; reduces problems of ambiguity
 canonical form: semantically equivalent sentences are internally syntactically identical
 decomposition addresses problems in case theory by revealing underlying conceptual structure; relations hold between concepts, not between linguistic elements
 prospects for machine translation are improved

38
Comments on CD theory (cont’d)

The major problem is incompleteness:
 no quantification
 no hierarchy for objects (and actions); everything is a primitive
 are those the right primitives?
 is there such a thing as a conceptual primitive? (e.g., MOVE is complex to a physiologist)
 how far should the inferences be carried? CD didn’t explicitly include logical entailments such as “hit” entails “being touched”, or “bought” entails having been at a store
 fuzzy logic? Many linguistic details are very lexically dependent, e.g., “likely” vs. “probably”
 still not well studied/understood; a more convincing methodology never arrived

39
Understanding stories about restaurants

John went to a restaurant last night. He ordered steak. When he paid, he noticed he was running out of money. He hurried home since it had started to rain.

Did John eat dinner?
Did John pay by cash or credit card?
What did John buy?
Did he stop at the bank on the way home?

40
Restaurant stories (cont’d)

Sue went out to lunch. She sat at a table and called a waitress, who brought her a menu. She ordered a sandwich.

Was Sue at a restaurant?
Why did the waitress bring Sue a menu?
Who does “she” refer to in the last sentence?

41
Restaurant stories (cont’d)

Kate went to a restaurant. She was shown to a table and ordered steak from a waitress. She sat there and waited for a long time. Finally, she got mad and left.

Who does “she” refer to in the third sentence?
Why did Kate wait?
Why did she get mad? (might not be in the “script”)

42
Restaurant stories (cont’d)

John visited his favorite restaurant on the way to the concert. He was pleased by the bill because he liked Mozart.

Which bill? (which “script” to choose: restaurant or concert?)

43
Scripts

• Entry conditions: conditions that must be true for the script to be called.
• Results: conditions that become true once the script terminates.
• Props: “things” that support the content of the script.
• Roles: the actions that the participants perform.
• Scenes: a presentation of a temporal aspect of a script.

44
A RESTAURANT script

Script: RESTAURANT
Track: coffee shop
Props: tables, menu, F = food, check, money
Roles: S = customer
       W = waiter
       C = cook
       M = cashier
       O = owner

45
A RESTAURANT script (cont’d)

Entry conditions: S is hungry
                  S has money
Results: S has less money
         O has more money
         S is not hungry
         S is pleased (optional)

46
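A minimal sketch of how such a script might be stored and then used to answer questions a story never states explicitly; the dictionary layout and helper names below are our own, not Schank and Abelson's notation:

```python
# The RESTAURANT script from the slides, as a plain data structure.
RESTAURANT = {
    "track": "coffee shop",
    "props": ["tables", "menu", "food", "check", "money"],
    "roles": ["customer", "waiter", "cook", "cashier", "owner"],
    "entry_conditions": ["customer is hungry", "customer has money"],
    "results": ["customer has less money", "owner has more money",
                "customer is not hungry"],
}

def script_applies(script, facts):
    """A script can be 'called' only when all its entry conditions hold."""
    return all(cond in facts for cond in script["entry_conditions"])

def run_script(script, facts):
    """Firing the script adds its results to the known facts.  This is
    how script-based systems answer questions the story never states
    (e.g., 'Did John eat dinner?')."""
    if not script_applies(script, facts):
        return facts
    return facts | set(script["results"])

facts = {"customer is hungry", "customer has money"}
facts = run_script(RESTAURANT, facts)
print("customer is not hungry" in facts)   # True: inferred, not stated
```

Scenes could be added as an ordered list of event sequences; the sketch keeps only the parts needed to show entry-condition checking and result inference.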
A RESTAURANT script (cont’d)

47
A RESTAURANT script (cont’d)

48
A RESTAURANT script (cont’d)

49
Frames

Frames are similar to scripts: they organize stereotypic situations.

Information in a frame:
• Frame identification
• Relationship to other frames
• Descriptors of the requirements
• Procedural information
• Default information
• New instance information

50
Part of a frame description of a hotel
room

51
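A frame with default information and links to other frames can be sketched as follows. The hotel-room slots here are illustrative stand-ins for the figure, not the book's exact frame:

```python
# A tiny frame system: each frame names its more general frame ("is_a")
# and carries default slot values.
FRAMES = {
    "room": {
        "is_a": None,
        "defaults": {"walls": 4, "has_door": True},
    },
    "hotel_room": {
        "is_a": "room",
        "defaults": {"contains": ["hotel bed", "hotel phone"]},
    },
}

def slot_value(frames, frame_name, slot):
    """Look up a slot, inheriting default information from the
    relationship to more general frames when it is missing locally."""
    while frame_name is not None:
        frame = frames[frame_name]
        if slot in frame["defaults"]:
            return frame["defaults"][slot]
        frame_name = frame["is_a"]
    return None

print(slot_value(FRAMES, "hotel_room", "walls"))   # 4, inherited from "room"
```

Procedural information (the "if-needed" and "if-added" attachments of full frame systems) could be added by storing callables in the slots; the sketch shows only identification, frame-to-frame links, and defaults.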
Conceptual graphs

A finite, connected, bipartite graph.

Nodes: either concepts or conceptual relations.
Arcs: no labels; they represent relations between concepts.
Concepts: concrete (e.g., book, dog) or abstract (e.g., like).

52
Conceptual relations of different arities

flies: a unary relation
  [bird] → (flies)

color: a binary relation
  [dog] → (color) → [brown]

parents: a ternary relation
  [child] → (parents), with arcs to [father] and [mother]
53
“Mary gave John the book.”

54
Conceptual graphs involving a brown dog
Conceptual graph indicating that the dog named emma is brown:

Conceptual graph indicating that a particular (but unnamed) dog is brown:

Conceptual graph indicating that a dog named emma is brown:

55
Conceptual graph of a person with three
names

56
“The dog scratches its ear with its paw.”

57
The type hierarchy

A partial ordering on the set of types:

t ≤ s

where t is a subtype of s, and s is a supertype of t.
If t ≤ s and t ≤ u, then t is a common subtype of s and u.
If s ≤ v and u ≤ v, then v is a common supertype of s and u.
Notions of: minimal common supertype
            maximal common subtype

58
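The minimal common supertype can be computed over an explicit type lattice. The lattice contents and function names below are illustrative:

```python
# Each type maps to its immediate supertypes; "universal" is the top of
# the lattice and "absurd" the bottom.
SUPERTYPES = {
    "absurd": ["dog", "cat"],
    "dog": ["carnivore"],
    "cat": ["carnivore"],
    "carnivore": ["animal"],
    "animal": ["universal"],
    "universal": [],
}

def ancestors(t):
    """All supertypes of t, including t itself."""
    result, stack = set(), [t]
    while stack:
        cur = stack.pop()
        if cur not in result:
            result.add(cur)
            stack.extend(SUPERTYPES[cur])
    return result

def minimal_common_supertype(s, u):
    """The common supertype that is not a strict supertype of any other
    common supertype (unique when the type order is a lattice)."""
    common = ancestors(s) & ancestors(u)
    for t in common:
        if all(t == o or t not in ancestors(o) for o in common):
            return t

print(minimal_common_supertype("dog", "cat"))   # carnivore
```

The maximal common subtype is the dual computation, walking the lattice downward instead of upward.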
A lattice of subtypes, supertypes, the
universal type, and the absurd type

(Figure: a lattice of subtypes and supertypes — types r, s, u, v, w — with the universal type at the top and the absurd type at the bottom.)

59
Four graph operations

• copy: make an exact copy of a graph
• restrict: replace a concept node with a node representing its specialization
• join: combine graphs based on identical nodes
• simplify: delete duplicate relations

60
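The restrict, join, and simplify operations can be sketched over a simple graph encoding. The representation (a concept table plus a relation list) is our own simplification, not Sowa's formal definition:

```python
import copy as copy_mod

def make_graph(concepts, relations):
    # concepts: id -> (type, referent); relations: list of (name, arg_ids)
    return {"concepts": dict(concepts), "relations": list(relations)}

def restrict(g, node_id, new_type=None, new_referent=None):
    """Specialize a concept node: replace its type by a subtype, or give
    a generic concept an individual referent."""
    g = copy_mod.deepcopy(g)
    t, r = g["concepts"][node_id]
    g["concepts"][node_id] = (new_type or t, new_referent or r)
    return g

def join(g1, g2, id1, id2):
    """Merge g2 into g1 on two identical concept nodes."""
    assert g1["concepts"][id1] == g2["concepts"][id2], "nodes must match"
    g = copy_mod.deepcopy(g1)
    for cid, c in g2["concepts"].items():
        if cid != id2:
            g["concepts"][cid] = c
    for name, args in g2["relations"]:
        g["relations"].append((name, tuple(id1 if a == id2 else a
                                           for a in args)))
    return g

def simplify(g):
    """Delete duplicate relations, e.g. those produced by a join."""
    g = copy_mod.deepcopy(g)
    g["relations"] = list(dict.fromkeys(g["relations"]))
    return g

# [dog] -> (color) -> [brown]
g1 = make_graph({"c1": ("dog", None), "c2": ("brown", None)},
                [("color", ("c1", "c2"))])
g2 = restrict(g1, "c1", new_referent="emma")   # the dog is now emma
joined = join(g2, g2, "c1", "c1")              # duplicates the color relation
print(len(simplify(joined)["relations"]))      # 1: duplicate removed
```

Copy is just `copy_mod.deepcopy`; joining a graph with itself is the simplest way to see why simplify is needed afterwards.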
Restriction

61
Join

62
Simplify

63
Inheritance in conceptual graphs

64
“Tom believes that Jane likes pizza.”

[person: tom] —(experiencer)— [believe] —(object)— [proposition]

where the proposition contains:

[person: jane] —(agent)— [likes] —(object)— [pizza]
65
“There are no pink dogs.”

66
Translate into English

[person: john] —(agent)— [eat] —(object)— [pizza]
[eat] —(instrument)— [hand] —(part)— [person: john]

67
Translate into English

(between) is a ternary relation linking:
  [person]
  arc 1 → [rock]
  arc 2 → [place] —(attr)— [hard]

68
Translate into English

69
Algorithm to convert a conceptual graph, g,
to a predicate calculus expression
1. Assign a unique variable, x1, x2, …, xn, to each one of
the n generic concepts in g.
2. Assign a unique constant to each individual constant
in g. This constant may simply be the name or marker
used to indicate the referent of the concept.
3. Represent each concept by a unary predicate with the
same name as the type of that node and whose argument
is the variable or constant given that node.
4. Represent each n-ary conceptual relation in g as an n-
ary predicate whose name is the same as the relation. Let
each argument of the predicate be the variable or
constant assigned to the corresponding concept node
linked to that relation.
5. Take the conjunction of all the atomic sentences formed under 3 and 4. This is the body of the predicate calculus expression. All the variables in the expression are existentially quantified.

70
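The five steps can be sketched directly in code. The graph encoding below (a concept list plus a relation list) is an illustrative simplification:

```python
def graph_to_pc(concepts, relations):
    """concepts: list of (type, referent-or-None);
    relations: list of (name, [concept indices]).
    Returns a predicate calculus expression as a string."""
    terms, variables = [], []
    # Steps 1-2: variables for generic concepts, constants for individuals.
    for ctype, referent in concepts:
        if referent is None:
            var = f"X{len(variables) + 1}"
            variables.append(var)
            terms.append(var)
        else:
            terms.append(referent)
    # Step 3: a unary predicate per concept node.
    atoms = [f"{ctype}({terms[i]})"
             for i, (ctype, _) in enumerate(concepts)]
    # Step 4: an n-ary predicate per relation node.
    atoms += [f"{name}({', '.join(terms[i] for i in args)})"
              for name, args in relations]
    # Step 5: conjoin, then existentially quantify the variables.
    body = " ∧ ".join(atoms)
    prefix = "".join(f"∃{v} " for v in variables)
    return prefix + body

# [dog: emma] -> (color) -> [brown]  (the slides' example graph)
print(graph_to_pc([("dog", "emma"), ("brown", None)],
                  [("color", [0, 1])]))
# ∃X1 dog(emma) ∧ brown(X1) ∧ color(emma, X1)
```

The conjunct order differs from the worked example on the next slide (concept atoms before relation atoms), but conjunction is commutative, so the expression is logically the same.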
Example conversion

1. Assign variables to generic concepts:    X1
2. Assign constants to individual concepts: emma
3. Represent each concept node:             dog(emma), brown(X1)
4. Represent each n-ary relation:           color(emma, X1)
5. Take the conjunction of all the predicates from 3 and 4:
   dog(emma) ∧ color(emma, X1) ∧ brown(X1)
All the variables are existentially quantified:
   ∃X1 (dog(emma) ∧ color(emma, X1) ∧ brown(X1))

71
Universal quantification

A cat is on a mat:

[cat] → (on) → [mat]

Every cat is on a mat:

[cat: ∀] → (on) → [mat]

72
