AI KB Systems

The document discusses knowledge-based systems and knowledge representation in artificial intelligence. It provides a history of AI and knowledge-based systems, and it covers using logic for knowledge representation and propositional logic syntax, semantics, and inference rules such as modus ponens and modus tollens.

Knowledge-Based Systems

Announcements
• Review sessions
• CS 4701 – focus on AI
Schedule
• Search
• Machine learning
• Knowledge-based systems
• Discovery
History of AI
1943 – 1969 The Beginnings
1943 McCulloch and Pitts show networks of neurons can compute and learn any function
1950 Shannon and Turing wrote chess programs
1951 Minsky and Edmonds build the first neural network computer (SNARC)
1956 Dartmouth Conference – Newell and Simon brought a reasoning program, “The Logic Theorist”, which proved theorems
1952 Samuel’s checkers player
1958 McCarthy designed LISP, helped invent time-sharing, and created Advice Taker (a domain-independent reasoning system)
1960’s Microworlds – solving limited problems: SAINT (1963), ANALOGY (1968), STUDENT (1967); blocksworld invented
1962 Perceptron Convergence Theorem is proved
1952 Samuel’s checkers player on TV
Arthur Samuel (1901–1990)
Example ANALOGY Problem
Blocksworld
History of AI
1966 – 1974 Recognizing Lack of Knowledge

• Herb Simon (1957): Computer chess program will be world chess champion within 10 years.
• Intractable problems, lack of computing power (Lighthill Report, 1973)
• Machine translation
• Limitations in knowledge representation (Minsky and Papert, 1969)
⇒ Knowledge-poor programs
Knowledge Representation
• Human intelligence relies on a lot of background knowledge
– the more you know, the easier many tasks become
– “knowledge is power”
– E.g. SEND + MORE = MONEY puzzle.
• Natural language understanding
– Time flies like an arrow.
– Fruit flies like a banana.
– John saw the diamond through the window and coveted it.
– John threw the brick through the window and broke it.
– The spirit is willing but the flesh is weak. (English)
– The vodka is good but the meat is rotten. (Russian)
• Or: Plan a trip to L.A.
Domain knowledge
• How did we encode domain knowledge so far?
–For search problems?
–For learning problems?
Knowledge-Based Systems/Agents
• Key components:
– Knowledge base: a set of sentences expressed in some
knowledge representation language
– Inference/reasoning mechanisms to query what is
known and to derive new information or make
decisions.
• Natural candidate:
– logical language (propositional/first-order)
– combined with a logical inference mechanism
• How close to human thought?
– In any case, appears reasonable strategy for machines.
Example: Autonomous Car
State: k-tuple
(PersonInFrontOfCar, Policeman, Policecar, Slippery,
YellowLight, RedLight)
Actions:
Brake, Accelerate, TurnLeft, etc.
Knowledge-base describing when the car should brake:
( PersonInFrontOfCar ⇒ Brake )
((( YellowLight ∧ Policeman ) ∧ ( Slippery )) ⇒ Brake )
( Policecar ⇒ Policeman )
( Snow ⇒ Slippery )
( ¬Slippery ⇒ Dry )
( RedLight ⇒ Brake )
Does (Policecar, YellowLight, Snow) imply Brake?
A=Yes B= No
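One way to check the answer mechanically, as a minimal sketch (not part of the slides; the Python encoding of the rules is my assumption), is to enumerate all truth assignments and test whether Brake holds in every model of the KB plus the three observations:

```python
# Minimal sketch (not from the slides): check whether the car KB together
# with the observations Policecar, YellowLight, Snow entails Brake, by
# enumerating all 2^9 truth assignments (models).
from itertools import product

SYMBOLS = ["PersonInFrontOfCar", "Policeman", "Policecar", "Slippery",
           "YellowLight", "RedLight", "Snow", "Dry", "Brake"]

def implies(p, q):
    return (not p) or q

def kb(m):
    """True iff assignment m satisfies all six rules plus the observations."""
    return (implies(m["PersonInFrontOfCar"], m["Brake"]) and
            implies(m["YellowLight"] and m["Policeman"] and m["Slippery"], m["Brake"]) and
            implies(m["Policecar"], m["Policeman"]) and
            implies(m["Snow"], m["Slippery"]) and
            implies(not m["Slippery"], m["Dry"]) and
            implies(m["RedLight"], m["Brake"]) and
            m["Policecar"] and m["YellowLight"] and m["Snow"])

models = (dict(zip(SYMBOLS, vals))
          for vals in product([False, True], repeat=len(SYMBOLS)))
print(all(m["Brake"] for m in models if kb(m)))  # True -> Brake is entailed
```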
What the computer “sees”:
State: k-tuple
(x1, x2, x3, x4, x5, x6, x7)
Actions:
x8, x9, x10, etc.
Knowledge-base describing when x8:
( x1 ⇒ x8 )
((( x5 ∧ x2 ) ∧ ( x4 )) ⇒ x8 )
( x3 ⇒ x2 )
( x7 ⇒ x4 )
( ¬x4 ⇒ x11 )
( x6 ⇒ x8 )

Does (x3, x5, x7) imply x8?


A=Yes B= No
Logic as a Knowledge Representation
• Components of a Formal Logic:
– variables and operators, syntax
– semantics (link to the world, truth in worlds)
– logical reasoning: entailment α ⊨ β
• if, in every model in which α is true, β is also true
– inference algorithm derives
• KB ⊢ α, i.e., α is derived from KB

(x+y=4) entails that

A) x=2 ∧ y=2   B) 2x+2y=8   C) Neither   D) Both
Models
• A model is an instantiation of all variables
• All models = all possible assignments
• If sentence α is true in model m, then m is a model of α
• M(α) refers to the set of all models that satisfy α
• α ⊨ β iff M(α) ⊆ M(β), i.e., M(α) is contained in M(β)
Possible models for the presence of pits in [1,2] [2,2] [3,1]
Dashed = M(α1) where α1 = ¬P1,2 (no pit in [1,2])
Solid = M(KB) with observation of ¬B1,1 ∧ B2,1 (no breeze in [1,1] and breeze in [2,1])
Possible models for the presence of pits in [1,2] [2,2] [3,1]
Dashed = M(α2) where α2 = ¬P2,2 (no pit in [2,2])
Solid = M(KB) with observation of ¬B1,1 ∧ B2,1 (no breeze in [1,1] and breeze in [2,1])
Soundness and Completeness
Soundness:
An inference algorithm that derives only entailed
sentences is called sound or truth-preserving.
KB ⊢ α implies KB ⊨ α
Completeness:
An inference algorithm is complete if it can derive
any sentence that is entailed.
KB = α implies KB  α

Why are soundness and completeness important?

⇒ They allow the computer to ignore semantics and “just push symbols”!
Entailment vs. Implication
• Entailment (KB ⊨ α) and implication (KB ⊢ α) can be treated equivalently if the inference process is sound and complete.
Propositional Logic: Syntax
• Propositional Symbols
– A, B, C, …
• Connectives
– , , , , 
• Sentences
– Atomic Sentence: True, False, Propositional Symbol
– Complex Sentence:
• ( ¬ Sentence )
• ( Sentence ∨ Sentence )
• ( Sentence ∧ Sentence )
• ( Sentence ⇒ Sentence )
• ( Sentence ⇔ Sentence )
• A KB is a conjunction (ANDs) of many sentences
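To make this syntax concrete, here is one possible machine encoding, an illustrative sketch rather than anything from the slides: sentences as nested tuples, with a recursive evaluator that computes their truth value in a model.

```python
# Illustrative sketch (not from the slides): complex sentences as nested
# tuples, e.g. ("=>", "RedLight", "Brake"), evaluated against a model.

def holds(sentence, model):
    """Truth value of a propositional sentence in a model (symbol -> bool)."""
    if isinstance(sentence, bool):
        return sentence                              # atomic: True / False
    if isinstance(sentence, str):
        return model[sentence]                       # atomic: propositional symbol
    op, *args = sentence                             # complex sentence
    if op == "not": return not holds(args[0], model)
    if op == "or":  return holds(args[0], model) or holds(args[1], model)
    if op == "and": return holds(args[0], model) and holds(args[1], model)
    if op == "=>":  return (not holds(args[0], model)) or holds(args[1], model)
    if op == "<=>": return holds(args[0], model) == holds(args[1], model)
    raise ValueError(f"unknown connective {op!r}")

# ( RedLight => Brake ) is true in a model where both symbols are true:
print(holds(("=>", "RedLight", "Brake"), {"RedLight": True, "Brake": True}))
```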
Example: Autonomous Car
Propositional Symbols
PersonInFrontOfCar, Policeman, .. Brake, Accelerate, TurnLeft
Rules (initial KB):
( PersonInFrontOfCar ⇒ Brake )
∧ ((( YellowLight ∧ Policeman ) ∧ ( Slippery )) ⇒ Brake )
∧ ( Policecar ⇒ Policeman )
∧ ( Snow ⇒ Slippery )
∧ ( ¬Slippery ⇒ Dry )
∧ ( RedLight ⇒ Brake )
From sensors (added to KB):
YellowLight
∧ ¬RedLight
∧ ¬Snow
∧ ¬Dry
∧ Policecar
∧ ¬PersonInFrontOfCar
Propositional Logic: Semantics

Models
• Model (i.e. possible world):
– Assignment of truth values to symbols
– Example: m={P=True , Q=False}
• Note: Often called “assignment” instead of “model”, and “model” is used for
an assignment that evaluates to true.
• Validity:
– A sentence α is valid, if it is true in every model.
• Satisfiability:
– A sentence α is satisfiable, if it is true in at least one model.
• Entailment:
– α ⊨ β if and only if, in every model in which α is true, β is also true.
Stay at home
Sick    StayAtHome
true    true
false   false
false   true

Does Sick entail StayAtHome?

A=Yes   B=No
Puzzling aspects of Propositional Logic
• Non-causality
– (5 is odd ⇒ Tokyo is the capital of Japan)
• True, because whenever 5 is odd, Tokyo is the capital of Japan. Nothing to do with causality.
• Statement always true when antecedent is false
– (5 is even ⇒ Sam is smart)
• True, because 5 is never even, so there are no models where this statement is incorrect, regardless of whether Sam is smart or not.
• A ⇒ B
– read: B is true whenever A is true
Propositional Logic: Semantics

Models

(PQ)  (P)  (Q)


A) True B) False
Creating a KB
• Variables
– Pi,j is true if there is a pit at position (i,j)
– Bi,j is true if there is a breeze at position (i,j)
• Knowledge
– R1: P1,1 There is no pit in [1,1]
– R2: B1,1(P1,2P2,1) Square is breezy iff next to pit
– R3: B2,1(P1,1P2,2P3,1)
• Perceptions
– R4: B1,1 There is no breeze in [1,1]
– R5: B2,1 There is breeze in [2,1]
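This KB can be checked mechanically. A minimal sketch (an assumption of mine, not the slides' code) that enumerates the models of R1–R5 reproduces the entailments pictured in the model diagrams: ¬P1,2 is entailed, ¬P2,2 is not.

```python
# Sketch (not from the slides): enumerate all models over the pit/breeze
# symbols and test what the Wumpus KB R1..R5 entails.
from itertools import product

SYMBOLS = ["P11", "P12", "P21", "P22", "P31", "B11", "B21"]

def kb(m):
    return (not m["P11"] and                                   # R1
            m["B11"] == (m["P12"] or m["P21"]) and              # R2
            m["B21"] == (m["P11"] or m["P22"] or m["P31"]) and  # R3
            not m["B11"] and                                    # R4
            m["B21"])                                           # R5

def entails(query):
    """KB |= query iff query is true in every model of the KB."""
    models = (dict(zip(SYMBOLS, vals))
              for vals in product([False, True], repeat=len(SYMBOLS)))
    return all(query(m) for m in models if kb(m))

print(entails(lambda m: not m["P12"]))  # True:  KB |= alpha1 = ~P1,2
print(entails(lambda m: not m["P22"]))  # False: KB does not entail alpha2 = ~P2,2
```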
Model Checking
• Idea:
– To test whether α ⊨ β, enumerate all models and check the truth of α and β.
– α entails β if no model exists in which α is true and β is false (i.e., (α ∧ ¬β) is unsatisfiable)
• Proof by Contradiction:
α ⊨ β if and only if the sentence (α ∧ ¬β) is unsatisfiable.
Example of model checking
P Q P QP P  (Q  P) (P  (Q  P)) Q
T T F T F F
T F F T F F
F T T F F F
F F T T T F
Models
•  |=  iff the sentence (  ) is unsatisfiable
• Prove that (-P and (Q  P))  Q
– By showing that [(-P and (Q  P))  Q] is not satisfiable
• Possible English translation:
– P=“The street is wet”
– Q=“It is raining”
– Does “The street not wet” (P) and “it is raining street is wet ” (Q  P) imply that “It is not
raining? (Q)?
• Test if [(-P and (Q  P))  Q] is satisfiable.
– It is not satisfiable (always false), therefore (-P and (Q  P)) entails  Q
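The same proof can be checked by brute force in a short sketch (assuming Python): the sentence is false in every one of the four models over {P, Q}, hence unsatisfiable.

```python
# Sketch (not from the slides): (~P & (Q => P)) & Q is false in every model,
# so it is unsatisfiable and (~P & (Q => P)) entails ~Q.
from itertools import product

def sentence(P, Q):
    return (not P) and ((not Q) or P) and Q  # ~P & (Q => P) & Q

print(any(sentence(P, Q) for P, Q in product([False, True], repeat=2)))  # False
```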
Model Checking
• Variables: one for each propositional symbol
• Domains: {true, false}
• Objective Function: (α ∧ ¬β)
• Which search algorithm works best?
Doesn’t scale well…
Inference: Reasoning with Propositional Logic
Modus Ponens: Latin for “the way that affirms by affirming”: ((α ⇒ β) ∧ α) ⊢ β

Know: If raining, then soggy courts.
And: It is raining.
Then: Soggy courts.

Modus Tollens: Latin for “the way that denies by denying”: ((α ⇒ β) ∧ ¬β) ⊢ ¬α

Know: If raining, then soggy courts.
And: No soggy courts.
Then: It is not raining.

And-Elimination: (α ∧ β) ⊢ α

Know: It is raining and soggy courts.
Then: It is raining.
Example: Forward Chaining
Knowledge-base describing when the car should brake:
( PersonInFrontOfCar ⇒ Brake )
∧ ((( YellowLight ∧ Policeman ) ∧ ( Slippery )) ⇒ Brake )
∧ ( Policecar ⇒ Policeman )
∧ ( Snow ⇒ Slippery )
∧ ( ¬Slippery ⇒ Dry )
∧ ( RedLight ⇒ Brake )
∧ ( Winter ⇒ Snow )
Observation from sensors:
YellowLight ∧ ¬RedLight ∧ ¬Snow ∧ ¬Dry ∧ Policecar ∧ ¬PersonInFrontOfCar
What can we infer?
• Policecar ∧ ( Policecar ⇒ Policeman ): Modus Ponens: Policeman
• ¬Dry ∧ ( ¬Slippery ⇒ Dry ): Modus Tollens: Slippery
• YellowLight ∧ Policeman ∧ Slippery ∧ ((( YellowLight ∧ Policeman ) ∧ ( Slippery )) ⇒ Brake ): Modus Ponens: Brake
• YellowLight ∧ ¬RedLight: And-Elimination: YellowLight

Inferring (¬Winter) from (¬Snow ∧ ( Winter ⇒ Snow )) is

A) Modus Ponens   B) Modus Tollens   C) And-Elimination
Other rules
Inference Strategy: Forward Chaining
Idea:
– Infer everything that can be inferred.
– Notation: In an implication α ⇒ β, we say that
• α (or its conjuncts) are called premises,
• β is called the consequent/conclusion.
Forward Chaining:
Given a fact p to be added to the KB,
1. Find all implications I that have p as a premise.
2. For each i in I whose premises all hold,
a) Add the consequent of i to the KB.
Continue until no more facts can be inferred (see the sketch below).
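A minimal forward chainer, as a hedged sketch: the rule format (premise list, conclusion) is my own assumption, and negated facts such as ¬Dry are omitted, so only the modus ponens steps are covered (Snow is given directly for illustration).

```python
# Sketch (not from the slides) of forward chaining over positive rules:
# each rule is (premises, conclusion); fire every rule whose premises are
# all known, and repeat until no new fact can be inferred.

def forward_chain(rules, facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)     # add the consequent to the KB
                changed = True
    return facts

RULES = [(["PersonInFrontOfCar"], "Brake"),
         (["YellowLight", "Policeman", "Slippery"], "Brake"),
         (["Policecar"], "Policeman"),
         (["Snow"], "Slippery"),
         (["RedLight"], "Brake"),
         (["Winter"], "Snow")]

print(forward_chain(RULES, ["YellowLight", "Policecar", "Snow"]))
# -> also contains Policeman, Slippery, Brake
```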
Inference Strategy: Backward Chaining
Idea:
– Check whether a particular fact q is true.

Backward Chaining:
Given a fact q to be “proven”,
1. See if q is already in the KB. If so, return TRUE.
2. Find all implications, I, whose conclusion “matches” q.
3. Recursively establish the premises of all i in I via backward chaining.
⇒ Avoids inferring unrelated facts.
Example: Backward Chaining
Knowledge-base describing when the car should brake:
( PersonInFrontOfCar ⇒ Brake )
∧ ((( YellowLight ∧ Policeman ) ∧ ( Slippery )) ⇒ Brake )
∧ ( Policecar ⇒ Policeman )
∧ ( Snow ⇒ Slippery )
∧ ( ¬Slippery ⇒ Dry )
∧ ( RedLight ⇒ Brake )
∧ ( Winter ⇒ Snow )
Observation from sensors:
YellowLight ∧ ¬RedLight ∧ ¬Snow ∧ ¬Dry ∧ Policecar ∧ ¬PersonInFrontOfCar
Should the agent brake (i.e. can “brake” be inferred)?
• Goal: Brake
– Modus Ponens (Brake): PersonInFrontOfCar
• Failure: ¬PersonInFrontOfCar ⇒ Backtracking
• Goal: Brake
– Modus Ponens (Brake): YellowLight ∧ Policeman ∧ Slippery
– Known (YellowLight): Policeman ∧ Slippery
– Modus Ponens (Policeman): Policecar ∧ Slippery
– Known (Policecar): Slippery
– Modus Tollens (Slippery): ¬Dry
– Known (¬Dry) ⇒ Brake can be inferred
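A matching backward chainer, again as a sketch under the same assumed rule format (positive facts and rules only, so the modus tollens step on Slippery is replaced by giving Snow as a fact):

```python
# Sketch (not from the slides) of backward chaining over positive rules:
# q is proven if it is a known fact, or if some rule concluding q has
# premises that can all be proven recursively.

RULES = [(["PersonInFrontOfCar"], "Brake"),
         (["YellowLight", "Policeman", "Slippery"], "Brake"),
         (["Policecar"], "Policeman"),
         (["Snow"], "Slippery"),
         (["RedLight"], "Brake"),
         (["Winter"], "Snow")]

def backward_chain(q, facts):
    if q in facts:
        return True                        # step 1: already in the KB
    for premises, conclusion in RULES:     # step 2: rules whose conclusion matches q
        if conclusion == q and all(backward_chain(p, facts) for p in premises):
            return True                    # step 3: all premises established
    return False

print(backward_chain("Brake", {"YellowLight", "Policecar", "Snow"}))  # True
```

Note that this naive recursion can loop forever on cyclic rule sets; a fuller version would track the goals currently on the stack.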
Conjunctive Normal Form
• Convert expressions into the form
– (l1,1 ∨ … ∨ l1,k) ∧ … ∧ (ln,1 ∨ … ∨ ln,k)
– Conjunction of disjunctions
– k-CNF (k literals per clause)
• Every expression can be transformed into 3-CNF
Conjunctive Normal Form

• Original R2 (from Wumpus)
– B1,1 ⇔ (P1,2 ∨ P2,1)
• Biconditional elimination
– (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
• Implication elimination
– (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)
• De Morgan
– (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)
• Distributivity of ∨ over ∧
– (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
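The same conversion can be reproduced with a library; for example, a sketch using sympy's to_cnf (exact clause ordering in the printed output may differ between sympy versions):

```python
# Sketch: reproduce the R2 -> CNF conversion using sympy.
from sympy import symbols
from sympy.logic.boolalg import Equivalent, to_cnf

B11, P12, P21 = symbols("B11 P12 P21")
r2 = Equivalent(B11, P12 | P21)   # B1,1 <=> (P1,2 v P2,1)
print(to_cnf(r2))
# e.g. (B11 | ~P12) & (B11 | ~P21) & (P12 | P21 | ~B11)
```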
Conjunctive Normal Form
• Algorithms exist for formulas in 3-CNF
– e.g., 3-SAT solvers
