AI KB Systems
Announcements
• Review sessions
• CS 4701 – focus on AI
Schedule
• Search
• Machine learning
• Knowledge based systems
• Discovery
History of AI
1943 – 1969 The Beginnings
1943 McCulloch and Pitts show that networks of neurons can compute any computable function and suggest that such networks could learn
1950 Shannon and Turing wrote chess programs
1951 Minsky and Edmonds build the first neural network computer (SNARC)
1956 Dartmouth Conference – Newell and Simon brought a reasoning
program “The Logic Theorist” which proved theorems.
1952 Samuel’s checkers player
1958 McCarthy designed LISP, helped invent time-sharing, and created the Advice Taker (a domain-independent reasoning system)
1960s Microworlds – solving limited problems: SAINT (1963), ANALOGY (1968), STUDENT (1967); blocksworld invented.
1962 Perceptron Convergence Theorem is proved.
1952 Samuel’s checkers player on TV
Arthur Samuel (1901-1990)
Example ANALOGY Problem
Blocksworld
History of AI
1966 – 1974 Recognizing Lack of Knowledge
Models
• Model (i.e. possible world):
– Assignment of truth values to symbols
– Example: m = {P=True, Q=False}
• Note: Such an assignment is often called an “assignment” rather than a “model”; the term “model” is then reserved for an assignment under which the sentence evaluates to true.
• Validity:
– A sentence is valid if it is true in every model.
• Satisfiability:
– A sentence is satisfiable if it is true in at least one model.
• Entailment:
– α ⊨ β if and only if, in every model in which α is true, β is also true.
Stay at home
• Example sentence: Sick → StayAtHome
• Models in which the sentence is true:
  Sick    StayAtHome
  true    true
  false   false
  false   true
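A minimal Python sketch of checking entailment by enumerating all models, in the sense defined above (this is my own illustration, not from the lecture; names such as entails are hypothetical), applied to the Sick → StayAtHome example:

```python
from itertools import product

def entails(symbols, premise, conclusion):
    """Return True iff premise |= conclusion over the given proposition symbols."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        # Entailment: every model that makes the premise true
        # must also make the conclusion true.
        if premise(model) and not conclusion(model):
            return False
    return True

# The sentence from the table above: Sick -> StayAtHome
sick_implies_stay = lambda m: (not m["Sick"]) or m["StayAtHome"]

# Sick -> StayAtHome alone does not entail StayAtHome:
# the model {Sick=False, StayAtHome=False} is a counterexample.
print(entails(["Sick", "StayAtHome"], sick_implies_stay,
              lambda m: m["StayAtHome"]))          # False

# Adding the fact Sick rules out that counterexample, so StayAtHome is entailed.
print(entails(["Sick", "StayAtHome"],
              lambda m: sick_implies_stay(m) and m["Sick"],
              lambda m: m["StayAtHome"]))          # True
```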
Models
Backward Chaining:
Given a fact q to be “proven”:
1. See if q is already in the KB. If so, return TRUE.
2. Find the set I of all implications whose conclusion “matches” q.
3. For each implication i in I, recursively try to establish its premises via backward chaining; succeed if all premises of some i can be established.
Avoids inferring unrelated facts.
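The steps above can be written as a short recursive procedure. The sketch below is my own illustration (not from the lecture); it assumes the KB is split into a set of atomic facts and a list of (premises, conclusion) rules, and it does not guard against cyclic rules:

```python
def backward_chain(facts, rules, q):
    """Try to prove query q by backward chaining.

    facts: set of known atomic facts, e.g. {"A", "B"}
    rules: list of (premises, conclusion) pairs, e.g. (["A", "B"], "C")
    """
    # Step 1: q is already known.
    if q in facts:
        return True
    # Step 2: find all implications whose conclusion matches q.
    for premises, conclusion in rules:
        if conclusion == q:
            # Step 3: recursively establish every premise of this rule.
            if all(backward_chain(facts, rules, p) for p in premises):
                return True
    # No rule allows q to be derived.
    return False
```

A fuller implementation would also track goals already on the recursion stack to avoid looping on cyclic rules.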
Example: Backward Chaining
Knowledge base describing when the car should brake:
(PersonInFrontOfCar → Brake)
(((YellowLight ∧ Policeman) ∧ (¬Slippery)) → Brake)
(Policecar → Policeman)
(Snow → Slippery)
(Slippery → ¬Dry)
(RedLight → Brake)
(Winter → Snow)
Observations from sensors:
YellowLight ∧ ¬RedLight ∧ ¬Snow ∧ Dry ∧ Policecar ∧ ¬PersonInFrontOfCar
Should the agent brake (i.e. can “brake” be inferred)?
• Goal: Brake
– Modus Ponens (PersonInFrontOfCar → Brake): new goal PersonInFrontOfCar
– Failure: ¬PersonInFrontOfCar is observed → backtracking
• Goal: Brake
– Modus Ponens (((YellowLight ∧ Policeman) ∧ ¬Slippery) → Brake): new goals YellowLight, Policeman, ¬Slippery
– Known (YellowLight): remaining goals Policeman, ¬Slippery
– Modus Ponens (Policecar → Policeman): new goals Policecar, ¬Slippery
– Known (Policecar): remaining goal ¬Slippery
– Modus Tollens (Slippery → ¬Dry): new goal Dry
– Known (Dry): all goals established, so Brake can be inferred.
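For illustration only, the braking example can be run through the backward_chain sketch above if negated literals are encoded as separate atoms (e.g. NotSlippery) and the Modus Tollens step on Slippery → ¬Dry is supplied explicitly as its contrapositive Dry → NotSlippery; this encoding is my assumption, not part of the lecture:

```python
rules = [
    (["PersonInFrontOfCar"], "Brake"),
    (["YellowLight", "Policeman", "NotSlippery"], "Brake"),
    (["Policecar"], "Policeman"),
    (["Snow"], "Slippery"),
    (["Dry"], "NotSlippery"),   # contrapositive of Slippery -> NotDry
    (["RedLight"], "Brake"),
    (["Winter"], "Snow"),
]
observations = {"YellowLight", "NotRedLight", "NotSnow", "Dry",
                "Policecar", "NotPersonInFrontOfCar"}

# The first Brake rule fails (PersonInFrontOfCar cannot be established);
# the second succeeds via Policecar -> Policeman and Dry -> NotSlippery.
print(backward_chain(observations, rules, "Brake"))  # True
```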
Conjunctive Normal Form
• Convert expressions into the form
– (l1,1 ∨ … ∨ l1,k) ∧ … ∧ (ln,1 ∨ … ∨ ln,k)
– Conjunction of disjunctions (each disjunction is a clause)
– k-CNF: at most k literals per clause
• Every expression can be transformed into 3-CNF (preserving satisfiability).
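As a worked illustration (my own conversion, not from the slides), the second braking rule from the earlier example becomes a single clause:
((YellowLight ∧ Policeman) ∧ ¬Slippery) → Brake
≡ ¬((YellowLight ∧ Policeman) ∧ ¬Slippery) ∨ Brake      (eliminate →)
≡ ¬YellowLight ∨ ¬Policeman ∨ Slippery ∨ Brake          (De Morgan, double negation)
i.e. a conjunction of (one) disjunction of literals, so it is already in CNF.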
Conjunctive Normal Form