19-20-21. Propositional Logic Forward Chaining and Backward Chaining
CS 5003
Propositional Logic
First-order Logic
INTRODUCTION
In order to solve the complex problems encountered in AI,
one needs both a large amount of knowledge and some
mechanisms for manipulating that knowledge to create
solutions to new problems. These two are combined in knowledge-based agents.
KNOWLEDGE-BASED AGENTS
The central component of a knowledge-based agent is its
knowledge base, or KB. A knowledge base is a set of
sentences.
We need to specify only what the agent knows and what its
goals are in order to fix its behavior.
For example, an automated taxi might have the goal of
taking a passenger from San Francisco to Marin County and
might know that the Golden Gate Bridge is the only link
between the two locations. Then we can expect it to cross
the Golden Gate Bridge because it knows that will achieve its
goal.
Notice that this analysis is independent of how the taxi is implemented.
Starting with an empty knowledge base, the agent designer can TELL
sentences one by one until the agent knows how to operate in its
environment. This is called the declarative approach to system building.
If a sentence α is true in model m, we say that m satisfies α, or sometimes that m is a model of α.
We use the notation M(α) to mean the set of all models of α.
LOGIC
A complex sentence:
Example1:
A-If Harsh was born in New Delhi, then he is Indian.
B-Harsh was born in New Delhi.
Therefore, Harsh is Indian.
Example2:
“If it rains, I will take a leave”, (P→Q)
“If it is hot outside, I will go for a shower”, (R→S)
“Either I will not take a leave or I will not go for a shower”, ¬Q∨¬S
Therefore − “Either it will not rain or it is not hot outside”, ¬P∨¬R
Example3:
“If it rains, I will take a leave”, (P→Q)
“If it is hot outside, I will go for a shower”, (R→S)
“Either it will rain or it is hot outside”, P∨R
Therefore − “I will take a leave or I will go for a shower”, Q∨S
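Example 3 can be checked mechanically by enumerating all truth assignments: the premises entail the conclusion exactly when no assignment makes every premise true and the conclusion false. Below is a minimal model-checking sketch in Python; the helper name `entails` and the lambda encoding are illustrative, not from the slides.

```python
from itertools import product

def entails(premises, conclusion, symbols):
    """Return True iff every model satisfying all premises also satisfies the conclusion."""
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(p(model) for p in premises) and not conclusion(model):
            return False  # counterexample: premises true, conclusion false
    return True

# Example 3: (P -> Q), (R -> S), (P v R) entail (Q v S)
premises = [
    lambda m: (not m["P"]) or m["Q"],   # P -> Q
    lambda m: (not m["R"]) or m["S"],   # R -> S
    lambda m: m["P"] or m["R"],         # P v R
]
conclusion = lambda m: m["Q"] or m["S"]  # Q v S

print(entails(premises, conclusion, ["P", "Q", "R", "S"]))  # True
```

Dropping the premise P∨R breaks the entailment: the all-false assignment then satisfies the remaining premises but not Q∨S.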
EXAMPLE:
Show that the hypotheses
“It is not sunny this afternoon and it is colder than yesterday”, ¬P ∧ Q,
“We will go swimming only if it is sunny”, R → P,
“If we do not go swimming, then we will take a canoe trip”, ¬R → S, and
“If we take a canoe trip, then we will be home by sunset”, S → T,
lead to the conclusion
“We will be home by sunset”.
The first step is to identify propositions and use propositional variables to represent them.
P- “It is sunny this afternoon”
Q- “It is colder than yesterday”
R- “We will go swimming”
S- “We will take a canoe trip”
T- “We will be home by sunset”
The conclusion is – T
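The same truth-table check confirms this example: no assignment of P…T makes all four hypotheses true while T is false. A minimal sketch (variable names follow the slide; the encoding of each hypothesis as a Python expression is this sketch's own):

```python
from itertools import product

# Check every truth assignment of the five propositional variables.
entailed = True
for P, Q, R, S, T in product([False, True], repeat=5):
    h1 = (not P) and Q     # ¬P ∧ Q
    h2 = (not R) or P      # R → P
    h3 = R or S            # ¬R → S
    h4 = (not S) or T      # S → T
    if h1 and h2 and h3 and h4 and not T:
        entailed = False   # counterexample found

print(entailed)  # True
```

Informally: h1 forces ¬P, so h2 forces ¬R, so h3 forces S, so h4 forces T.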
PROVING THINGS
A proof is a sequence of sentences, where each sentence is
either a premise or a sentence derived from earlier sentences
in the proof by one of the rules of inference.
The last sentence is the theorem (also called goal or query)
Entailment: KB |= Q
Q is entailed by KB (a set of premises or assumptions) if and only if
there is no logically possible world in which Q is false while all the
premises in KB are true.
Or, stated positively, Q is entailed by KB if and only if the
conclusion is true in every logically possible world in which all the
premises in KB are true.
Derivation: KB |- Q
We can derive Q from KB if there is a proof consisting of a sequence
of valid inference steps starting from the premises in KB and
resulting in Q
TWO IMPORTANT PROPERTIES FOR INFERENCE
Soundness: If KB |- Q then KB |= Q
If Q is derived from a set of sentences KB using a given set of rules
of inference, then Q is entailed by KB.
Hence, inference produces only real entailments: any sentence
that follows deductively from the premises in KB is true in every model of KB.
Completeness: If KB |= Q then KB |- Q
If Q is entailed by a set of sentences KB, then Q can be derived
from KB using the rules of inference.
Hence, inference produces all entailments, or all valid sentences
can be proved from the premises.
LOGICAL EQUIVALENCE
Two sentences α and β are logically equivalent if they are
true in the same set of models; equivalently, α and β are
equivalent if and only if each of them entails the other.
RESOLUTION
The resolution rule takes two clauses containing complementary
literals and produces a new clause with all the remaining literals
of both; the resulting clause is called the resolvent.
Example
Use resolution to show that the hypotheses
“Jasmine is skiing or it is not snowing” and
“It is snowing or Bart is playing hockey” imply that
“Jasmine is skiing or Bart is playing hockey.”
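The hypotheses above resolve on the complementary pair ¬S / S. A small sketch of a single resolution step, representing clauses as frozensets of literal strings with `~` for negation (this encoding is an assumption of the sketch, not from the slides):

```python
def negate(lit):
    """~X for X, X for ~X."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Return all resolvents of two clauses."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            # drop the complementary pair, keep everything else
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

# "Jasmine is skiing (J) or it is not snowing (~S)"
c1 = frozenset({"J", "~S"})
# "It is snowing (S) or Bart is playing hockey (B)"
c2 = frozenset({"S", "B"})

print(sorted(resolve(c1, c2)[0]))  # ['B', 'J'] — "J or B"
```

The single resolvent {J, B} is exactly “Jasmine is skiing or Bart is playing hockey.”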
RESOLUTION
Example
Modus Ponens:
(A ⇒ B), A
-----------------
B
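A single Modus Ponens step is easy to express in code. In this sketch an implication is represented as a (premise, conclusion) pair; that representation is illustrative only.

```python
def modus_ponens(implication, fact):
    """Given A => B and the fact A, return B; otherwise return None."""
    premise, conclusion = implication
    if fact == premise:
        return conclusion
    return None

print(modus_ponens(("A", "B"), "A"))  # B
print(modus_ponens(("A", "B"), "C"))  # None: rule does not fire
```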
RESOLUTION ALGORITHM
FORWARD AND BACKWARD CHAINING IN
PROPOSITIONAL LOGIC
The forward-chaining algorithm determines if a single
proposition symbol q—the query—is entailed by a knowledge
base of definite clauses.
It begins from known facts (positive literals) in the knowledge
base: whenever all the premises of an implication are known, its
conclusion is added to the set of known facts, until the query is
added or no further facts can be inferred.
This approach is data-driven, and it runs in time linear in the
size of the knowledge base.
Backward chaining, by contrast, works backward from the query and is
goal-driven; its cost can be much less than linear in
the size of the knowledge base, because the process touches only
relevant facts.
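The forward-chaining procedure can be sketched compactly in Python, in the style of the standard PL-FC-Entails algorithm: keep a count of unsatisfied premises per rule, and put each newly known symbol on an agenda. The encoding of definite clauses as (premise-set, conclusion) pairs is this sketch's own.

```python
from collections import deque

def pl_fc_entails(facts, rules, query):
    """Forward chaining over definite clauses.

    facts: set of known-true symbols; rules: list of (premises, conclusion).
    """
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:        # all premises now known
                    agenda.append(conclusion)
    return False

# A,B => C and A => B: from the single fact A we derive B, then C.
rules = [({"A", "B"}, "C"), ({"A"}, "B")]
print(pl_fc_entails({"A"}, rules, "C"))  # True
```

Each symbol enters the agenda at most once and each rule's counter is decremented at most once per premise, which is why the run time is linear in the size of the knowledge base.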
Example: Tom is sweating (B). If a person is running, he will sweat (A → B).
Backward chaining starts from the goal “Is Tom running (A)?” and looks
for rules whose conclusion matches it, working back toward known facts;
forward chaining would instead start from the known fact B and fire
every rule whose premises are satisfied.
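The goal-directed strategy can be sketched as a recursive procedure over the same definite-clause encoding used above (an illustrative representation, not from the slides): a goal holds if it is a known fact, or if some rule concludes it and all of that rule's premises recursively hold.

```python
def bc_entails(facts, rules, goal, seen=None):
    """Backward chaining: prove `goal` from facts and (premises, conclusion) rules."""
    seen = set() if seen is None else seen
    if goal in facts:
        return True
    if goal in seen:          # avoid looping on cyclic rules
        return False
    seen.add(goal)
    for premises, conclusion in rules:
        if conclusion == goal and all(
            bc_entails(facts, rules, p, seen) for p in premises
        ):
            return True
    return False

# Same KB as the forward-chaining sketch: A,B => C and A => B.
rules = [({"A", "B"}, "C"), ({"A"}, "B")]
print(bc_entails({"A"}, rules, "C"))  # True
```

Note that only rules whose conclusion matches the current goal are ever examined, which is why backward chaining touches only the relevant part of the knowledge base.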
MYCIN uses the backward chaining technique to diagnose
bacterial infections.
DENDRAL employs forward chaining to establish the structure of
chemicals.
The backward and forward chaining techniques are used by the
inference engine in propositional logic.
Propositional logic has limited expressive power: we cannot
describe statements in terms of their properties or logical
relationships.
Example:
All the girls are intelligent.
Some apples are sweet.
Such quantified statements cannot be represented in propositional
logic; expressing them requires first-order logic.