19-20-21. Propositional Logic Forward Chaining and Backward Chaining

The document provides an overview of knowledge representation and reasoning in artificial intelligence, focusing on knowledge-based agents and various types of knowledge such as declarative, procedural, and heuristic knowledge. It discusses the importance of logical reasoning, including propositional and first-order logic, and the mechanisms for knowledge acquisition and inference. The document also highlights the structure and functioning of knowledge-based agents, emphasizing the role of a knowledge base and the operations of TELL and ASK.


BASIC ARTIFICIAL INTELLIGENCE

CS 5003

Dr. Bharat Singh

Department of Computer Science and Engineering

Indian Institute of Information Technology, Ranchi


OUTLINE
Knowledge Representation and Reasoning
 Knowledge-based Agent

 Propositional Logic

 First-order Logic
INTRODUCTION
 In order to solve the complex problems encountered in AI,
one needs both a large amount of knowledge and some
mechanisms for manipulating that knowledge to create
solutions to new problems.

 A variety of ways of representing knowledge (facts) have
been exploited in AI programs.
INTRODUCTION
 We are dealing with two different kinds of entities:
 Facts: truths in some relevant world. These are the things we
want to represent.
 Representation of facts in some chosen formalism. These are the
things we will actually be able to manipulate.

 One way to think of structuring these entities is as two


levels:
 Knowledge level: at which facts (including each agent’s
behaviors and current goals) are described.
 Symbol level: at which representations of objects at the
knowledge level are defined in terms of symbols that can be
manipulated by programs.
 According to Fig. 4.1, we will focus on facts, on
representations, and on the two-way mappings that must exist
between them; we will call these links representation
mappings.
 The forward representation mapping maps
from facts to representations.
 The backward representation mapping
goes the other way, from representations
to facts.
 One representation of facts is so
common that it deserves special
mention: natural language (particularly
English) sentences.
 Dotted line across the top
represents the reasoning process
that a program is intended to
model.
 Solid line across the bottom
represents the concrete reasoning
process that a particular program
performs.
APPROACHES TO KNOWLEDGE REPRESENTATION

 Representational Adequacy: the ability to represent
all of the kinds of knowledge that are needed in that
domain.
 Inferential Adequacy: the ability to manipulate the
representational structures in such a way as to derive new
structures corresponding to new knowledge inferred from
old.
 Inferential Efficiency: the ability to incorporate into
the knowledge structure additional information that can
be used to focus the attention of the inference
mechanisms in the most promising directions.
 Acquisitional Efficiency: the ability to acquire new
information easily. The simplest case involves direct
insertion, by a person, of new knowledge into the
knowledge base.
TYPES OF KNOWLEDGE-BASED AGENTS
 Declarative Knowledge – It includes concepts, facts, and objects,
and is expressed in declarative sentences.
 Procedural Knowledge – This is responsible for knowing how to do
something and includes rules, strategies, procedures, etc. It is also
known as imperative knowledge. It can be directly applied to any task.
 Structural Knowledge – It is basic problem-solving knowledge
that describes the relationships between concepts and objects.
 Meta Knowledge – Meta Knowledge defines knowledge about other
types of Knowledge.
 Heuristic Knowledge – This represents expert knowledge in a
field or subject. Heuristic knowledge consists of rules of thumb
based on previous experience and awareness of approaches that
tend to work well but are not guaranteed.
KNOWLEDGE AND REASONING
 Representation, Reasoning and Logic.
 Knowledge-based agent
 Propositional Logic.
 First-order Logic.
 Inference of first-order logic.
LOGICAL AGENTS
 Humans, it seems, know things; and what they know helps
them do things. These are not empty statements.
 They make strong claims about how the intelligence of

humans is achieved—not by purely reflex mechanisms but by


processes of reasoning that operate on internal
representations of knowledge.
 In AI, this approach to intelligence is embodied in knowledge-

based agents.
KNOWLEDGE-BASED AGENTS
 The central component of a knowledge-based agent is its
knowledge base, or KB. A knowledge base is a set of
sentences.

 Each sentence is expressed in a language called a knowledge


representation language and represents some assertion
about the world.

 Sometimes we dignify a sentence with the name axiom,


when the sentence is taken as given without being derived
from other sentences.
KNOWLEDGE-BASED AGENTS
 There must be a way to add new sentences to the
knowledge base and a way to query what is known. The
standard names for these operations are TELL and ASK,
respectively.
 Both operations may involve inference—that is, deriving

new sentences from old.


 Inference must obey the requirement that when one

ASKs a question of the knowledge base, the answer


should follow from what has been told (or TELLed) to
the knowledge base previously.
WORKING OF KB-AGENT
 Like all our agents, it takes a percept as input
and returns an action.
 The agent maintains a knowledge base, KB,
which may initially contain some back ground
knowledge. Each time the agent program is
called, it does three things.
 First, it TELLs the knowledge base what it
perceives.
 Second, it ASKs the knowledge base what
action it should perform. In the process of
answering this query, extensive reasoning may
be done about the current state of the world,
about the outcomes of possible action
sequences, and so on.
 Third, the agent program TELLs the knowledge
base which action was chosen, and the agent
executes the action.
KNOWLEDGE-BASED AGENTS
 The details of the representation language are hidden inside
three functions that implement the interface between the sensors
and actuators on one side and the core representation and
reasoning system on the other.
 MAKE-PERCEPT-SENTENCE constructs a sentence asserting that

the agent perceived the given percept at the given time.


 MAKE-ACTION-QUERY constructs a sentence that asks what

action should be done at the current time.


 MAKE-ACTION-SENTENCE constructs a sentence asserting that

the chosen action was executed.


KNOWLEDGE-BASED AGENTS
 The details of the inference mechanisms are hidden inside
TELL and ASK.
 Because of the definitions of TELL and ASK, however, the

knowledge-based agent is not an arbitrary program for


calculating actions.
 It is amenable to a description at the knowledge level, where

we need specify only what the agent knows and what its
goals are, in order to fix its behavior.
KNOWLEDGE-BASED AGENTS
 For example, an automated taxi might have the goal of
taking a passenger from San Francisco to Marin County and
might know that the Golden Gate Bridge is the only link
between the two locations. Then we can expect it to cross
the Golden Gate Bridge because it knows that will achieve its
goal.
 Notice that this analysis is independent of how the taxi

works at the implementation level.


 It doesn’t matter whether its geographical knowledge is

implemented as linked lists or pixel maps, or whether it


reasons by manipulating strings of symbols stored in
registers or by propagating noisy signals in a network of
neurons.
KNOWLEDGE-BASED AGENTS
 A knowledge-based agent can be built simply by TELLing it what it
needs to know.

 Starting with an empty knowledge base, the agent designer can TELL
sentences one by one until the agent knows how to operate in its
environment. This is called the declarative approach to system building.

 In contrast, the procedural approach encodes desired behaviors directly


as program code.

 We now understand that a successful agent often combines both


declarative and procedural elements in its design, and that declarative
knowledge can often be compiled into more efficient procedural code.

 We can also provide a knowledge-based agent with mechanisms that


allow it to learn for itself.
Thank You!!
Dr. Bharat Singh
Email id—
[email protected]
Mobile No–
8707223885
KNOWLEDGE AND REASONING
 Knowledge Representation, Reasoning and Logic
 Propositional Logic
 Logic
 First-order logic
 Inference in first-order logic
LOGIC
 A formal system for describing states of affairs, consisting
of:
 Syntax: describes how to make sentences that are well
formed, and
 Semantics: describes the relation between the sentences and
the states of affairs. It defines the truth of each sentence with
respect to each possible world.

 A proof theory – a set of rules for deducing the


entailments of a set of sentences.
LOGIC
 In standard logics, every sentence must be either true or false in
each possible world – there is no “in between.”
 We use the term model in place of “possible world.”

 Whereas possible worlds might be thought of as (potentially)

real environments that the agent might or might not be in,


models are mathematical abstractions, each of which simply
fixes the truth or falsehood of every relevant sentence.
 If a sentence α is true in model m, we say that m satisfies α or

sometimes m is a model of α.
 We use the notation M(α) to mean the set of all models of α.
LOGIC

 Now that we have a notion of truth, we are ready to talk


about logical reasoning.
 This involves the relation of logical entailment between

sentences—the idea that a sentence follows logically


from another sentence.
 In mathematical notation, we write
α ⊨ β
to mean that the sentence α entails the sentence β.

 The formal definition of entailment is this: α ⊨ β if and
only if, in every model in which α is true, β is also true.
We can write:
α ⊨ β if and only if M(α) ⊆ M(β)
LOGIC
 The property of completeness is also desirable: an
inference algorithm is complete if it can derive any
sentence that is entailed.

 We have described a reasoning process whose conclusions


are guaranteed to be true in any world in which the
premises are true; in particular, if KB is true in the real
world, then any sentence α derived from KB by a sound
inference procedure is also true in the real world.

 The final issue to consider is grounding—the connection


between logical reasoning processes and the real
environment in which the agent exists.
INFERENCE RULES

 Logical inference is used to create new sentences that


logically follow from a given set of predicate calculus
sentences (KB).
 An inference rule is sound if every sentence X produced by
an inference rule operating on a KB logically follows from the
KB. (That is, the inference rule does not create any
contradictions)
 An inference rule is complete if it is able to produce every
expression that logically follows from (is entailed by) the KB.
(Note the analogy to complete search algorithms.)
LOGIC

if KB is true in the real world,


then any sentence α derived
from KB by a sound inference
procedure is also true in the
real world.
TYPES OF LOGICS
Language              What exists                       Belief of agent
Propositional Logic   Facts                             T/F/Unknown
First-order Logic     Facts, Objects, Relations         T/F/Unknown
Temporal Logic        Facts, Objects, Relations, Times  T/F/Unknown
Probability Logic     Facts                             Degree of belief [0…1]
Fuzzy Logic           Degree of truth                   Degree of belief [0…1]
PROPOSITIONAL LOGIC: A VERY SIMPLE LOGIC
 Propositional logic (PL) is the simplest form of logic where all the
statements are made by propositions.

 A proposition is a declarative sentence (that is, a sentence that


declares a fact) that is either true or false, but not both.

 It is a technique of knowledge representation in logical and


mathematical form.
 Propositions that cannot be expressed in terms of simpler
propositions are called atomic propositions.
PROPOSITIONAL LOGIC

 Logical constants: true, false
 Propositional symbols: P, Q, S, ... (atomic sentences)
 Wrapping parentheses: ( … )
 Sentences are combined by connectives:
 ∧ ...and [conjunction]
 ∨ ...or [disjunction]
 ⇒ ...implies [implication / conditional]
 ⇔ ...is equivalent [biconditional]
 ¬ ...not [negation]
 Literal: atomic sentence or negated atomic sentence

Examples (propositions):
• Delhi is the capital of India.
• Mr. Narendra Modi is the prime minister of India.
• 1 + 1 = 2
• 1 + 1 = 3
Not propositions (commands and questions have no truth value):
• Stand up.
• Go there.
• Why are you playing?
EXAMPLES OF PL SENTENCES

 P means “It is hot.”
 Q means “It is humid.”
 R means “It is raining.”

 (P ∧ Q) → R
“If it is hot and humid, then it is raining”

 Q → P
“If it is humid, then it is hot”

 A better way:
Hot = “It is hot”
Humid = “It is humid”
Raining = “It is raining”

Example:
a) It is Sunday.
b) The Sun rises from the West. (False proposition)
c) 3 + 3 = 7 (False proposition)
PROPOSITIONAL LOGIC (PL)
 A simple language useful for showing key ideas and
definitions
 User defines a set of propositional symbols, like P and Q.

 User defines the semantics of each propositional symbol:


P means “It is hot”
 Q means “It is humid”
 R means “It is raining”
 A sentence (well-formed formula) is defined as follows:
 A symbol is a sentence
 If S is a sentence, then ¬S is a sentence
 If S is a sentence, then (S) is a sentence
 If S and T are sentences, then (S ∧ T), (S ∨ T), (S → T), and
(S ↔ T) are sentences
 A sentence results from a finite number of applications of the
above rules
A BNF GRAMMAR OF SENTENCES IN PROPOSITIONAL LOGIC
 Given a set of atomic propositions AP:
Sentence → Atom | ComplexSentence
Atom → True | False | AP
ComplexSentence → (Sentence)
| Sentence Connective Sentence
| ¬Sentence
Connective → ∧ | ∨ | → | ↔
SOME TERMS

 The meaning or semantics of a sentence determines its


interpretation.
 Given the truth values of all symbols in a sentence, it can be

“evaluated” to determine its truth value (True or False).

 A model for a KB is a “possible world” (assignment of truth values


to propositional symbols) in which each sentence in the KB is True.
MORE TERMS

 A valid sentence or tautology is a sentence that is True


under all interpretations, no matter what the world is actually
like or how the semantics are defined. Example: “It’s raining
or it’s not raining.”
 An inconsistent sentence or contradiction is a sentence
that is False under all interpretations. The world is never like
what it describes, as in “It’s raining and it’s not raining.”

 P entails Q, written P |= Q, means that whenever P is True,


so is Q. In other words, all models of P are also models of Q.
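The model-based definition of entailment above can be checked directly by enumerating all truth assignments. Below is a minimal Python sketch; the tuple encoding of formulas and the function names (`atoms`, `holds`, `entails`) are my own, not from the slides:

```python
from itertools import product

# A tiny model checker: atoms are strings; compound formulas are
# tuples ('not', f), ('and', f, g), ('or', f, g), ('imp', f, g).

def atoms(f):
    """Collect the propositional symbols appearing in a formula."""
    if isinstance(f, str):
        return {f}
    return set().union(*(atoms(sub) for sub in f[1:]))

def holds(f, model):
    """Evaluate formula f in a model (a dict mapping atoms to True/False)."""
    if isinstance(f, str):
        return model[f]
    op = f[0]
    if op == 'not':
        return not holds(f[1], model)
    if op == 'and':
        return holds(f[1], model) and holds(f[2], model)
    if op == 'or':
        return holds(f[1], model) or holds(f[2], model)
    if op == 'imp':
        return (not holds(f[1], model)) or holds(f[2], model)
    raise ValueError(op)

def entails(kb, q):
    """KB |= q iff q is true in every model that satisfies all of KB."""
    syms = sorted(set().union(*(atoms(s) for s in kb)) | atoms(q))
    for values in product([True, False], repeat=len(syms)):
        model = dict(zip(syms, values))
        if all(holds(s, model) for s in kb) and not holds(q, model):
            return False          # a model of KB in which q is false
    return True

# The hot/humid knowledge base used later in these slides:
kb = ['Humid', ('imp', 'Humid', 'Hot'),
      ('imp', ('and', 'Hot', 'Humid'), 'Rain')]
print(entails(kb, 'Rain'))        # True
```

This is exponential in the number of symbols, which is exactly why the later slides introduce inference rules, resolution, and chaining as more targeted procedures.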
TRUTH TABLES
TRUTH TABLES II
The five logical connectives:

A complex sentence:

a) "It is raining today, and street is wet."


b) "Ankit is a doctor, and his clinic is in Mumbai."
MODELS OF COMPLEX SENTENCES
EXAMPLE-1

W – I will work hard.
V – There are vacancies.
J – I will get the job.

I will not get the job if I don’t work hard.
¬W → ¬J
If I work hard but there are no vacancies, I will not get the
job.
(W ∧ ¬V) → ¬J
If I work hard, I will get the job.
W → J
SOUND RULES OF INFERENCE
 Here are some examples of sound rules of inference
 A rule is sound if its conclusion is true whenever the premise
is true
 Each can be shown to be sound using a truth table

RULE                    PREMISE          CONCLUSION
Modus Ponens            A, A → B         B
AND Introduction        A, B             A ∧ B
AND Elimination         A ∧ B            A
Double Negation         ¬¬A              A
Unit Resolution         A ∨ B, ¬B        A
Resolution              A ∨ B, ¬B ∨ C    A ∨ C
Modus Tollens           A → B, ¬B        ¬A
Hypothetical Syllogism  A → B, B → C     A → C
Disjunctive Syllogism   A ∨ B, ¬A        B
Addition                A                A ∨ B
SOUND RULES OF INFERENCE
RULE                  PREMISE                 CONCLUSION
Constructive Dilemma  (P→Q) ∧ (R→S), P∨R      Q∨S
Destructive Dilemma   (P→Q) ∧ (R→S), ¬Q∨¬S    ¬P∨¬R
EXAMPLE

Example1:
A-If Harsh was born in New Delhi, then he is Indian.
B-Harsh was born in New Delhi.
Therefore, Harsh is Indian.

Example2:
“If it rains, I will take a leave”, (P→Q)
“If it is hot outside, I will go for a shower”, (R→S)
“Either I will not take a leave or I will not go for a shower”, ¬Q∨¬S

Therefore − "Either it does not rain or it is not hot outside"

Example3:
“If it rains, I will take a leave”, (P→Q)
“If it is hot outside, I will go for a shower”, (R→S)
“Either it will rain or it is hot outside”, P∨R
Therefore − "I will take a leave or I will go for a shower"
EXAMPLE:
Show that the hypotheses
“It is not sunny this afternoon and it is colder than yesterday”,   ¬P ∧ Q,
“We will go swimming only if it is sunny”,                          R → P,
“If we do not go swimming, then we will take a canoe trip”, and     ¬R → S,
“If we take a canoe trip, then we will be home by sunset”           S → T,
lead to the conclusion
“We will be home by sunset”.
The first step is to identify propositions and use propositional variables to represent them.
P – “It is sunny this afternoon”
Q – “It is colder than yesterday”
R – “We will go swimming”
S – “We will take a canoe trip”
T – “We will be home by sunset”

The conclusion is: T
PROVING THINGS
 A proof is a sequence of sentences, where each sentence is
either a premise or a sentence derived from earlier sentences
in the proof by one of the rules of inference.
 The last sentence is the theorem (also called goal or query)

that we want to prove.


 Example:
1  Humid                 Premise                 “It is humid”
2  Humid → Hot           Premise                 “If it is humid, it is hot”
3  Hot                   Modus Ponens (1,2)      “It is hot”
4  (Hot ∧ Humid) → Rain  Premise                 “If it’s hot & humid, it’s raining”
5  Hot ∧ Humid           And Introduction (1,3)  “It is hot and humid”
6  Rain                  Modus Ponens (4,5)      “It is raining”


ENTAILMENT AND DERIVATION

 Entailment: KB |= Q
Q is entailed by KB (a set of premises or assumptions) if and only if
there is no logically possible world in which Q is false while all the
premises in KB are true.
 Or, stated positively, Q is entailed by KB if and only if the
conclusion is true in every logically possible world in which all the
premises in KB are true.
 Derivation: KB |- Q
 We can derive Q from KB if there is a proof consisting of a sequence
of valid inference steps starting from the premises in KB and
resulting in Q
TWO IMPORTANT PROPERTIES FOR INFERENCE

Soundness: If KB |- Q then KB |= Q
 If Q is derived from a set of sentences KB using a given set of
rules of inference, then Q is entailed by KB.
 Hence, inference produces only real entailments, or any sentence
that follows deductively from the premises is valid.
Completeness: If KB |= Q then KB |- Q
 If Q is entailed by a set of sentences KB, then Q can be derived
from KB using the rules of inference.
 Hence, inference produces all entailments, or all valid sentences
can be proved from the premises.
LOGICAL EQUIVALENCE
 Two sentences α and β are logically equivalent if they are
true in the same set of models. Or:
 any two sentences α and β are equivalent if and only if each
of them entails the other: α ≡ β iff α ⊨ β and β ⊨ α.

 ¬(P ∧ Q) ≡ (¬P ∨ ¬Q)
 P → Q ≡ ¬P ∨ Q
       ≡ ¬(P ∧ ¬Q)
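These equivalences can be confirmed mechanically by comparing truth tables: two sentences are equivalent iff they agree in every model. A brief sketch (the function name `equivalent` is illustrative):

```python
from itertools import product

# Two sentences are logically equivalent iff they have the same
# truth value under every assignment to their symbols.
def equivalent(f, g, symbols):
    return all(f(*vals) == g(*vals)
               for vals in product([True, False], repeat=len(symbols)))

# De Morgan: not(P and Q)  ==  (not P) or (not Q)
assert equivalent(lambda p, q: not (p and q),
                  lambda p, q: (not p) or (not q), ['P', 'Q'])

# Implication elimination: P -> Q  ==  not P or Q  ==  not(P and not Q)
assert equivalent(lambda p, q: (not p) or q,
                  lambda p, q: not (p and not q), ['P', 'Q'])

print("equivalences verified")
```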
LOGICAL EQUIVALENCE
 Validity: a sentence is valid if it is true in all models.
 Satisfiability: A sentence is satisfiable if it is true in, or satisfied by, some
model.

 Validity and satisfiability are of course connected:


 α is valid iff ¬α is unsatisfiable; contrapositively, α is satisfiable iff ¬α is
not valid.

 Proof by refutation or proof by contradiction:
one assumes a sentence β to be false and shows that this leads to
a contradiction with known axioms α. This contradiction is exactly
what is meant by saying that the sentence (α ∧ ¬β) is
unsatisfiable.
CNF (CONJUNCTIVE NORMAL FORM)
A literal is either a propositional variable, or the negation of
one.
 Examples: p, ¬p.

A clause is a disjunction of literals.


 Example: p ∨ ¬q ∨ r.

A formula in conjunctive normal form (CNF) is a conjunction


of clauses.
 Example: (p ∨ ¬q ∨ r) ∧ (¬p ∨ ¬r)
CNF (CONJUNCTIVE NORMAL FORM)
 A sentence expressed as a conjunction of clauses is said to
be in conjunctive normal form or CNF

“every sentence of propositional logic is logically equivalent to


a conjunction of clauses. “

 We now describe a procedure for converting to CNF


1. Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α).
2. Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β:
3. CNF requires ¬ to appear only in literals, so we “move ¬ inwards”
by repeated application of the following equivalences
¬(¬α) ≡ α (double-negation elimination)
¬(α ∧ β) ≡ (¬α ∨ ¬β) (De Morgan)
¬(α ∨ β) ≡ (¬α ∧ ¬β) (De Morgan)
4. Now we have a sentence containing nested ∧ and ∨ operators
applied to literals; finally, distribute ∨ over ∧ wherever possible.
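The four conversion steps can be sketched as a small recursive rewriter. This is an illustrative Python encoding (the tuple representation and function names are my own), not an optimized converter:

```python
# A small CNF converter following the four steps above.
# Atoms are strings; compounds are ('not', f), ('and', f, g),
# ('or', f, g), ('imp', f, g), ('iff', f, g).

def eliminate(f):
    """Steps 1-2: rewrite <-> and -> in terms of and/or/not."""
    if isinstance(f, str):
        return f
    op = f[0]
    if op == 'iff':
        a, b = eliminate(f[1]), eliminate(f[2])
        return ('and', ('or', ('not', a), b), ('or', ('not', b), a))
    if op == 'imp':
        return ('or', ('not', eliminate(f[1])), eliminate(f[2]))
    return (op,) + tuple(eliminate(sub) for sub in f[1:])

def push_not(f):
    """Step 3: move negations inward (De Morgan, double negation)."""
    if isinstance(f, str):
        return f
    if f[0] == 'not':
        g = f[1]
        if isinstance(g, str):
            return f
        if g[0] == 'not':
            return push_not(g[1])
        if g[0] == 'and':
            return ('or', push_not(('not', g[1])), push_not(('not', g[2])))
        if g[0] == 'or':
            return ('and', push_not(('not', g[1])), push_not(('not', g[2])))
    return (f[0],) + tuple(push_not(sub) for sub in f[1:])

def distribute(f):
    """Step 4: distribute 'or' over 'and', yielding a conjunction
    of clauses."""
    if isinstance(f, str) or f[0] == 'not':
        return f
    a, b = distribute(f[1]), distribute(f[2])
    if f[0] == 'or':
        if not isinstance(a, str) and a[0] == 'and':
            return ('and', distribute(('or', a[1], b)),
                           distribute(('or', a[2], b)))
        if not isinstance(b, str) and b[0] == 'and':
            return ('and', distribute(('or', a, b[1])),
                           distribute(('or', a, b[2])))
    return (f[0], a, b)

def to_cnf(f):
    return distribute(push_not(eliminate(f)))

# (P <-> Q) becomes (~P v Q) ^ (~Q v P):
print(to_cnf(('iff', 'P', 'Q')))
```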
DISJUNCTIVE NORMAL FORM (DNF)
 Disjunctive normal form (DNF) is the normalization of a
logical formula.
 A logical formula is said to be in disjunctive normal form if it
is a disjunction of conjunctions of literals.


 One defines formulae in disjunctive normal form (DNF) by

swapping the words ‘conjunction’ and ‘disjunction’ in the


definitions above.

 Example: (¬p ∧ q ∧ r) ∨ (¬q ∧ ¬r) ∨ (p ∧ r).


HORN CLAUSE
 A Horn clause is a clause (a disjunction of literals) with at
most one positive literal.
 That is, a clause with at most one positive (unnegated) literal
is called a Horn Clause.


 Example:- ( A ∨ ¬B) ∧ (¬A ∨ ¬C ∨ D)

Types of Horn Clause


 Definite clause / Strict Horn clause –
It has exactly one positive literal.
 Unit clause –

Definite clause with no negative literals.


 Goal clause –

Horn clause without a positive literal.


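The Horn taxonomy above reduces to counting positive literals, which can be expressed as a short classifier. A sketch, assuming a clause is represented as a set of literals with negation tagged explicitly (this encoding is mine, not from the slides):

```python
# Classify a clause by the Horn taxonomy. A clause is a set of
# literals; a positive literal is a plain string, a negative one
# is a ('not', atom) pair.

def positives(clause):
    return [lit for lit in clause if isinstance(lit, str)]

def classify(clause):
    pos = len(positives(clause))
    neg = len(clause) - pos
    if pos > 1:
        return "not Horn"                  # more than one positive literal
    if pos == 1 and neg == 0:
        return "unit clause"               # definite, no negative literals
    if pos == 1:
        return "definite clause"           # exactly one positive literal
    return "goal clause"                   # no positive literal

print(classify({'A', ('not', 'B')}))           # definite clause
print(classify({'A'}))                         # unit clause
print(classify({('not', 'A'), ('not', 'C')}))  # goal clause
print(classify({'A', 'B'}))                    # not Horn
```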
RESOLUTION
 Computer programs have been developed to automate the
task of reasoning and proving theorems. Many of these
programs make use of a rule of inference known as
resolution. This rule of inference is based on the tautology
 ((p ∨ q) ∧ (¬p ∨ r)) → (q ∨ r).
 The disjunction q ∨ r produced by this rule is called the
resolvent.
 Example
 Use resolution to show that the hypotheses
 “Jasmine is skiing or it is not snowing” and
 “It is snowing or Bart is playing hockey” imply that
 “Jasmine is skiing or Bart is playing hockey.”
RESOLUTION
Example

 Show that the premises


(p ∧ q) ∨ r and r → s imply the conclusion p ∨ s.
RESOLUTION
 Simple, efficient, sound, complete... dominates AI reasoning and
automated theorem proving.
 Produces proofs by refutation: to prove S from KB,
add ¬S to KB and show that the resulting set of clauses is
inconsistent.
 CNF ("Clause Normal Form" or "Conjunctive Normal Form") means
simplicity: no issues of how to represent knowledge.
 Every FOL sentence is convertible to CNF.
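The refutation procedure can be sketched as a naive saturation loop over a clause set. This illustrative Python version (the names and the `'~P'` string encoding of literals are my own) handles the Jasmine/Bart example from the earlier slide:

```python
from itertools import combinations

# Proof by refutation with resolution: to show KB |= α, add the
# clauses of ¬α to the KB and derive the empty clause.
# Clauses are frozensets of literals; a literal is 'P' or '~P'.

def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolvents(c1, c2):
    """All clauses obtainable by resolving c1 with c2."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def unsatisfiable(clauses):
    """Saturate the clause set; True iff the empty clause is derived."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolvents(c1, c2):
                if not r:
                    return True        # empty clause: contradiction found
                new.add(r)
        if new <= clauses:
            return False               # saturated without a contradiction
        clauses |= new

# Jasmine/Bart example: from (Ski v ~Snow) and (Snow v Hockey),
# show Ski v Hockey by refuting its negation {~Ski}, {~Hockey}.
kb = [frozenset({'Ski', '~Snow'}), frozenset({'Snow', 'Hockey'})]
negated_goal = [frozenset({'~Ski'}), frozenset({'~Hockey'})]
print(unsatisfiable(kb + negated_goal))   # True: the goal is entailed
```

Naive saturation generates many redundant clauses; real provers add subsumption and tautology deletion, but the refutation idea is the same.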
RESOLUTION
 Resolution is closely related to the inference rule of
transitivity of implication (hypothetical syllogism).
 Thus "unit" resolution produces a new clause with one fewer
term than its longer parent. As we have seen, it is closely
related to modus ponens.
 Modus Ponens:
(A ⇒ B), A
-----------------
B
RESOLUTION
RESOLUTION
RESOLUTION ALGORITHM
FORWARD AND BACKWARD CHAINING IN
PROPOSITIONAL LOGIC
 The forward-chaining algorithm determines if a single
proposition symbol q—the query—is entailed by a knowledge
base of definite clauses.
 It begins from known facts (positive literals) in the knowledge

base. If all the premises of an implication are known, then its


conclusion is added to the set of known facts.
 FACTS → Solution (Bottom-Up strategy)

 In this type of chaining, the inference engine starts by

evaluating existing facts, derivations, and conditions before


deducing new information.
 Forward chaining is an example of the general concept of

data-driven reasoning—that is, reasoning in which the


focus of attention starts with the known data.
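The premise-counting idea behind the forward-chaining algorithm can be sketched in a few lines of Python. This is an illustrative version (the rule encoding as `(premises, conclusion)` pairs and the name `forward_chain` are mine), assuming a KB of definite clauses:

```python
from collections import deque

# Forward chaining for definite clauses: each rule is a pair
# (set_of_premise_symbols, conclusion_symbol); facts are known atoms.

def forward_chain(rules, facts, query):
    count = {id(r): len(r[0]) for r in rules}   # unsatisfied premises per rule
    inferred = set(facts)
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        for r in rules:
            prem, concl = r
            if p in prem:
                count[id(r)] -= 1
                # All premises known: add the conclusion to the facts.
                if count[id(r)] == 0 and concl not in inferred:
                    inferred.add(concl)
                    agenda.append(concl)
    return query in inferred

# "Tom is running" plus "running -> sweating":
rules = [({'Running'}, 'Sweating')]
print(forward_chain(rules, {'Running'}, 'Sweating'))   # True
```

Because every symbol enters the agenda at most once and each rule's counter only decreases, this runs in time linear in the size of the knowledge base.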
PROPERTIES OF FORWARD CHAINING
 The process uses a bottom-up approach.
 It starts from an initial state and uses facts to make a

conclusion.
 This approach is data-driven.

 It’s employed in expert systems and production rule systems.

 Tom is running (A)


 If a person is running, he will sweat (A->B)

 Therefore, Tom is sweating. (B)


FORWARD AND BACKWARD CHAINING IN
PROPOSITIONAL LOGIC
 The backward-chaining algorithm, as its name suggests, works
backward from the query. If the query q is known to be true, then
no work is needed.
 Otherwise, the algorithm finds those implications in the knowledge

base whose conclusion is q.


 If all the premises of one of those implications can be proved true

(by backward chaining), then q is true.


 Solution → Facts (Top-Down Strategy)

 Backward chaining is a form of goal-directed reasoning.

 It is useful for answering specific questions such as “What shall I

do now?” and “Where are my keys?”


 Often, the cost of backward chaining is much less than linear in
the size of the knowledge base, because the process touches only
relevant facts.
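Backward chaining over the same definite-clause form is naturally recursive: a goal is proved if it is a known fact, or if some rule concludes it and all of that rule's premises can be proved in turn. A minimal sketch (the encoding is mine, with a visited set to avoid looping on cyclic rules):

```python
# Goal-directed backward chaining: rules are (premises, conclusion)
# pairs over atoms; facts are the known atoms.

def backward_chain(rules, facts, goal, seen=None):
    seen = set() if seen is None else seen
    if goal in facts:
        return True
    if goal in seen:                  # avoid infinite regress on cycles
        return False
    seen = seen | {goal}
    # Try every rule whose conclusion is the goal; all of its
    # premises must in turn be provable.
    return any(all(backward_chain(rules, facts, p, seen) for p in prem)
               for prem, concl in rules if concl == goal)

# Query "Tom is sweating" against running -> sweating plus the
# fact that Tom is running:
rules = [({'Running'}, 'Sweating')]
print(backward_chain(rules, {'Running'}, 'Sweating'))   # True
```

Only rules whose conclusions match the current goal are ever examined, which is why backward chaining typically touches a small, relevant fraction of the knowledge base.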
FORWARD AND BACKWARD CHAINING IN
PROPOSITIONAL LOGIC
 Tom is sweating (B).
 If a person is running, he will sweat (A->B).

 Tom is running (A).

 The patient is vomiting.


 He/she is also experiencing diarrhea and severe stomach upset.

 Therefore, the patient has typhoid (salmonella bacterial

infection).
FORWARD AND BACKWARD CHAINING IN
PROPOSITIONAL LOGIC
 MYCIN uses the backward chaining technique to diagnose
bacterial infections.
 DENDRAL employs forward chaining to establish the structure of

chemicals.
 The backward and forward chaining techniques are used by the

inference engine as strategies for proposing solutions or deducing


information in the expert system.
FORWARD AND BACKWARD CHAINING IN PROPOSITIONAL
LOGIC
PROPOSITIONAL LOGIC IS A WEAK LANGUAGE
 Propositional logic quickly becomes impractical, even for very
small worlds
 Hard to identify “individuals” (e.g., Mary, 3)

 Can’t directly talk about properties of individuals or relations

between individuals (e.g., “Bill is tall”)


 We cannot represent quantified statements like all, some, or
none with propositional logic.
 Example:
 All
the girls are intelligent.
 Some apples are sweet.
 Propositional logic has limited expressive power.
 In propositional logic, we cannot describe statements in terms of

their properties or logical relationships.


PROPOSITIONAL LOGIC IS A WEAK LANGUAGE

 Generalizations, patterns, regularities can’t easily be


represented (e.g., “all triangles have 3 sides”)
 First-Order Logic (abbreviated FOL or FOPC) is expressive

enough to concisely represent this kind of information


FOL adds relations, variables, and quantifiers, e.g.,
“Every elephant is gray”: ∀x (elephant(x) → gray(x))

“There is a white alligator”: ∃x (alligator(x) ∧ white(x))


Thank You!!
Dr. Bharat Singh
Email id—
[email protected]
Mobile No–
8707223885
