AI Unit-2
(MR22-1CS0105)
UNIT-II
KNOWLEDGE REPRESENTATION AND REASONING:
LOGICAL SYSTEMS, KNOWLEDGE BASED SYSTEMS,
PROPOSITIONAL LOGIC, CONSTRAINTS, PREDICATE
LOGIC, FIRST ORDER LOGIC, INFERENCE IN FIRST ORDER
LOGIC.
We require the following types of knowledge:
1. Declarative Knowledge
• Declarative knowledge is the knowledge of facts, concepts, and objects; it is about knowing that something is true.
2. Procedural Knowledge
• It is also known as imperative knowledge.
• Procedural knowledge is a type of knowledge which is responsible for knowing how to do
something.
• It includes rules, strategies, procedures, agendas, etc.
• Procedural knowledge depends on the task on which it can be applied.
3. Meta-knowledge:
• Knowledge about the other types of knowledge is called Meta-knowledge.
4. Heuristic knowledge:
• Heuristic knowledge represents the knowledge of some experts in a field or subject.
5. Structural knowledge:
• Structural knowledge is the basic knowledge required for problem-solving.
• It describes relationships between various concepts such as kind
of, part of, and grouping of something.
There are mainly four ways of knowledge representation, which are given as follows:
1. Logical Representation
2. Semantic Network Representation
3. Frame Representation
4. Production Rules
Why logic?
The challenge is to design a language which allows one to represent all the
necessary knowledge.
Logic makes statements about the world which are true (or false) if the
state of affairs it represents is the case (or not the case).
Compared to natural languages (expressive but context sensitive) and
programming languages (good for concrete data structures but not
expressive) logic combines the advantages of natural languages and formal
languages.
Logical Representation
Logical representation is a language with some concrete rules which deals with propositions and
has no ambiguity in representation.
Logical representation means drawing a conclusion based on various conditions.
This representation lays down some important communication rules.
It consists of precisely defined syntax and semantics which support sound inference.
Each sentence can be translated into logic using syntax and semantics.
Syntax:
• Syntax is the set of rules which decides how we can construct legal sentences in the logic.
• It determines which symbols we can use in knowledge representation and how to write those symbols.
Semantics:
• Semantics are the rules by which we can interpret a sentence in the logic.
• Semantics also involve assigning a meaning to each sentence.
Atomic Proposition: Atomic propositions are simple propositions. Each consists of a single
proposition symbol. These are the sentences which must be either true or false.
Example:
a) 2+2 is 4; it is an atomic proposition as it is a true fact.
b) "The Sun is cold" is also an atomic proposition as it is a false fact.
Compound proposition: Compound propositions are constructed by combining simpler or
atomic propositions, using parenthesis and logical connectives.
Example:
a) "It is raining today, and street is wet."
b) "Ankit is a doctor, and his clinic is in Mumbai."
Logical Connectives:
Logical connectives are used to connect two simpler propositions or to represent a sentence
logically. We can create compound propositions with the help of logical connectives.
There are mainly five connectives, which are given as follows:
Negation: A sentence such as ¬P is called the negation of P. A literal can be either a positive literal or a negative literal.
Conjunction: A sentence which has the ∧ connective, such as P ∧ Q, is called a conjunction.
Example: Rohan is intelligent and hardworking. It can be written as,
P = Rohan is intelligent,
Q = Rohan is hardworking. → P ∧ Q
Disjunction: A sentence which has the ∨ connective, such as P ∨ Q, is called a disjunction.
Implication: A sentence such as P → Q is called an implication; implications are also known as if-then rules.
Biconditional: A sentence such as P ⇔ Q is called a biconditional.
Precedence of connectives:
Parenthesis, Negation, Conjunction(AND), Disjunction(OR), Implication,
Biconditional
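To see how these connectives behave, the following minimal Python sketch (added here for illustration, not part of the original notes) prints the truth table of all five connectives; the function names NOT, AND, OR, IMPLIES and IFF are just illustrative labels.

```python
from itertools import product

# Truth-functional definitions of the five propositional connectives.
def NOT(p): return not p
def AND(p, q): return p and q
def OR(p, q): return p or q
def IMPLIES(p, q): return (not p) or q   # P -> Q is false only when P is true and Q is false
def IFF(p, q): return p == q             # biconditional: true when both sides match

print("P      Q      ~P     P^Q    PvQ    P->Q   P<->Q")
for p, q in product([True, False], repeat=2):
    row = [p, q, NOT(p), AND(p, q), OR(p, q), IMPLIES(p, q), IFF(p, q)]
    print("  ".join(f"{str(v):<5}" for v in row))
```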
Rules of Inference
Inference:
In artificial intelligence, we need intelligent computers which can create new logic from old logic
or by evidence, so generating the conclusions from evidence and facts is termed as Inference.
Inference rules:
Inference rules are the templates for generating valid arguments. Inference rules are applied to
derive proofs in artificial intelligence, and a proof is a sequence of conclusions that leads to
the desired goal.
In inference rules, the implication among all the connectives plays an important role. Following
are some terminologies related to inference rules:
Implication: It is one of the logical connectives which can be represented as P → Q. It is a
Boolean expression.
Converse: The converse of an implication P → Q is obtained by swapping the two propositions, so the right-hand side goes to the left-hand side and vice-versa. It can be written as Q → P.
Contrapositive: The contrapositive of P → Q is obtained by negating and swapping both propositions, and it can be represented as ¬Q → ¬P. It is logically equivalent to the original implication.
Inverse: The inverse of P → Q negates both propositions. It can be represented as ¬P → ¬Q.
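A quick Python check (a sketch added here, not from the notes) confirms that an implication is logically equivalent to its contrapositive but not to its converse or inverse:

```python
from itertools import product

implies = lambda p, q: (not p) or q

rows = list(product([True, False], repeat=2))
implication    = [implies(p, q)         for p, q in rows]   # P -> Q
converse       = [implies(q, p)         for p, q in rows]   # Q -> P
inverse        = [implies(not p, not q) for p, q in rows]   # ~P -> ~Q
contrapositive = [implies(not q, not p) for p, q in rows]   # ~Q -> ~P

print(implication == contrapositive)   # True: equivalent to the implication
print(implication == converse)         # False
print(implication == inverse)          # False
```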
Types of Inference rules:
1. Modus Ponens:
The Modus Ponens rule is one of the most important rules of inference, and it states that if P and
P → Q are true, then we can infer that Q will be true. It can be represented as: P → Q, P ∴ Q
Example:
Statement-1: "If I am sleepy then I go to bed" ==> P→ Q
Statement-2: "I am sleepy" ==> P
Conclusion: "I go to bed." ==> Q.
Hence, we can say that, if P→ Q is true and P is true then Q will be true.
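As an illustration, here is a small Python sketch (the fact names sleepy and go_to_bed are only illustrative encodings of the statements above) that applies Modus Ponens repeatedly over a set of facts and if-then rules:

```python
# Modus Ponens sketch: from P and P -> Q, conclude Q.
facts = {"sleepy"}                    # P: "I am sleepy"
rules = [("sleepy", "go_to_bed")]     # P -> Q: "If I am sleepy then I go to bed"

def modus_ponens(facts, rules):
    derived = set(facts)
    changed = True
    while changed:                    # keep applying the rule until nothing new is derived
        changed = False
        for premise, conclusion in rules:
            if premise in derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(modus_ponens(facts, rules))     # {'sleepy', 'go_to_bed'}
```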
2.Modus Tollens:
The Modus Tollens rule states that if P → Q is true and ¬Q is true, then ¬P will also be true. It can
be represented as: P → Q, ¬Q ∴ ¬P
Example
Statement-1: "If I am sleepy then I go to bed" ==> P→ Q
Statement-2: "I do not go to the bed."==> ~Q
Statement-3: Which infers that "I am not sleepy" => ~P
3. Hypothetical Syllogism:
The Hypothetical Syllogism rule states that if P → Q is true and Q → R is true, then P → R is
true. It can be represented as the following notation: P → Q, Q → R ∴ P → R
Example:
Statement-1: If you have my home key then you can unlock my home. P→Q
Statement-2: If you can unlock my home then you can take my money. Q→R
Conclusion: If you have my home key then you can take my money. P→R
4. Disjunctive Syllogism:
The Disjunctive Syllogism rule states that if P ∨ Q is true and ¬P is true, then Q will be true. It can
be represented as: P ∨ Q, ¬P ∴ Q
Example:
Statement-1: Today is Sunday or Monday. ==>P∨Q
Statement-2: Today is not Sunday. ==> ¬P
Conclusion: Today is Monday. ==> Q
5. Addition:
The Addition rule is one of the common inference rules, and it states that if P is true, then P ∨ Q will
be true. It can be represented as: P ∴ P ∨ Q
6. Simplification:
The Simplification rule states that if P ∧ Q is true, then P or Q will also be true. It can be
represented as: P ∧ Q ∴ P (and similarly P ∧ Q ∴ Q)
7. Resolution:
The Resolution rule states that if P ∨ Q and ¬P ∨ R are true, then Q ∨ R will also be true. It can be
represented as: P ∨ Q, ¬P ∨ R ∴ Q ∨ R
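The resolution rule can be sketched in Python by representing a clause as a set of literals, where a string starting with "~" stands for a negative literal (this encoding is an assumption made only for the example):

```python
# One propositional resolution step: from (P v Q) and (~P v R), derive (Q v R).
def complement(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(clause1, clause2):
    """Return all resolvents of two clauses, each clause being a set of literals."""
    resolvents = []
    for lit in clause1:
        if complement(lit) in clause2:
            resolvents.append((clause1 - {lit}) | (clause2 - {complement(lit)}))
    return resolvents

print(resolve({"P", "Q"}, {"~P", "R"}))   # [{'Q', 'R'}]
```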
First-Order logic:
First-order logic is another way of knowledge representation in artificial intelligence. It is an
extension to propositional logic.
First-order logic is also known as Predicate logic or First-order predicate logic.
First-order logic is a powerful language that expresses information about objects in an easier
way and can also express the relationships between those objects.
• First-order logic (like natural language) does not only assume that the world contains facts like
propositional logic but also assumes the following things in the world:
• Objects: A, B, people, numbers, colors, wars, theories, squares, pits, wumpus, ......
• Relations: It can be a unary relation such as: red, round, is adjacent, or an n-ary relation such as: the
sister of, brother of, has color, comes between
• Function: Father of, best friend, third inning of, end of, ......
Universal Generalization:
• Universal generalization is a valid inference rule which states that if premise P(c) is true for any
arbitrary element c in the universe of discourse, then we can have a conclusion as ∀ x P(x).
Example: Let's represent P(c): "A byte contains 8 bits"; then ∀x P(x), "All bytes contain 8 bits",
will also be true.
Universal Instantiation:
• Universal instantiation, also called universal elimination or UI, is a valid
inference rule. It can be applied multiple times to add new sentences.
• The new KB is logically equivalent to the previous KB.
• As per UI, we can infer any sentence obtained by substituting a ground term for
the variable.
Example:1.
IF "Every person like ice-cream"=> ∀x P(x) so we can infer that
"John likes ice-cream" => P(c)
Example: 2.
Let's take a famous example,
"All kings who are greedy are Evil." So let our knowledge base contains this detail as in the form
of FOL:
∀x king(x) ∧ greedy (x) → Evil (x),
So from this information, we can infer any of the following statements using Universal
Instantiation:
• King(John) ∧ Greedy (John) → Evil (John),
• King(Richard) ∧ Greedy (Richard) → Evil (Richard),
• King(Father(John)) ∧ Greedy (Father(John)) → Evil (Father(John)),
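Universal Instantiation is just the substitution of a ground term for the quantified variable; the short Python sketch below (the string template is an illustrative encoding, not a real theorem prover) generates the three instantiations listed above:

```python
# Universal Instantiation sketch: substitute a ground term for the variable x in
# the sentence  ∀x King(x) ∧ Greedy(x) → Evil(x).
sentence_template = "King({x}) ∧ Greedy({x}) → Evil({x})"

def universal_instantiation(template, ground_term):
    return template.format(x=ground_term)

for term in ["John", "Richard", "Father(John)"]:
    print(universal_instantiation(sentence_template, term))
```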
Existential Instantiation:
• Existential instantiation, also called Existential Elimination, can be
applied only once to replace an existential sentence.
• The new KB is not logically equivalent to the old KB, but it will be satisfiable if
the old KB was satisfiable.
Example:
From the given sentence: ∃x Crown(x) ∧ OnHead(x, John),
So we can infer: Crown(K) ∧ OnHead( K, John), as long as K does not
appear in the knowledge base.
• The K used above is a constant symbol, which is called a Skolem constant.
• Existential instantiation is a special case of the Skolemization process.
Existential Introduction
• An existential introduction is also known as an existential generalization, which is a valid
inference rule in first-order logic.
• This rule states that if there is some element c in the universe of discourse which has a property P,
then we can infer that there exists something in the universe which has the property P.
Example:
Let's say that,
"Priyanka got good marks in English."
"Therefore, someone got good marks in English."
Well Formed Formulas
Not all strings can represent propositions of the predicate logic.
Those which produce a proposition when their symbols are interpreted must follow the rules
given below, and they are called wffs(well-formed formulas) of the first order predicate logic.
A predicate name followed by a list of variables such as P(x, y), where P is a predicate name,
and x and y are variables, is called an atomic formula.
Wffs are constructed using the following rules:
1. True and False are wffs.
2. Each propositional constant (i.e. a specific proposition) and each propositional variable (i.e. a
variable representing propositions) are wffs.
3. Each atomic formula (i.e. a specific predicate with variables) is a wff.
4. If A and B are wffs, then so are ¬A, (A ∧ B), (A ∨ B), (A → B), and (A ↔ B).
5. If x is a variable and A is a wff, then ∀x A and ∃x A are wffs.
Resolution
Resolution is a theorem proving technique that proceeds by building refutation proofs, i.e., proofs
by contradictions.
Resolution is used when various statements are given and we need to prove a conclusion
from those statements.
Resolution is a single inference rule which can efficiently operate on the conjunctive normal
form or clausal form.
Conjunctive Normal Form: A sentence represented as a conjunction of clauses is said to
be conjunctive normal form or CNF.
Example:
a) Ravi likes all kinds of food.
b) Apple and chicken are food.
c) Anything anyone eats and is not killed is food.
d) Ajay eats peanuts and is still alive.
These facts in first-order logic:
1. ∀x: food(x) → likes(Ravi, x)
2. food(Apple) ∧ food(chicken)
3. ∀x ∀y: eats(x, y) ∧ ¬killed(x) → food(y)
4. eats(Ajay, peanuts) ∧ alive(Ajay)
Added predicates:
5. ∀x: ¬killed(x) → alive(x)
6. ∀x: alive(x) → ¬killed(x)
Steps to convert FOL into CNF:
1. Eliminate implication and biconditional:
a → b : ¬a ∨ b
a ↔ b : (a → b) ∧ (b → a)
2. Move ¬ inward
¬(∀xp) = ∃x¬p
¬(∃xp) = ∀x¬p
¬(aVb) = ¬a ∧ ¬b
¬(a ∧ b) = ¬a V ¬b
¬( ¬a ) = a
3. Rename variables (standardize them apart).
4. Replace existential quantifiers by Skolem constants:
∃x Rich(x) = Rich(G1)
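For the purely propositional part of this conversion (eliminating →/↔, moving ¬ inward and distributing), the sympy library can serve as a quick check; this is a sketch assuming sympy is installed, and the quantifier steps (renaming variables, Skolemization) still have to be done by hand as described above:

```python
from sympy import symbols
from sympy.logic.boolalg import Implies, Not, Or, to_cnf

a, b, c = symbols('a b c')

# (a -> b) ∧ ¬(b ∨ c): eliminate the implication, push ¬ inward, distribute.
expr = Implies(a, b) & Not(Or(b, c))
print(to_cnf(expr))   # e.g. ~b & ~c & (b | ~a)
```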
Uncertain Knowledge and Reasoning
In real life, it is not always possible to determine the state of the environment as it might not be
clear. Due to partially observable or non-deterministic environments, agents may need to handle
uncertainty and deal with:
Uncertain data: Data that is missing, unreliable, inconsistent or noisy
Uncertain knowledge: When the available knowledge has multiple causes leading to multiple
effects or incomplete knowledge of causality in the domain
Uncertain knowledge representation: The representations which provide a restricted model of
the real system, or have limited expressiveness
Inference: In case of incomplete or default reasoning methods, the conclusions drawn might not be
completely accurate.
In such uncertain situations, the agent does not guarantee a solution but acts on its own
assumptions and probabilities and gives some degree of belief that it will reach the required
solution.
Such uncertain situations can be dealt with using
Probability theory
Truth Maintenance systems
Fuzzy logic.
Probability
Probability is the degree of likelihood that an event will occur.
It provides a certain degree of belief in case of uncertain situations.
It is defined over a set of events U and assigns a value P(e), i.e. the probability of occurrence of event e,
in the range [0, 1].
Here each sentence is labeled with a real number in the range 0 to 1, where 0 means the sentence is
false and 1 means it is true.
Conditional Probability or Posterior Probability is the probability of event A given that B has
already occurred.
For example, P(It will rain tomorrow| It is raining today) represents conditional probability of it
raining tomorrow as it is raining today.
P(A|B) + P(NOT(A)|B) = 1
Joint probability is the probability of two events happening together; for independent events, such as
rolling two dice or tossing two coins, it is the product of their individual probabilities.
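A tiny Python sketch (using illustrative numbers for two fair coins, not values from the notes) shows the joint probability of independent events and the identity P(A|B) + P(NOT(A)|B) = 1:

```python
from fractions import Fraction

# Two fair coins: independent events, so the joint probability is the product.
p_a = Fraction(1, 2)          # P(A): first coin shows heads
p_b = Fraction(1, 2)          # P(B): second coin shows heads

p_joint = p_a * p_b           # P(A and B) = 1/4 for independent events
print("P(A and B) =", p_joint)

# Conditional probability from the definition P(A|B) = P(A and B) / P(B).
p_a_given_b = p_joint / p_b
print("P(A|B) =", p_a_given_b)                                     # 1/2: B tells us nothing about A
print("P(A|B) + P(not A|B) =", p_a_given_b + (1 - p_a_given_b))    # always 1
```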
Bayes Theorem
Bayes' theorem is also known as Bayes' rule, Bayes' law, or Bayesian reasoning, which
determines the probability of an event with uncertain knowledge.
In probability theory, it relates the conditional probability and marginal probabilities of two
random events.
P(A|B) = P(B|A) * P(A) / P(B), where:
P(B|A) is called the likelihood: it is the probability of the evidence B assuming that the hypothesis A is true.
P(A) is called the prior probability: the probability of the hypothesis before considering the evidence.
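The rule is easy to compute directly; the sketch below uses illustrative numbers (assumed, not taken from the notes) for a prior P(A), a likelihood P(B|A), and P(B|¬A), and obtains P(B) by total probability:

```python
# Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)
p_a = 0.01                # prior P(A)
p_b_given_a = 0.95        # likelihood P(B|A)
p_b_given_not_a = 0.05    # P(B|~A)

# Total probability: P(B) = P(B|A)*P(A) + P(B|~A)*P(~A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

p_a_given_b = p_b_given_a * p_a / p_b    # posterior P(A|B)
print(round(p_a_given_b, 3))             # ~0.161
```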
Example Bayesian network (figure): nodes represent uncertain variables such as Rain, Dog bark, Cat hide, Alarm, P1 Calls, and P2 Calls, and directed edges between the nodes represent conditional dependencies; a directed arrow represents a conditional probability.
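As a minimal sketch of how such a network is used (the probabilities below are assumed purely for illustration), the edge Rain → Cat hide can be read as a conditional probability table, and the marginal P(Cat hide) is obtained by summing over the parent:

```python
# Tiny Bayesian-network fragment: Rain -> CatHide.
p_rain = 0.3                                   # P(Rain)
p_hide_given_rain = {True: 0.8, False: 0.2}    # conditional probability table P(CatHide | Rain)

# Marginalize over the parent: P(CatHide) = sum_r P(CatHide | Rain=r) * P(Rain=r)
p_hide = (p_hide_given_rain[True] * p_rain
          + p_hide_given_rain[False] * (1 - p_rain))
print(round(p_hide, 2))                        # 0.38
```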