Unit 8 Part 2 Probability and Uncertainty

Knowledge:

Knowledge enables an agent to act appropriately in a given situation, and it is necessary for intelligent behavior. To behave intelligently, a machine must have some way to represent knowledge, and to express knowledge we need a language. We must equip the machine with knowledge of the particular domain in which it is expected to act. The language must be understood by the machine, and there must be some method to use it. An inference mechanism reads the environment, uses the knowledge, and produces an action. Any language must have a grammar, or rules, so that it can be understood and interpreted in a consistent way; this is what allows the inference engine to process it. The language must have both syntax and semantics: "Pen cuts road" is syntactically correct but semantically incorrect. Inference is important for using knowledge and deriving new facts from existing ones.

Logic:
Logics are formal languages for representing information such that conclusions can be drawn. A logic has a syntax and a semantics:

– Syntax defines the rules or grammar required for the correct use and interpretation of the language. It specifies the symbols, operators, and variables we can use in the language.

– Semantics defines the meaning of sentences, allowing us to assign a truth value to a sentence as interpreted in a certain world.

Propositional logic:
A proposition is a simple statement or sentence that may be true or false, and propositional logic is the language that allows us to represent such sentences, operate on these facts, and deduce new facts. Like any language, it has a syntax and a semantics.

Syntax
The syntax, or rules, helps us represent statements, interpret them, build a knowledge base system, deduce new facts from existing ones, etc. It allows us to represent, or label, sentences with symbols. A sentence in propositional logic may be atomic or compound. For eg. "Ram is tall" is a proposition, and propositional logic allows us to label this statement symbolically as P.

Atom: A single proposition is an atom. For eg. "Today is Sunday." Constants like TRUE or FALSE are also taken as atoms.
Literal: A single proposition or its negation is called a literal.

Compound sentence: Two or more sentences combined through connectives are called compound sentences. Propositional logic has logical operators or connectives like V (or), ᴧ (and), ¬ (not), and → (implication). The value of a compound sentence is also true or false, based on its interpretation in a certain world.



Let P and Q denote propositions. Then P V Q is also a sentence, whose value is true if at least one of P or Q is true, and false otherwise.
P Q P V Q
T T T
T F T
F T T
F F F

P ᴧ Q is also a sentence, whose value is true when both P and Q are true, and false otherwise.
P Q P ᴧ Q
T T T
T F F
F T F
F F F

¬P is the sentence whose value is true if P is false, and false if P is true.

P ¬P
T F
F T

P → Q is a sentence stating that if P is true then Q is true. It can be written as ¬P V Q, and equivalently as its contrapositive ¬Q → ¬P.
P Q ¬P ¬P V Q P → Q
T T F T T
T F F F F
F T T T T
F F T T T

P ↔ Q is the sentence stating that P is true if and only if Q is true. It is equivalent to P XNOR Q, i.e. it is the ANDing of P → Q and Q → P.
P Q Q→P P→Q P↔Q = (P→Q) ᴧ (Q→P) P XNOR Q
T T T T T T
T F T F F F
F T F T F F
F F T T T T
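
To make the truth tables above concrete, here is a small illustrative Python sketch (the function name is mine) that recomputes the P → Q table and confirms it matches ¬P V Q:

from itertools import product

def implies(p, q):
    # P -> Q is false only when P is true and Q is false
    return (not p) or q

print("P     Q     P->Q  ~PvQ")
for p, q in product([True, False], repeat=2):
    print(f"{p!s:5} {q!s:5} {implies(p, q)!s:5} {((not p) or q)!s:5}")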

Semantics:
Interpretation of a proposition in a certain world provides the semantics, or meaning, of the statement and assigns it a value that may be true or false. A statement that is true in some world may be false in another. Interpretation is thus the mapping of a given proposition into a particular world to check its meaning or semantics. Interpretation can also be defined as the assignment of Boolean constants to Boolean variables or propositions.

Satisfiability:
If a sentence evaluates to true under at least one interpretation, then the sentence is said to be satisfiable. To check satisfiability, we assign Boolean values to the propositions and see whether the sentence comes out true at least once.

Validity:
A proposition is said to be valid when it is always true irrespective of the world in which it is interpreted. For eg. P V ¬P is always true; thus it is a valid sentence or proposition. Such a statement is also called a tautology.
P Q ¬P P V Q P V ¬P
T T F T T
T F F T T
F T T T T
F F T F T

Here, P V Q is a satisfiable proposition and P V ¬P is a valid proposition.
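
Because propositional logic has only finitely many interpretations, satisfiability and validity can be checked by brute force. A minimal Python sketch (the names are illustrative):

from itertools import product

def classify(sentence, n_atoms):
    # evaluate the sentence under every interpretation of its atoms
    values = [sentence(*vals) for vals in product([True, False], repeat=n_atoms)]
    if all(values):
        return "valid (tautology)"
    if any(values):
        return "satisfiable but not valid"
    return "unsatisfiable"

print(classify(lambda p, q: p or q, 2))      # satisfiable but not valid
print(classify(lambda p: p or not p, 1))     # valid (tautology)
print(classify(lambda p: p and not p, 1))    # unsatisfiable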


Entailment:
A set of sentences S logically entails a sentence S1 if S1 is true under every interpretation in which all the sentences of S are true. In other words, if we can deduce a fact S1 from a given set of facts and rules S, then S1 follows from S.

Inference rules allow us to deduce new facts from existing ones. Two popular inference rules are modus ponens and resolution.

Deductive reasoning:
Derives conclusions or new facts by combining given premises (facts) and rules.

Inductive reasoning:
Derives new rules from a set of premises, rules, and conclusions.

Abductive reasoning:
Derives premises from a set of rules and conclusions.

Modus ponens:
This rule says that if P → Q is given and P is true, then we can infer or conclude that Q is true. The rule is written as (P → Q, P) / Q, where the premises appear above the line and the conclusion below. Premises are the given facts or conditions.



Horn Clause:
P1 ᴧ P2 ᴧ P3 ᴧ … ᴧ Pk → Q, in which no ORing is allowed on the LHS and only an atomic sentence appears on the RHS. Not all propositions can be converted into Horn clauses. Horn clauses are used as rules for forward and backward chaining.

Forward Chaining:
A knowledge base consists of a set of facts and a set of rules. Facts are propositions in the knowledge base, and rules are in the form of implications. Facts in the knowledge base are always true. In forward chaining, we start from the facts and rules and deduce new facts. Each new fact is added to the knowledge base, and further facts are deduced from it. This process continues until we reach our goal.
Suppose we have the following facts in the knowledge base.

F1: A
F2: B
F3: D
Also suppose we have the following rules.
R1: A ᴧ B → C
R2: C ᴧ D → E
R3: C ᴧ F → G
Now deduce whether E is a fact.
From facts F1 and F2 and rule R1, C is true, and is thus added to the knowledge base.
From facts C and D and rule R2, E is a fact. Hence proved.
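
The forward-chaining loop above can be sketched in a few lines of Python; the encoding of rules as (premise-set, conclusion) pairs is my own choice for illustration:

facts = {"A", "B", "D"}
rules = [({"A", "B"}, "C"),   # R1: A ^ B -> C
         ({"C", "D"}, "E"),   # R2: C ^ D -> E
         ({"C", "F"}, "G")]   # R3: C ^ F -> G

def forward_chain(facts, rules, goal):
    facts = set(facts)
    changed = True
    while changed and goal not in facts:
        changed = False
        for body, head in rules:
            # fire a rule when all of its premises are known facts
            if body <= facts and head not in facts:
                facts.add(head)
                changed = True
    return goal in facts

print(forward_chain(facts, rules, "E"))  # True: E is deduced via C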

Backward Chaining:
In backward chaining, we start from the goal and check whether the RHS of any rule in the knowledge base contains the goal proposition. If so, we replace the goal proposition with the LHS of that rule. This is repeated until the goal reduces to known facts. This is how most logic programming languages work. Backward chaining is often more efficient, as it focuses only on the goal and does not compute any unnecessary facts. For eg.
Suppose we have the following facts in the knowledge base.
F1: A
F2: B
F3: D
Also suppose we have the following rules.
R1: A ᴧ B → C
R2: C ᴧ D → E
R3: C ᴧ F → G
Now deduce whether E is a fact using backward chaining.
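
A matching backward-chaining sketch over the same knowledge base (same illustrative encoding); note that it recurses from the goal E through R2 back to R1 and the facts, and never touches the irrelevant rule R3:

facts = {"A", "B", "D"}
rules = [({"A", "B"}, "C"),
         ({"C", "D"}, "E"),
         ({"C", "F"}, "G")]

def backward_chain(goal, seen=frozenset()):
    if goal in facts:
        return True
    if goal in seen:          # avoid looping on circular rules
        return False
    for body, head in rules:
        # try any rule whose RHS is the goal; prove each premise recursively
        if head == goal and all(backward_chain(p, seen | {goal}) for p in body):
            return True
    return False

print(backward_chain("E"))  # True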



Conjunctive Normal Form: A sentence made of compound terms containing only disjunction (OR) operators, which are ultimately combined through the AND operator, is in conjunctive normal form. For eg. (P V T) ᴧ Q ᴧ (¬N V M)
Disjunctive Normal Form: A combination, through the OR operator, of compound sentences that may contain the conjunction operator is in disjunctive normal form.
For eg. (P ᴧ S) V (Q ᴧ R).
Clause: A combination of literals through the OR operator is called a clause. For eg. P V Q, P, ¬C

To convert the compound proposition ¬(A → B) V (C → A) into clauses, the following steps must be taken.
1. Eliminate the implications: ¬(¬A V B) V (¬C V A)
2. Move negation inwards (De Morgan's laws): (A ᴧ ¬B) V (¬C V A)
3. Apply the distributive law to get conjunctive normal form: (A V ¬C V A) ᴧ (¬B V ¬C V A) = (A V ¬C) ᴧ (¬B V ¬C V A)
4. Extract the set of clauses:
(A V ¬C)
(¬B V ¬C V A)
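
If the sympy library is available, this hand conversion can be checked mechanically with its to_cnf function; a small sketch:

from sympy import symbols, Implies, Not, Or
from sympy.logic.boolalg import to_cnf

A, B, C = symbols("A B C")
expr = Or(Not(Implies(A, B)), Implies(C, A))   # ~(A -> B) v (C -> A)
print(to_cnf(expr))   # (A | ~C) & (A | ~B | ~C) -- the two clauses derived above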



Clausal form allows us to prove a fact from a given set of facts through the process of resolution. The process of proving that a fact follows from a given set of facts is called entailment.

Convert the following into CNF:

1. (P ᴧ Q) V (P ᴧ ¬Q)
2. ¬(¬P ᴧ (Q V ¬(R ᴧ S)))
3. ¬(Q ᴧ R) V S
4. B ↔ (A V C)

Represent the following in propositional logic.


1. Ravi works hard. If Ravi works hard, then he is a dull boy. If he is a dull boy, he will not get a job. Prove that Ravi won't get a job.
2. If it neither rained nor was foggy, the sailing race was held and the lifesaving demonstration went on. If the sailing race was held, the trophy was awarded. The trophy was not awarded. Prove that it rained.

Resolution:
Resolution is another method of inference. This rule says that if P V Q is true and ¬P is given (true), then Q is true. The rule is written as (P V Q, ¬P) / Q. In implication form, resolution can also be written as (¬P → Q, Q → R) / (¬P → R).
Steps in resolution:
o Convert the given propositions into clausal form.
o Negate the sentence to be proved.
o Combine the clauses into a set.
o Iteratively apply resolution to the clause set.
o If we obtain the null (empty) clause, we have reached a contradiction, which proves the given fact.
With the following KB, prove that S can be inferred with resolution but not with modus ponens.
P→Q, Q→S, ¬P→R, R→S
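
A minimal resolution-refutation sketch in Python for this KB (the clause representation is my own). It derives the empty clause from the negated goal ¬S; modus ponens cannot do this, since ¬P → R is equivalent to P V R, which has two positive literals and is not a Horn clause:

def lit(name, positive=True):
    return (name, positive)

def resolve(c1, c2):
    # yield all resolvents of two clauses (frozensets of literals)
    for name, pos in c1:
        if (name, not pos) in c2:
            yield frozenset((c1 - {(name, pos)}) | (c2 - {(name, not pos)}))

def entails(clauses, goal_literal):
    # add the negated goal, then saturate under resolution
    clauses = set(clauses) | {frozenset([(goal_literal[0], not goal_literal[1])])}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a is not b:
                    for r in resolve(a, b):
                        if not r:            # empty clause: contradiction found
                            return True
                        new.add(r)
        if new <= clauses:                   # no progress: goal not entailed
            return False
        clauses |= new

kb = [frozenset([lit("P", False), lit("Q")]),   # P -> Q
      frozenset([lit("Q", False), lit("S")]),   # Q -> S
      frozenset([lit("P"), lit("R")]),          # ~P -> R
      frozenset([lit("R", False), lit("S")])]   # R -> S
print(entails(kb, lit("S")))  # True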
Exercise:
1. Represent the following propositions using propositional logic labels and deduce some conclusions.
If the unicorn is mythical, then it is immortal, but if it is not mythical, then it is a mortal mammal. If the unicorn is either immortal or a mammal, then it is horned. The unicorn is magical if it is horned. Check whether the unicorn is mythical, a mammal, or mortal.
Let
P : unicorn is mythical.
Q : unicorn is mortal.
R : unicorn is mammal.
S : unicorn is horned.
T : unicorn is magical.
From the given propositions,

P → ¬Q, ¬P → Q ᴧ R, (¬Q V R) → S, S → T
2. Represent the following English sentences in propositional logic.
It rains in July.
This book is not costly.
If it rains today and one does not carry an umbrella, he will be drenched.
3. Represent the following in propositional logic and deduce some conclusions.
Mammals drink milk.
Man is mortal.
Man is a mammal.
Tom is a man.
Prove that Tom drinks milk and Tom is mortal.
4. Consider the knowledge base (B ↔ (A V C)) ᴧ ¬ B . Prove that ¬A can be inferred from this
KB by using resolution.
5. (PVR) ᴧ (R→Q) ᴧ (P→T) ᴧ ¬ S ᴧ (S→T) prove that Q is true through resolution.
6. (P ᴧ Q), (P → R), (Q ᴧR) → S are in knowledge base, prove that S is true using resolution
7. P → Q
LᴧM→P
BᴧL→M
AᴧP→L
AᴧB→L
A
B
Prove that Q can be inferred from above KB using forward and backward chaining.
8. Prove that smoke → smoke is valid (and hence satisfiable), while smoke → fire is satisfiable but not valid.

First Order Logic:

First order logic is a more powerful language than propositional logic. In first order logic we use predicates to represent statements, and a predicate takes one or more arguments. A predicate may be true or false. First order logic also has a syntax and a semantics. The logic assumes that the world contains objects that have properties, functions, and relations to other objects. First order logic is thus more expressive, as it can describe an object, its properties, its functions, and its relations to other objects. It can also quantify over objects.

Syntax:
We can use the following to represent statements in first order logic.

Constant: we can use constants like a, 5, lalitpur. Constants represent objects of the real world.

Variables: variables can be written as capital letters.

Predicate: we can use predicates that take arguments and return a true or false value when instantiated with variables of some domain. For eg. mother(x,y). A predicate can represent some property of an object or a relationship between objects. For eg. tall(ram): here ram is an object and tall is its property, while mother(x,y) depicts a relationship between the objects x and y. A predicate can also take a function as an argument.

Function: we can use functions that take arguments and return some value, which may be numeric, non-numeric, etc. A function represents a mapping applied to an object.

Everyone loves their mother. ∀x ∃y (mother(y,x) ᴧ loves(x,y)). Here loves is a predicate and so is mother.
∀x loves(x, mother(x)). Here mother is a function that returns the mother of x, and loves is a predicate. In FOL, functions never appear outside a predicate.

There are different types of sentences: atomic sentences, complex sentences, and quantified sentences.

Atomic sentence: a predicate applied to arguments, which may be constants, variables, or functions. For eg. mother(x,y).

Complex sentence: a combination of two or more atomic sentences through connectives like ˄, ¬, V, →, ↔. For eg. mother(x,y) V loves(x,y)

Quantified sentences contain quantifiers like ∀ and ∃. It is largely because of quantifiers that inference in FOL becomes difficult.
Not all students take history and biology.
Student(x) : x is a student
Takes(x,y) : subject x is taken by student y.
¬∀x student(x) → (takes(history,x) ˄ takes(biology,x))
Express the following in first order logic.
Cats are mammals. ∀x cat(x) → mammal(x)
Jane is a tall surveyor. tall(jane) ˄ surveyor(jane)
A nephew is a sibling's son. ∀x ∀y nephew(x,y) → ∃z (sibling(y,z) ˄ son(x,z))
Everybody loves somebody. ∀x ∃y loves(x,y)
Nobody loves Jane. ∀x ¬loves(x,jane), equivalently ¬∃x loves(x,jane)
Everybody has a father. ∀x ∃y father(y,x)
All men are mortal. ∀x man(x) → mortal(x)
Some birds cannot fly. ∃x bird(x) ˄ ¬fly(x)
At least one planet has life on it. ∃x planet(x) ˄ life(x)
If Ram is a man, then Ram is mortal. man(ram) → mortal(ram)
All dogs are faithful. ∀x dog(x) → faithful(x)
Some dogs bark. ∃x dog(x) ˄ bark(x)
All dogs have four legs. ∀x dog(x) → fourlegged(x), or ∀x dog(x) → legs(x,4)
All barking dogs are irritating. ∀x (dog(x) ˄ bark(x)) → irritating(x)
No dogs purr. ¬∃x (dog(x) ˄ purr(x)), equivalently ∀x dog(x) → ¬purr(x)



Fathers are male parents with children. ∀x father(x) → male(x) ˄ haschild(x)
Students are people who are enrolled in courses. ∀x student(x) → person(x) ˄ ∃y (course(y) ˄ enrolled(x,y))

Inference rules: they tell us what can be deduced from what.

Universal Elimination: from ∀x likes(x, ice-cream), when x is substituted with ram we get likes(ram, ice-cream), which is a proposition. Universal elimination is done using a ground term, i.e. some constant that contains no variables and will not be instantiated further.

Existential Elimination:
From ∃x likes(x, icecream) we can eliminate the existential quantifier by introducing a Skolem constant that does not appear elsewhere in the knowledge base. For eg. likes(ram, icecream), where ram appears nowhere else in the knowledge base. A Skolem constant is also a ground term, as it will not be instantiated further. This process is also called skolemization.

Existential Introduction:
We can convert likes(monalisa, ice-cream) into ∃x likes(x, ice-cream): since we know that Monalisa likes ice-cream, there exists some x who likes ice-cream.

Horn sentences:
Any atomic sentence is a Horn sentence; for eg. likes(x, flower) is a Horn sentence. An implication with a conjunction of predicates on the LHS and an atomic sentence on the RHS, containing no existential quantifier, is also a Horn sentence. For eg. ∀x ∀y (parent(x,y) ˄ male(x)) → father(x,y)

Unification:
Unification is the process of finding a substitution that makes two atomic sentences identical. Two atomic sentences like prime(x) and prime(7) become identical when x is substituted with 7; when these two sentences are unified, x is replaced with 7. Formally, UNIFY(prime(x), prime(7)) = {x/7}. Unification of prime(x,7) and prime(47,x) is impossible, since x cannot be both 47 and 7. Unification is necessary when we use modus ponens for reasoning, and it is also required in the resolution process. Unification specializes a general fact to a particular subject. The substitution should not be more restrictive than necessary: we must use the most general unifier.
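
A sketch of a unification routine (without the occurs-check, and with an illustrative term representation: variables are strings starting with "?", compound terms are tuples) that reproduces both examples above:

def unify(a, b, subst=None):
    if subst is None:
        subst = {}
    if subst is False:
        return False
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith("?"):
        return unify_var(a, b, subst)
    if isinstance(b, str) and b.startswith("?"):
        return unify_var(b, a, subst)
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):          # unify argument lists pairwise
            subst = unify(x, y, subst)
            if subst is False:
                return False
        return subst
    return False

def unify_var(var, val, subst):
    if var in subst:                    # variable already bound: follow it
        return unify(subst[var], val, subst)
    return {**subst, var: val}

print(unify(("prime", "?x"), ("prime", "7")))              # {'?x': '7'}
print(unify(("prime", "?x", "7"), ("prime", "47", "?x")))  # False: 47 /= 7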

Substitution:
Substitution replaces variables with constants. It is used during the unification of two predicates. For eg. SUBST({x/49}, perfect(x)) = perfect(49).

AND elimination:
prime(x) ᴧ composite(x) can be written as prime(x), composite(x) by eliminating the AND operator. Conversely, the AND operator can be introduced when two predicates are both true.
Resolution in FOL



Duality
All men are mortal can always be written as no man is immortal: ∀x (man(x) → mortal(x)) = ¬∃x (man(x) ˄ ¬mortal(x))
There exist birds that can fly; equivalently, it is not the case that all birds cannot fly: ∃x (bird(x) ˄ fly(x)) = ¬∀x (bird(x) → ¬fly(x))

Represent the following statements in first order logic and check whether Traitorix is a criminal using forward and backward chaining.
The law says that it is a crime to sell potion formula to hostile nations. The country Rome, an enemy of Gaul, has acquired some potion formulas, and all of its potion formulas were sold by the druid Traitorix. Traitorix is a Gaul.
Gaul(x)
Hostile(z)
Potion(y)
Criminal(x)
Sells(x,y,z)
Owns(x,y)
Gaul(x) ˄ Potion(y) ˄ Sells(x,y,z) ˄ Hostile(z) → Criminal(x)
Hostile(Rome)
∃y Potion(y) ˄ Owns(Rome,y)
Potion(y) ˄ Owns(Rome,y) → Sells(Traitorix,y,Rome)

Gaul(Traitorix) …………a

Forward chaining
∃y Potion(y) ˄ Owns(Rome,y)
Potion(p) ˄ Owns(Rome,p) (eliminating the existential quantifier with a Skolem constant p that is not in the knowledge base)

∀y Potion(y) ˄ Owns(Rome,y) → Sells(Traitorix,y,Rome)

Potion(p) ˄ Owns(Rome,p) → Sells(Traitorix,p,Rome) (unification by substituting y with p)

Gaul(x) ˄ Potion(y) ˄ Sells(x,y,z) ˄ Hostile(z) → Criminal(x)

Gaul(Traitorix) ˄ Potion(p) ˄ Sells(Traitorix,p,Rome) ˄ Hostile(Rome) → Criminal(Traitorix) ….b
(unification by substituting {x/Traitorix}, {y/p}, {z/Rome})

Hostile(Rome) (substitution {z/Rome}) ……….. c

Sells(Traitorix,p,Rome) …………….. d
Potion(p) ……………..e

From a, b, c, d, e:
Criminal(Traitorix)



Modus ponens is not a complete inference technique, as it cannot deduce the negation of a fact. If an implication in the knowledge base contains a negation, it cannot be converted into a Horn clause and modus ponens cannot be used. This means there are facts that can actually be deduced but that modus ponens cannot reach. For eg.

∀x P(x) → Q(x)
∀x Q(x) → S(x)
∀x ¬P(x) → R(x)
∀x R(x) → S(x)

From the above knowledge base we should be able to deduce that S(A) is true (whether P(A) holds or not), but using modus ponens we can't.

Steps to convert into Conjunctive normal form.


o Eliminate implication.
o Move negation inwards
o Push the quantifiers left as far as possible.
o Eliminate the existential quantifier through skolemization.
o Drop all universal quantifiers.
o Distribute ᴧ over V.
Steps to deduce new facts through resolution:
o Negate the sentence to be proved and convert it into clauses.
o Combine all the clauses into a set.
o Iteratively apply resolution to the set and add the resolvents to the set.
o Continue till the null clause is obtained.
For eg. Can we infer that someone is smiling from the following sentences?
All people who are graduating are happy.
All happy people smile.
Someone is graduating.
∀x graduating(x) → happy(x) …………1
∀x happy(x) → smiling(x) ……………..2
∃x graduating(x) …………………..3
The goal is to prove
∃x smiling(x)
Negate the goal:

¬∃x smiling(x)

Converting 1 and 2 into clausal form, we get

∀x ¬graduating(x) V happy(x) ………………. 4
∀x ¬happy(x) V smiling(x) …………………. 5
∃x graduating(x) ………………. 3
¬∃x smiling(x) …………………… 6

4, 5, 3 and 6 are the clauses we have.
Reduce the scope of negation in 6:
∀x ¬smiling(x) ……………..7
Standardize the variables apart.
Move all quantifiers to the left as far as possible. This matters only when there is more than one quantifier and some are not already at the left of the clause; here nothing changes, as all quantifiers are already leftmost.
Eliminate the existential quantifier using a Skolem constant; here ram is the Skolem constant.
graduating(ram)
Drop all the universal quantifiers.
¬graduating(x) V happy(x) ………… a
¬happy(x) V smiling(x) …………… b
graduating(ram) ………………….. c
¬smiling(x) ………………………d

Resolving b and d we get

¬happy(x)
This is resolved with a to get
¬graduating(x)
This is resolved with c to get the null clause. This proves that our supposition was wrong, i.e. ∃x smiling(x) is true.



Reasoning under uncertainty:
Suppose we have the following rule in our knowledge base.
∀p symptom(p, toothache) → disease(p, cavity)
This rule is not correct, because a toothache can have many causes: it can be caused by gum disease, by a blow, and so on. All of these causes would have to be specified comprehensively in the rule, and some causes may not even be known. Specifying all the causes is therefore a problem. In such cases, where the causes are uncertain, we can use probability. Probability is a very powerful tool for reasoning under uncertainty: we compute the probability of some event and infer something from that probability.

The reasons for using probabilistic reasoning under uncertainty are as follows:
The specification becomes too large.
Not all antecedents are known.
Some antecedents are not measurable.

Probability addresses whether a person is fat or not when we don't know the person; fuzzy logic addresses how fat the person is. In fuzzy logic, fat is not a true/false value but a gradation: being fat is a matter of degree.
Basics of probability
0 ≤ P(A) ≤ 1
P(true) = 1
P(false) = 0
The union probability of any two events A and B is P(A V B) = P(A) + P(B) - P(A ˄ B).
Mutually exclusive events cannot occur simultaneously, so for them P(A ˄ B) = 0.

Prior probability (unconditional probability):

This corresponds to the degree of belief in a proposition in the absence of any other information.
For eg. P(cavity) = 0.1

Conditional Probability:
The occurrence of some events may depend on the occurrence of others. In such cases we must calculate the conditional probability: the conditional probability of an event is its probability given that other events have occurred. Conditional probability is used for dependent events.
For eg. P(cavity|toothache) = 0.8 means the probability of a cavity given a toothache is 0.8.



Independence among events:
Events that do not affect the event whose probability is being calculated can be dropped. For eg. in P(johncalls|alarm, wakeup), the wakeup event does not affect the probability that John calls, so the wakeup term can be dropped.
For eg. if A is independent of B, then
P(A|B) = P(A)
P(B|A) = P(B)
P(A ˄ B) = P(A) * P(B)

Bayes rule:
Bayes' rule is very important for computing conditional probabilities of dependent events and for reasoning under uncertainty. Bayes' rule states that
P(A|B) = P(B|A) * P(A) / P(B)
This can be proved as follows:
P(A ˄ B) = P(A|B) * P(B) …………. 1
P(B ˄ A) = P(B|A) * P(A) ………….2
From 1 and 2,
P(A|B) * P(B) = P(B|A) * P(A)
or P(A|B) = P(B|A) * P(A) / P(B)
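
A worked numeric sketch of Bayes' rule in Python; the numbers are invented for illustration, not taken from this text:

p_cavity = 0.1                     # prior P(cavity)
p_tooth_given_cavity = 0.8         # likelihood P(toothache | cavity)
p_tooth_given_no_cavity = 0.05     # P(toothache | ~cavity)

# total probability: P(toothache)
p_tooth = (p_tooth_given_cavity * p_cavity
           + p_tooth_given_no_cavity * (1 - p_cavity))

# Bayes' rule: P(cavity | toothache) = P(toothache | cavity) P(cavity) / P(toothache)
p_cavity_given_tooth = p_tooth_given_cavity * p_cavity / p_tooth
print(round(p_cavity_given_tooth, 3))  # 0.64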

Joint probability is the method of representing the probability of all events together. The joint probability distribution encompasses the probabilities of all combinations of values of the different variables. It also allows us to compute the probability of some events given that other events are true or false; in fact, the advantage of the full joint probability distribution is that it enables us to answer every type of query. The joint probability of two independent events A and B is P(A ˄ B) = P(A) * P(B). In general, by the chain rule,
P(X1, X2, …, Xn) = P(Xn | Xn-1, …, X1) * P(Xn-1, …, X1)
= P(Xn | Xn-1, …, X1) * P(Xn-1 | Xn-2, …, X1) * P(Xn-2, …, X1)
= ∏ i=1..n P(Xi | Xi-1, …, X1)
In a belief network, conditional independence reduces this to ∏ i=1..n P(Xi | parents(Xi)).

Consider the following example, the full joint distribution over toothache, catch and cavity:

              toothache           ¬toothache
           catch    ¬catch     catch    ¬catch
cavity     0.108    0.012      0.072    0.008
¬cavity    0.016    0.064      0.144    0.576

P(cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.200
P(toothache V cavity) = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.280
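
Queries against a full joint distribution amount to summing the matching entries. A sketch over the table above (the dictionary keys are (toothache, catch, cavity) triples):

joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.016,
    (True,  False, True):  0.012, (True,  False, False): 0.064,
    (False, True,  True):  0.072, (False, True,  False): 0.144,
    (False, False, True):  0.008, (False, False, False): 0.576,
}

def prob(event):
    # sum the joint entries of every world where `event` holds
    return sum(p for world, p in joint.items() if event(*world))

print(prob(lambda t, c, cav: cav))         # P(cavity) = 0.200
print(prob(lambda t, c, cav: t or cav))    # P(toothache v cavity) = 0.280
# conditional probability by normalization: P(cavity | toothache) = 0.6
print(prob(lambda t, c, cav: t and cav) / prob(lambda t, c, cav: t))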



Belief Network/Bayesian Network
A belief network is an attempt to capture cause-effect relationships together with their probabilities. That is, it is a representation of the causal connections between events along with the associated conditional probabilities. A belief network, also known as a Bayesian network or causal network, can represent uncertain knowledge, and we can reason under uncertainty through the conditional probabilities of the events. There are two phases to reasoning with a belief network: the first is learning the network, and the second is using it. The learning of a belief network can be done from experimental data; this is a form of inductive reasoning. A belief network is a graph with the following properties:
o Nodes: a set of random variables.
o Directed links: the intuitive meaning of a link from node X to node Y is that X has a direct influence on Y.
o The graph has no directed cycles (it is a DAG).
o Each node has a conditional probability table that quantifies the effect of its parents on it.

For eg. a burglary alarm at home. The alarm can be triggered by somebody at the door, a short circuit, wind, animals, etc. The possible causes are endless, and listing all of them in a rule is not possible; a rule that does not contain all the causes cannot be a tautology and is of limited use. The alarm is fairly reliable at detecting burglary, but it also responds to minor earthquakes. Two neighbours, John and Mary, may call the police on hearing the alarm. John always calls the police when he hears the alarm; he even calls when he confuses the telephone ringing with the alarm. Mary, however, likes loud music and sometimes misses the alarm. The belief network for this scenario is given below.



P(J ˄ M ˄ A ˄ ¬B ˄ ¬E) = P(J|A) * P(M|A) * P(A|¬B,¬E) * P(¬B) * P(¬E) = 0.90 * 0.70 * 0.001 * (1 - 0.001) * (1 - 0.002) ≈ 0.00063
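
The same factored product, computed directly (a one-line sketch using the CPT values quoted above):

# P(J ^ M ^ A ^ ~B ^ ~E) as a product of the network's CPT entries
p = 0.90 * 0.70 * 0.001 * (1 - 0.001) * (1 - 0.002)
print(p)  # ~0.000628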

Construction of a Bayesian Network

1. Choose the set of relevant variables X that describe the domain.
2. Choose an ordering for the variables (a very important step).
3. While there are variables left:
a. Pick a variable X and add a node for it.
b. Set parents(X) to some minimal set of existing nodes such that the conditional independence property is satisfied.
c. Define the conditional probability table for X.
The ordering must be such that the cause-effect relationships point in the proper direction. If the ordering is correct, we can learn the conditional probability tables efficiently; an improper ordering increases the size of the probability tables. A proper ordering of variables maximizes the use of conditional independence, whereas an improper ordering produces confusing conditional probabilities that do not exist in practice. The probability tables may be known in advance or extracted from experimental data.

More on conditional independence

If nothing is known about a node's parents, the node is not conditionally independent of its children, siblings, and ancestors.
A set of nodes E d-separates two sets of nodes X and Y if every undirected path from a node in X to a node in Y is blocked given E.
A path is blocked given a set of nodes E if there exists a node Z on the path for which one of the following holds.
- Z is in E and Z has one arrow on the path leading in and one leading out.

Here the evidence node lack-of-iodine has one path arrow leading in and one leading out, so it d-separates the other nodes, which become independent of one another. Goiter then depends only on hormonal imbalance and lack of iodine; this means we can infer goiter, or compute its conditional probability, given hormonal imbalance and lack of iodine.



If every undirected path between the set of nodes X and the set of nodes Y is d-separated by a given set of evidence nodes E, then X and Y are conditionally independent given E. This allows us to restrict our computation to a subset of the belief network.

- Z is in E and Z has both path arrows leading out.

Here malnutrition is an evidence node with both path arrows leading out of it, so it too d-separates the other nodes.
- Neither Z nor its descendants are in E, and both path arrows lead into Z.

Here lack of iodine is the evidence and goiter has both path arrows leading in; thus goiter d-separates the other nodes.



Inferences in Belief Networks
After representing uncertain knowledge in a belief network, we can draw inferences by posing queries such as: what is the probability of some event given some other events? We compute that probability and infer accordingly. The following types of inference can be performed in a belief network.
Diagnostic reasoning: from effects to causes, i.e. abductive reasoning. This means computing the probability of a parent given its child. For eg. given that John calls, what is the probability of burglary?
P(B|JC) = 0.016

Causal inference: from causes to effects, i.e. deductive reasoning. This means computing the probability of a child given its parent. For eg. given a burglary, what is the probability that John calls, or that Mary calls?
P(JC|B) = 0.86
P(MC|B) = 0.67

Intercausal inference: reasoning between causes of a common effect. It computes the probability of one parent given the others. For eg. given the alarm and an earthquake, what is the probability of burglary?
P(B|A,E) = 0.003

Mixed inference: some causes and some effects are known; that is, computing the probability of an event when both a parent and a child are observed. For eg. given that John calls and there is no earthquake, what is the probability of the alarm?
P(A|JC ˄ ¬E) = 0.03
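
All four kinds of query can be answered by inference by enumeration: sum the full joint distribution (the product of CPT entries) over the hidden variables and normalize. A sketch for the burglary network; CPT entries not quoted in this text (P(A|B,E), P(J|¬A), P(M|¬A), etc.) are the customary values for this example and should be treated as assumptions, so results may differ slightly from the rounded figures above:

from itertools import product

P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,     # P(Alarm | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}                     # P(JohnCalls | Alarm)
P_M = {True: 0.70, False: 0.01}                     # P(MaryCalls | Alarm)

def joint(b, e, a, j, m):
    """P(B=b, E=e, A=a, J=j, M=m) as a product of CPT entries."""
    def f(p, v):                 # P(X=v) given P(X=True) = p
        return p if v else 1 - p
    return (f(P_B, b) * f(P_E, e) * f(P_A[(b, e)], a)
            * f(P_J[a], j) * f(P_M[a], m))

def query(target, evidence):
    """P(target=True | evidence), summing out the hidden variables."""
    names = ["B", "E", "A", "J", "M"]
    num = den = 0.0
    for world in product([True, False], repeat=5):
        w = dict(zip(names, world))
        if any(w[k] != v for k, v in evidence.items()):
            continue             # world inconsistent with the evidence
        p = joint(*world)
        den += p
        if w[target]:
            num += p
    return num / den

print(round(query("B", {"J": True}), 3))             # diagnostic: ~0.016
print(round(query("J", {"B": True}), 3))             # causal: ~0.849
print(round(query("B", {"A": True, "E": True}), 3))  # intercausal: ~0.003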

Default Reasoning:
This is a paradigm in which we make default assumptions about some facts. For eg. if we can see the two wheels of a car from one side, we infer that there are two wheels on the other side as well; we make such assumptions and add them to the knowledge base. This is non-monotonic reasoning, since the derived facts may later change. A default rule cannot always be applied: for eg. if we know that one tyre has been stolen, we can no longer say by default that there are two wheels on the other side.



Rule-based reasoning:
Rule-based reasoning starts from a set of rules and facts, from which new rules and facts are deduced. Modus ponens is thus a form of rule-based reasoning. If the precondition of a rule is matched, then the postcondition of the rule is taken to be true.
