12.2 Resolution Theorem Proving
the set of axioms. Resolve these clauses together, producing new clauses that logically follow from them (12.2.3). Produce a contradiction by generating the empty clause. The substitutions used to produce the empty clause are those under which the opposite of the negated goal is true (12.2.5).
Fido is a dog.
All dogs are animals.
All animals will die.
Changing premises to predicates:
∀X (dog(X) → animal(X))
dog(fido)
Modus ponens with {fido/X}: animal(fido)
∀Y (animal(Y) → die(Y))
Modus ponens with {fido/Y}: die(fido)
4. ¬die(fido) (the negated goal, in clause form)
Resolution refutation:
¬dog(X) ∨ animal(X) and ¬animal(Y) ∨ die(Y) resolve under {Y/X} to ¬dog(Y) ∨ die(Y)
¬dog(Y) ∨ die(Y) and dog(fido) resolve under {fido/Y} to die(fido)
die(fido) and ¬die(fido) resolve to the empty clause □
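The substitutions {Y/X} and {fido/Y} in the refutation above are produced by unification. Below is a minimal Python sketch of syntactic unification; the names `unify` and `walk`, and the convention that capitalized strings are variables, are illustrative choices (the occurs check is omitted for brevity):

```python
def is_var(t):
    # Convention for this sketch: a capitalized string is a variable.
    return isinstance(t, str) and t[0].isupper()

def walk(t, theta):
    # Follow variable bindings in the substitution theta.
    while is_var(t) and t in theta:
        t = theta[t]
    return t

def unify(x, y, theta=None):
    """Return a most general unifier of x and y, or None if they clash.
    Compound terms are tuples such as ('dog', 'X'); constants are
    lowercase strings.  No occurs check (a common textbook shortcut)."""
    if theta is None:
        theta = {}
    x, y = walk(x, theta), walk(y, theta)
    if x == y:
        return theta
    if is_var(x):
        return {**theta, x: y}
    if is_var(y):
        return {**theta, y: x}
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
            if theta is None:
                return None
        return theta
    return None

# The two substitutions used in the refutation above:
print(unify(('animal', 'X'), ('animal', 'Y')))   # {'X': 'Y'}
print(unify(('dog', 'X'), ('dog', 'fido')))      # {'X': 'fido'}
```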
KU NLP
skolemization.
of disjuncts form
(a ∧ b) ∨ (c ∧ d)
= (a ∨ (c ∧ d)) ∧ (b ∨ (c ∧ d))
= (a ∨ c) ∧ (a ∨ d) ∧ (b ∨ c) ∧ (b ∨ d)
step 8: Call each conjunct a separate clause.
step 9: Standardize the variables apart again.
Variables are renamed so that no variable symbol appears in more than one clause.
(∀X)(a(X) ∧ b(X)) = (∀X)a(X) ∧ (∀Y)b(Y)
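The distribution of ∨ over ∧ in step 7 can be sketched as a recursive rewrite. The tuple encoding and the function name `to_cnf` below are my own illustrative choices, assuming binary `and`/`or` nodes with string atoms:

```python
def to_cnf(expr):
    """Distribute 'or' over 'and', e.g. turn ('or', ('and', a, b), c)
    into ('and', ('or', a, c), ('or', b, c)).  Atoms are plain strings."""
    if isinstance(expr, str):
        return expr
    op, lhs, rhs = expr
    lhs, rhs = to_cnf(lhs), to_cnf(rhs)
    if op == 'and':
        return ('and', lhs, rhs)
    # op == 'or': push the disjunction inside any conjunction.
    if isinstance(lhs, tuple) and lhs[0] == 'and':
        return ('and', to_cnf(('or', lhs[1], rhs)), to_cnf(('or', lhs[2], rhs)))
    if isinstance(rhs, tuple) and rhs[0] == 'and':
        return ('and', to_cnf(('or', lhs, rhs[1])), to_cnf(('or', lhs, rhs[2])))
    return ('or', lhs, rhs)

# (a ∧ b) ∨ (c ∧ d) becomes (a ∨ c) ∧ (a ∨ d) ∧ (b ∨ c) ∧ (b ∨ d)
print(to_cnf(('or', ('and', 'a', 'b'), ('and', 'c', 'd'))))
```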
Skolemization
d(X,Y))]) ∨ (∀X)(e(X))
step 2: (∀X)([¬a(X) ∨ ¬b(X)] ∨ [c(X,I) ∧ (∃Y)((∀Z)[¬c(Y,Z)] ∨ d(X,Y))]) ∨ (∀X)(e(X))
step 3: (∀X)([¬a(X) ∨ ¬b(X)] ∨ [c(X,I) ∧ (∃Y)((∀Z)[¬c(Y,Z)] ∨ d(X,Y))]) ∨ (∀W)(e(W))
step 4: (∀X)(∃Y)(∀Z)(∀W)([¬a(X) ∨ ¬b(X)] ∨ [c(X,I) ∧ (¬c(Y,Z) ∨ d(X,Y))] ∨ e(W))
step 5: (∀X)(∀Z)(∀W)([¬a(X) ∨ ¬b(X)] ∨ [c(X,I) ∧ (¬c(f(X),Z) ∨ d(X,f(X)))] ∨ e(W))
step 6: [¬a(X) ∨ ¬b(X)] ∨ [c(X,I) ∧ (¬c(f(X),Z) ∨ d(X,f(X)))] ∨ e(W)
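The step 4 → step 5 move above replaces the existentially quantified Y by the Skolem function f(X) of the universal variables quantified to its left. A minimal Python sketch of that prefix transformation; the prefix encoding and the names `skolem_prefix`/`apply_subst` are my own illustrative choices:

```python
from itertools import count

_sk = count(1)   # fresh Skolem-function index

def skolem_prefix(prefix):
    """Given a prenex prefix like [('forall','X'), ('exists','Y')], return
    (universals, subst): the surviving universal variables and a substitution
    mapping each existential variable to a Skolem term whose arguments are
    the universal variables quantified to its left."""
    universals, subst = [], {}
    for quant, var in prefix:
        if quant == 'forall':
            universals.append(var)
        elif universals:            # 'exists' in the scope of some universals
            subst[var] = ('f%d' % next(_sk),) + tuple(universals)
        else:                       # 'exists' with no universals: a constant
            subst[var] = 'sk%d' % next(_sk)
    return universals, subst

def apply_subst(term, subst):
    """Rewrite variables in a nested-tuple term using subst."""
    if isinstance(term, str):
        return subst.get(term, term)
    return (term[0],) + tuple(apply_subst(arg, subst) for arg in term[1:])

# The prefix of step 4: (forall X)(exists Y)(forall Z)(forall W)
prefix = [('forall', 'X'), ('exists', 'Y'), ('forall', 'Z'), ('forall', 'W')]
universals, subst = skolem_prefix(prefix)
print(universals)                           # ['X', 'Z', 'W']
print(apply_subst(('d', 'X', 'Y'), subst))  # ('d', 'X', ('f1', 'X'))
```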
Case 2: L is false in I.
Then, since C1 = L ∨ C1' is true in I and L is false in I, C1' must be true in I. Thus, the resolvent C1' ∨ C2' is true in I.
The axioms in clause form:
a ∨ ¬b ∨ ¬c
b
c ∨ ¬d ∨ ¬e
e ∨ f
d
¬f
The proposition to be proved, a, is negated (¬a) and added to the clause set. The derivation of the empty clause □ indicates that the database of clauses is inconsistent:
¬a and a ∨ ¬b ∨ ¬c resolve to ¬b ∨ ¬c
¬b ∨ ¬c and b resolve to ¬c
¬c and c ∨ ¬d ∨ ¬e resolve to ¬d ∨ ¬e
¬d ∨ ¬e and d resolve to ¬e
¬e and e ∨ f resolve to f
f and ¬f resolve to □
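A derivation like this can be checked mechanically. Below is a minimal propositional resolution sketch in Python, with clauses as sets of literal strings and `~` marking negation; the names `resolve` and `refute` are my own:

```python
from itertools import combinations

def resolve(c1, c2):
    """Return all resolvents of two propositional clauses.
    A clause is a frozenset of literals; 'a' and '~a' are complementary."""
    resolvents = []
    for lit in c1:
        comp = lit[1:] if lit.startswith('~') else '~' + lit
        if comp in c2:
            resolvents.append((c1 - {lit}) | (c2 - {comp}))
    return resolvents

def refute(clauses):
    """Saturate the clause set; True iff the empty clause is derivable."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:          # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:         # nothing new: the set is satisfiable
            return False
        clauses |= new

# The axioms above, plus the negated goal ~a.
kb = [{'a', '~b', '~c'}, {'b'}, {'c', '~d', '~e'},
      {'e', 'f'}, {'d'}, {'~f'}]
print(refute(kb + [{'~a'}]))   # True
```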
Lucky student
∀X (pass(X,history) ∧ win(X,lottery) → happy(X))
∀X ∀Y (study(X) ∨ lucky(X) → pass(X,Y))
¬study(john) ∧ lucky(john)
∀X (lucky(X) → win(X,lottery))
Resolving ¬pass(john,history) ∨ ¬lucky(john) with lucky(john) under {} gives ¬pass(john,history); resolving this with pass(V,W) ∨ ¬lucky(V) under {john/V, history/W} gives ¬lucky(john); resolving ¬lucky(john) with lucky(john) under {} gives the empty clause □.
Exciting Life
∀X (¬poor(X) ∧ smart(X) → happy(X))
∀Z (happy(Z) → exciting(Z))
¬happy(Z)
poor(Y) ∨ ¬read(Y)
Resolving poor(Y) ∨ ¬read(Y) with ¬poor(john) under {john/Y} gives ¬read(john); resolving ¬read(john) with read(john) under {} gives the empty clause □.
read(john) {}
exciting(john) {}
whether they can be combined. Search heuristics are very important in resolution proof procedures.
Strategies
Breadth-First Strategy
Set of Support Strategy
Unit Preference Strategy
Breadth-First Strategy
First round: each clause is compared for resolution with every other clause in the clause space.
Second round: generate the clauses by resolving the clauses produced at the first round with all the original clauses.
Nth round: generate the clauses by resolving all clauses at level n-1 against the elements of the original clause set and all clauses previously produced.
Characteristics:
It guarantees finding the shortest solution path, because it generates every search state for each level before going any deeper.
It is a complete strategy in that it is guaranteed to find a refutation if one exists.
Figure 12.8 (p. 530)
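This level-by-level scheme can be sketched for the propositional case. The following is my own minimal illustration (clauses as frozensets of literal strings with `~` for negation); it returns the level at which the empty clause first appears:

```python
from itertools import product

def resolvents(c1, c2):
    """All propositional resolvents of two clauses (frozensets of literals)."""
    out = []
    for lit in c1:
        comp = lit[1:] if lit.startswith('~') else '~' + lit
        if comp in c2:
            out.append((c1 - {lit}) | (c2 - {comp}))
    return out

def breadth_first_resolution(clauses, max_level=10):
    """Level 1 resolves all pairs of input clauses; level n resolves the
    level n-1 clauses against the originals and everything produced so far.
    Returns the level at which the empty clause first appears, or None."""
    original = set(map(frozenset, clauses))
    seen = set(original)
    frontier = original
    for level in range(1, max_level + 1):
        new = set()
        for c1, c2 in product(frontier, seen):
            for r in resolvents(c1, c2):
                if not r:                  # empty clause derived at this level
                    return level
                if r not in seen:
                    new.add(r)
        if not new:                        # clause space saturated
            return None
        seen |= new
        frontier = new
    return None

print(breadth_first_resolution([{'a'}, {'~a'}]))   # 1
```

Because every level is generated before going deeper, the first level at which □ appears corresponds to a shortest refutation.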
of support such that S-T is satisfiable.
Require that resolvents in each resolution have an ancestor in the set of support (T).
Based on the insight that the negation of what we want to prove true is going to be responsible for causing the clause space to be contradictory.
Forces resolutions between clauses of which at least one is either the negated goal clause or a clause produced by resolutions on the negated goal.
Figure 12.6 (p. 528)
the parent clauses.
Resolving with a clause of one literal, called a unit clause, will guarantee that the resolvent is smaller than the largest parent clause.
Figure 12.9 (p. 531)
The result is then resolved with one of the axioms to get another new clause. The new clause is again resolved with one of the axioms. Try to resolve the most recently obtained clause with the original axioms.
¬at(fido, Z)
¬at(john, X) ∨ at(fido, X)
¬at(john, X)
at(john, library)
Exciting Life
¬read(Y) ∨ smart(Y)
read(john)