L08 - Learning and Logic
Acknowledgement: Material derived from slides for the book Machine Learning, Tom M. Mitchell, McGraw-Hill, 1997 http://www-2.cs.cmu.edu/~tom/mlbook.html and the book Inductive Logic Programming: Techniques and Applications by N. Lavrac and S. Dzeroski, Ellis Horwood, New York, 1994 (available at http://www-ai.ijs.si/SasoDzeroski/ILPBook/) and the paper by A. Cootes, S.H. Muggleton, and M.J.E. Sternberg The automatic discovery of structural principles describing protein fold space. Journal of Molecular Biology, 2003. (available at http://www.doc.ic.ac.uk/~shm/jnl.html) and the book Data Mining (2e), Ian H. Witten and Eibe Frank, Morgan Kaufmann, 2005. http://www.cs.waikato.ac.nz/ml/weka
Aims
This lecture will introduce you to theoretical and applied aspects of representing hypotheses for machine learning in first-order logic. Following it you should be able to:
- outline the key differences between propositional and first-order learning
- describe the problem of learning relations and some applications
- reproduce the basic FOIL algorithm and its use of information gain
- outline the problem of induction in terms of inverse deduction
- describe inverse resolution and least general generalisation
- define the θ-subsumption generality ordering for clauses
[Recommended reading: Mitchell, Chapter 10] [Recommended exercises: 10.5, 10.7, (10.8)]
COMP9417: April 19, 2010 Learning and Logic: Slide 1
Relevant programs
Progol: http://www.doc.ic.ac.uk/~shm/progol.html
Aleph: http://web.comlab.ox.ac.uk/oucl/research/areas/machlearn/Aleph
FOIL: http://www.rulequest.com/Personal/
iProlog: http://www.cse.unsw.edu.au/~claude/research/software/
Golem: http://www.doc.ic.ac.uk/~shm/golem.html
See also: http://www-ai.ijs.si/~ilpnet2/systems/
Potentially useful inferences: P → Q  "If the paper is red then the solution is acid"

P   Q   P → Q
T   T   T
T   F   F
F   T   T
F   F   T
Cannot use a fixed set of attributes where each attribute describes a linked node (how many attributes?). Cannot use a fixed set of attributes to learn connectivity concepts . . .
BUT in first-order logic sets of rules can represent graph concepts such as
Ancestor(x, y) ← Parent(x, y)
Ancestor(x, y) ← Parent(x, z) ∧ Ancestor(z, y)
The declarative programming language Prolog is based on the Horn clause subset of first-order logic, a form of Logic Programming:
- Prolog is a general purpose programming language: logic programs are sets of first-order rules
- pure Prolog is Turing complete, i.e., can simulate a Universal Turing machine (every computable function)
- learning in this representation is called Inductive Logic Programming (ILP)
predicates are defined by sets of clauses, each with that predicate in its head, e.g. the recursive definition of ancestor/2:

ancestor(X,Y) :- parent(X,Y).
ancestor(X,Y) :- parent(X,Z), ancestor(Z,Y).

The clause head, e.g. ancestor(X,Y), is to the left of the :- ; the clause body, e.g. parent(X,Z), ancestor(Z,Y), is to the right of the :-
each instance of a relation name in a clause is called a literal
- a definite clause has exactly one literal in the clause head
- a Horn clause has at most one literal in the clause head
- Prolog programs are sets of Horn clauses
- Prolog is a form of logic programming (one of many approaches), related to SQL, functional programming, . . .
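The recursive ancestor/2 definition can also be read operationally. Below is a minimal Python sketch of what those two clauses compute, using naive forward chaining to a fixpoint; the parent facts are invented for this example (a Prolog system would derive the same answers by backward chaining instead).

```python
# Naive forward chaining for the two ancestor/2 clauses (a sketch;
# the parent facts are invented for this example).
parent = {("tom", "bob"), ("bob", "ann")}

def ancestors(parent):
    """Least fixpoint of:
       anc(X,Y) :- parent(X,Y).
       anc(X,Y) :- parent(X,Z), anc(Z,Y)."""
    anc = set(parent)                       # first clause: copy the parent facts
    changed = True
    while changed:                          # second clause, applied to a fixpoint
        new = {(x, y) for (x, z) in parent for (z2, y) in anc if z == z2}
        changed = not new <= anc
        anc |= new
    return anc

print(ancestors(parent))  # derives the extra fact ("tom", "ann")
```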
Find h such that for all ⟨xi, f(xi)⟩ ∈ D, (B ∧ h ∧ xi) ⊢ f(xi), where
- xi is the ith training instance
- f(xi) is the target function value for xi
- B is other background knowledge
Induction as Inverted Deduction
"Induction is, in fact, the inverse operation of deduction, and cannot be conceived to exist without the corresponding operation, so that the question of relative importance cannot arise. Who thinks of asking whether addition or subtraction is the more important process in arithmetic? But at the same time much difference in difficulty may exist between a direct and inverse operation; . . . it must be allowed that inductive investigations are of a far higher degree of difficulty and complexity than any questions of deduction. . . ." (W.S. Jevons, 1874)
A photograph of the Logic Piano invented by William Stanley Jevons. Photograph taken at the Sydney Powerhouse Museum on March 5, 2006. This item is part of the collection of the Museum of the History of Science, Oxford and was on loan to the Powerhouse Museum. [From: http://commons.wikimedia.org/wiki/File:William_Stanley_Jevons_Logic_Piano.jpg, April 19, 2010]
We have mechanical deductive operators F(A, B) = C, where A ∧ B ⊢ C. We need inductive operators O(B, D) = h where for all ⟨xi, f(xi)⟩ ∈ D, (B ∧ h ∧ xi) ⊢ f(xi)
Positives:
- Subsumes earlier idea of finding h that fits training data
- Domain theory B helps define the meaning of "fit the data": (B ∧ h ∧ xi) ⊢ f(xi)
- Suggests algorithms that search H guided by B
Negatives:
- Doesn't allow for noisy data. Consider: for all ⟨xi, f(xi)⟩ ∈ D, (B ∧ h ∧ xi) ⊢ f(xi)
- First-order logic gives a huge hypothesis space H: overfitting... intractability of calculating all acceptable h's
1. Given initial clauses C1 and C2, find a literal L from clause C1 such that ¬L occurs in clause C2.
2. Form the resolvent C by including all literals from C1 and C2, except for L and ¬L. More precisely, the set of literals occurring in the conclusion C is
   C = (C1 − {L}) ∪ (C2 − {¬L})
   where ∪ denotes set union, and − denotes set difference.
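The resolvent construction above can be sketched in Python. A clause is a frozenset of literals and a literal is a (name, polarity) pair; this encoding is an assumption for illustration, not notation from the lecture.

```python
# Propositional resolution as defined above (a sketch; the literal
# encoding as (name, polarity) pairs is an assumption).

def negate(literal):
    """Complement of a literal: P <-> not-P."""
    name, positive = literal
    return (name, not positive)

def resolve(c1, c2):
    """All resolvents C = (C1 - {L}) | (C2 - {~L}) for L in C1 with ~L in C2."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

# Example: resolving PassExam v ~KnowMaterial with KnowMaterial v ~Study
# yields PassExam v ~Study.
c1 = frozenset({("PassExam", True), ("KnowMaterial", False)})
c2 = frozenset({("KnowMaterial", True), ("Study", False)})
print(resolve(c1, c2))
```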
Inverting Resolution
[Figure: resolution tree for C1: PassExam ∨ ¬KnowMaterial and C2: KnowMaterial ∨ ¬Study, giving the resolvent C: PassExam ∨ ¬Study; inverse resolution runs the same construction bottom-up, from C and one parent clause to the other parent.]
Duce operators
Same head:
  Identification:     {p ← A, B;  p ← A, q}   ⇒  {q ← B;  p ← A, q}
  Intra-construction: {p ← A, B1;  p ← A, B2} ⇒  {w ← B1;  w ← B2;  p ← A, w}
Different head:
  Absorption:         {p ← A, B;  q ← A}       ⇒  {p ← q, B;  q ← A}
  Inter-construction: {p1 ← A, B1;  p2 ← A, B2} ⇒  {p1 ← w, B1;  p2 ← w, B2;  w ← A}
Cigol
[Figure: inverse resolution (Cigol) example. Inverse resolution with the fact Father(Shannon, Tom) under substitution {Shannon/x} yields the clause GrandChild(Bob, x) ∨ ¬Father(x, Tom); a further inverse resolution step with Father(Tom, Bob) under substitution {Bob/y, Tom/z} yields GrandChild(y, x) ∨ ¬Father(x, z) ∨ ¬Father(z, y).]
LGG
Plotkin (1969, 1970, 1971), Reynolds (1969)
- the LGG of clauses is based on LGGs of literals
- the LGG of literals is based on LGGs of terms, i.e. constants and variables
- the LGG of two distinct constants is a variable, i.e. a minimal generalisation
LGG of atoms
Two atoms are compatible if they have the same predicate symbol and arity (number of arguments)
- lgg(a, b) for different constants, or for functions with different function symbols, is a variable X
- lgg(f(a1, ..., an), f(b1, ..., bn)) is f(lgg(a1, b1), ..., lgg(an, bn))
- lgg(Y1, Y2) for distinct variables Y1, Y2 is a variable X
Note:
1. must ensure that the same pair of differing terms maps to the same variable everywhere it occurs in the atom
2. must ensure introduced variables appear nowhere in the original atoms
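A sketch of these LGG rules in Python, under an assumed encoding (constants as strings, compound terms as (functor, arg, ..., arg) tuples). The shared table enforces Note 1: each recurring pair of differing terms maps to the same fresh variable.

```python
# LGG of two atoms (a sketch; the term encoding is an assumption:
# constants are strings, compound terms are (functor, args...) tuples).

def lgg(t1, t2, table):
    """LGG of two terms; `table` maps each pair of differing terms to
    one fresh variable wherever the pair recurs (Note 1 above)."""
    if t1 == t2:
        return t1
    # same functor and arity: recurse argument-wise
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        return (t1[0],) + tuple(lgg(a, b, table) for a, b in zip(t1[1:], t2[1:]))
    # differing constants/functions generalise to a shared fresh variable
    if (t1, t2) not in table:
        table[(t1, t2)] = "V%d" % len(table)
    return table[(t1, t2)]

def lgg_atoms(a1, a2):
    """LGG of two compatible atoms (same predicate symbol and arity)."""
    assert a1[0] == a2[0] and len(a1) == len(a2), "atoms not compatible"
    return lgg(a1, a2, {})

# From the RLGG example later: lgg(On(s1,a,b), On(s2,f,e)) = On(V0,V1,V2)
print(lgg_atoms(("On", "s1", "a", "b"), ("On", "s2", "f", "e")))
```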
LGG of clauses
The LGG of two clauses C1 and C2 is formed by taking the LGGs of each literal in C1 with every compatible literal in C2. Clauses form a subsumption lattice, with the LGG as least upper bound and the MGI (most general instance) as lower bound. This lifts the concept learning lattice to a first-order logic representation, and leads to relative LGGs with respect to background knowledge.
Subsumption lattice
[Figure: the subsumption lattice of two clauses a and b, with generalisation steps (g) leading up to lgg(a, b) as least upper bound and instantiation steps (i) leading down to mgi(a, b), the most general instance, as lower bound.]

Given two ground instances of the target predicate Q/k, Q(c1, c2, . . . , ck) and Q(d1, d2, . . . , dk), plus other logical relations representing background knowledge that may be relevant to the target concept, the relative least general generalisation (RLGG) of these two instances is formed by taking {lgg(r1, r2)} for every pair r1, r2 of ground instances from each relation in the background knowledge.
RLGG Example
The figure depicts two scenes s1 and s2, which may be described by the predicates Scene/1, On/3, Left-of/2, Circle/1, Square/1 and Triangle/1.
Relation     Ground Instances (tuples)
Scene/1      {<s1>, <s2>}
On/3         {<s1, a, b>, <s2, f, e>}
Left-of/2    {<s1, b, c>, <s2, d, e>}
Circle/1     {<a>, <f>}
Square/1     {<b>, <d>}
Triangle/1   {<c>, <e>}
RLGG Example
To compute the RLGG of the two scenes, generate the clause:

Scene(lgg(s1, s2)) ← On(lgg(s1, s2), lgg(a, f), lgg(b, e)), Left-of(lgg(s1, s2), lgg(b, d), lgg(c, e)), Circle(lgg(a, f)), Square(lgg(b, d)), Triangle(lgg(c, e))
Compute LGGs to introduce variables into the final clause:

Scene(A) ← On(A, B, C), Left-of(A, D, E), Circle(B), Square(D), Triangle(E)
learning is set up as a search through the hypothesis space of first-order rules
FOIL(Target_predicate, Predicates, Examples)
  Pos := positive Examples
  Neg := negative Examples
  while Pos is not empty do            // Learn a NewRule
    NewRule := most general rule possible
    NewRuleNeg := Neg
    while NewRuleNeg is not empty do   // Add a new literal to specialize NewRule
      Candidate_literals := generate candidates
      Best_literal := argmax over L in Candidate_literals of Foil_Gain(L, NewRule)
      add Best_literal to NewRule preconditions
      NewRuleNeg := subset of NewRuleNeg that satisfies NewRule preconditions
    Learned_rules := Learned_rules + NewRule
    Pos := Pos − {members of Pos covered by NewRule}
  Return Learned_rules
Variable Bindings
A substitution θ replaces variables by terms. The substitution θ applied to literal L is written Lθ. If θ = {x/3, y/z} and L = P(x, y) then Lθ = P(3, z). FOIL bindings are substitutions mapping each variable to a constant, e.g. for GrandDaughter(x, y). With 4 constants in our examples we have 16 possible bindings: {x/Victor, y/Sharon}, {x/Victor, y/Bob}, . . . With 1 positive example, GrandDaughter(Victor, Sharon), the other 15 bindings are negative.
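A small Python sketch of substitutions and binding enumeration. The literal encoding (a (predicate, args) pair with '?'-prefixed variables) and the fourth constant name are assumptions for illustration.

```python
# Substitutions and FOIL bindings (a sketch; the literal encoding and the
# fourth constant name "Tom" are assumptions).
from itertools import product

def apply_subst(literal, theta):
    """Apply substitution theta to literal L, giving L-theta."""
    pred, args = literal
    return (pred, tuple(theta.get(a, a) for a in args))

# theta = {x/3, y/z} applied to P(x, y) gives P(3, z)
print(apply_subst(("P", ("?x", "?y")), {"?x": "3", "?y": "?z"}))

# FOIL bindings map every variable to a constant: with 4 constants and the
# 2 variables of GrandDaughter(x, y) there are 4**2 = 16 bindings.
constants = ["Victor", "Sharon", "Bob", "Tom"]
bindings = [dict(zip(("?x", "?y"), c)) for c in product(constants, repeat=2)]
print(len(bindings))  # 16
```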
Foil_Gain(L, R) ≡ t ( log2 (p1 / (p1 + n1)) − log2 (p0 / (p0 + n0)) )

where
- L is the candidate literal to add to rule R
- p0 = number of positive bindings of R
- n0 = number of negative bindings of R
- p1 = number of positive bindings of R + L
- n1 = number of negative bindings of R + L
- t is the number of positive bindings of R also covered by R + L
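The gain t ( log2 (p1 / (p1 + n1)) − log2 (p0 / (p0 + n0)) ) can be computed directly from these counts; a sketch, with made-up counts in the example call:

```python
# Foil_Gain from the binding counts (a sketch; the counts in the example
# call are made up).
import math

def foil_gain(p0, n0, p1, n1, t):
    """t * (log2(p1 / (p1 + n1)) - log2(p0 / (p0 + n0)))"""
    return t * (math.log2(p1 / (p1 + n1)) - math.log2(p0 / (p0 + n0)))

# A literal that keeps all 8 positive bindings of R (out of 8 positive and
# 8 negative) while removing every negative binding gains 8 bits:
print(foil_gain(p0=8, n0=8, p1=8, n1=0, t=8))  # 8.0
```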
Information Gain in FOIL
Note:
- −log2 (p0 / (p0 + n0)) is the minimum number of bits to identify an arbitrary positive binding among the bindings of R
- −log2 (p1 / (p1 + n1)) is the minimum number of bits to identify an arbitrary positive binding among the bindings of R + L
- Foil_Gain(L, R) measures the reduction due to L in the total number of bits needed to encode the classification of all positive bindings of R
[Figure: family tree for the FOIL ancestor example, including Carol and Ted]
FOIL Example
New clause: ancestor(X,Y) :- .
  Best antecedent: parent(X,Y)    Gain: 31.02
Learned clause: ancestor(X,Y) :- parent(X,Y).

New clause: ancestor(X,Y) :- .
  Best antecedent: parent(Z,Y)    Gain: 13.65
  Best antecedent: ancestor(X,Z)  Gain: 27.86
Learned clause: ancestor(X,Y) :- parent(Z,Y), ancestor(X,Z).

Definition:
ancestor(X,Y) :- parent(X,Y).
ancestor(X,Y) :- parent(Z,Y), ancestor(X,Z).
[Figure: a directed graph on nodes 0-8; an edge from x to y represents LinkedTo(x,y)]
FOIL Example
Instances: pairs of nodes, e.g. <1, 5>, with the graph described by literals LinkedTo(0,1), LinkedTo(0,8), etc. Target function: CanReach(x,y), true iff there is a directed path from x to y. Hypothesis space: each h ∈ H is a set of Horn clauses using predicates LinkedTo (and CanReach)
Determinate Literals
adding a new literal Q(X, Y), where Y is the unique value for X, will result in zero gain! FOIL gives a small positive gain to literals introducing a new variable, BUT there may be many such literals
Refining clause A ← L1, L2, . . . , Lm−1, a new literal Lm is determinate if:
- Lm introduces new variable(s)
- there is exactly one extension of each positive tuple that satisfies Lm
- there is no more than one extension of each negative tuple that satisfies Lm
So Lm preserves all positive tuples and does not increase the set of bindings. Determinate literals allow growing the clause to overcome the myopia of greedy search without blowing up the search space.
background knowledge:
- 20 single-page documents
- 244 components
- 57 relations specifying component type (text or picture), position on page, alignment with other components
test set error from 0% to 4%
Organization: Reference.Com Posting Service Message-ID: [email protected] SOFTWARE PROGRAMMER Position available for Software Programmer experienced in generating software for PC-Based Voice Mail systems. Experienced in C Programming. Must be familiar with communicating with and controlling voice cards; preferable Dialogic, however, experience with others such as Rhetorix and Natural Microsystems is okay. Prefer 5 years or more experience with PC Based Voice Mail, but will consider as little as 2 years. Need to find a Senior level person who can come on board and pick up code with very little training. Present Operating System is DOS. May go to OS-2 or UNIX in future. Please reply to: Kim Anderson AdNET (901) 458-2888 fax [email protected]
Progol
Progol: reduce combinatorial explosion by generating the most specific acceptable h as a lower bound on the search space.
1. User specifies H by stating predicates, functions, and forms of arguments allowed for each
2. Progol uses a sequential covering algorithm. For each ⟨xi, f(xi)⟩: find the most specific hypothesis hi such that B ∧ hi ∧ xi ⊢ f(xi) (actually, considers only k-step entailment)
3. Conduct general-to-specific search bounded by the specific hypothesis hi, choosing the hypothesis with minimum description length
. . . located in Atlanta, Georgia. . . . offices in Kansas City, Missouri.

Pre-Filler Pattern:
  1) word: in, tag: in
Filler Pattern:
  1) list: max length: 2, tag: nnp
Post-Filler Pattern:
  1) word: "," , tag: ","
  2) tag: nnp, semantic: state
where nnp denotes a proper noun (syntax) and state is a general label from the WordNet ontology (semantics).
Protein structure
fold('Four-helical up-and-down bundle',P) :-
    helix(P,H1),
    length(H1,hi),
    position(P,H1,Pos),
    interval(1 <= Pos <= 3),
    adjacent(P,H1,H2),
    helix(P,H2).
The protein P has fold class "Four-helical up-and-down bundle" if it contains a long helix H1 at a secondary structure position between 1 and 3, and H1 is followed by a second helix H2.
[Figure: secondary structure diagrams of two proteins, 1omd (EF-Hand) and 2mhr (Four-helical up-and-down bundle), with helices and strands labelled by residue intervals.]
Immunoglobulin: has antiparallel sheets B and C; B has 3 strands, topology 123; C has 4 strands, topology 2134.
TIM barrel: has between 5 and 9 helices; has a parallel sheet of 8 strands.
SH3: has an antiparallel sheet B. C and D are the 1st and 4th strands in the sheet B respectively. C and D are the end strands of B and are 4.360 (+/- 2.18) angstroms apart. D contains a proline in the C-terminal end.
Summary
ILP can be viewed as an extended approach to rule learning, BUT it is much more:
- learning in a general-purpose programming language
- use of rich background knowledge
- incorporate arbitrary program elements into clauses (rules)
- background knowledge can grow as a result of learning
- control search with declarative bias
- learning probabilistic logic programs