Supervised Learning Neural Networks
(Mapping Networks)
Presentation by:
C. Vinoth Kumar
SSN College of Engineering
Perceptrons (Architecture & Learning Rule)

Output = f(∑_{i=1}^{n} w_i x_i − θ)
       = f(∑_{i=1}^{n} w_i x_i + w_0),   where w_0 ≡ −θ
       = f(∑_{i=0}^{n} w_i x_i),   where x_0 ≡ 1
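A minimal sketch of this thresholded output in plain Python (the function name and the AND-gate weights in the example are illustrative, not from the slides):

```python
def perceptron_output(w, x, theta):
    # Absorb the threshold theta as a bias weight w0 = -theta with x0 = 1.
    net = -theta + sum(wi * xi for wi, xi in zip(w, x))
    return 1 if net > 0 else 0  # f is the unit step function

# Example: weights (1, 1) with threshold 1.5 realize a two-input AND gate,
# firing only when both inputs are 1.
```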
Perceptrons

w_i is a modifiable weight associated with an incoming signal x_i.
Desired input/output pairs:

                    X   Y   Class
Desired i/o pair 1  0   0     0
Desired i/o pair 2  0   1     1
Desired i/o pair 3  1   0     1
Desired i/o pair 4  1   1     0
Perceptrons: Exclusive-OR Problem

0·w1 + 0·w2 + w0 ≤ 0  ⇔  w0 ≤ 0
0·w1 + 1·w2 + w0 > 0  ⇔  w0 > −w2
1·w1 + 0·w2 + w0 > 0  ⇔  w0 > −w1
1·w1 + 1·w2 + w0 ≤ 0  ⇔  w0 ≤ −w1 − w2

These four conditions cannot hold simultaneously: adding the second and third gives w1 + w2 > −2w0, while the fourth gives w1 + w2 ≤ −w0; together these force −2w0 < −w0, i.e. w0 > 0, contradicting the first condition. Hence no single-layer perceptron can realize XOR.
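The inconsistency can also be checked empirically; the sketch below (the weight grid is illustrative, not from the slides) brute-forces a coarse grid of weights and finds none that satisfies all four desired input/output pairs:

```python
import itertools

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def classifies_xor(w1, w2, w0):
    # Single-layer perceptron: output 1 iff w1*x + w2*y + w0 > 0.
    return all((1 if w1 * x + w2 * y + w0 > 0 else 0) == t
               for (x, y), t in XOR.items())

vals = [i / 2 for i in range(-10, 11)]  # coarse grid: -5.0 to 5.0 in steps of 0.5
found = any(classifies_xor(*w) for w in itertools.product(vals, repeat=3))
# found stays False: no weight setting on the grid solves XOR
```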
For input (x1 = 0, x2 = 1 ⇒ target 1):

Results at the hidden layer:
0·(+1) + 1·(+1) = 1 < 1.5 ⇒ x3 = 0
0·(+1) + 1·(+1) = 1 > 0.5 ⇒ x4 = 1

Results at the output layer:
0·(−1) + 1·(+1) = +1 > 0.5 ⇒ x5 = output = 1
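The full two-layer network from this example can be traced in a few lines of plain Python (unit names x3, x4, x5 follow the slides; the helper names are mine):

```python
def step(net, theta):
    # Unit step with threshold theta.
    return 1 if net > theta else 0

def two_layer_xor(x1, x2):
    x3 = step(x1 * (+1) + x2 * (+1), 1.5)  # hidden unit, threshold 1.5 (AND-like)
    x4 = step(x1 * (+1) + x2 * (+1), 0.5)  # hidden unit, threshold 0.5 (OR-like)
    x5 = step(x3 * (-1) + x4 * (+1), 0.5)  # output unit, threshold 0.5
    return x5

# two_layer_xor reproduces XOR on all four input pairs.
```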
Net input to node j:   x̄_j = ∑_i w_{ij} x_i + w_j

Node output (logistic function):   x_j = f(x̄_j) = 1 / (1 + exp(−x̄_j))
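As a sketch, the net input and logistic activation of a single node in plain Python (function and argument names are illustrative):

```python
import math

def logistic(x):
    # Logistic sigmoid: maps any real input into (0, 1), with f(0) = 0.5.
    return 1.0 / (1.0 + math.exp(-x))

def node_output(weights, bias, inputs):
    # Net input x_bar_j = sum_i w_ij * x_i + w_j, squashed by the logistic.
    net = bias + sum(w * x for w, x in zip(weights, inputs))
    return logistic(net)
```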
Logistic activation for an RBF unit:

R_i^L(x) = 1 / (1 + exp[‖x − u_i‖² / σ_i²])

If x = u_i, then R_i^G = 1 (maximum of the Gaussian unit) and R_i^L = 1/2 (maximum of the logistic unit).
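Both receptive-field functions peak at the center u_i, which a quick sketch confirms (scalar inputs for simplicity; the function names are mine):

```python
import math

def rbf_gaussian(x, u, sigma):
    # Gaussian unit: equals 1 at the center x = u and decays away from it.
    return math.exp(-((x - u) ** 2) / (2 * sigma ** 2))

def rbf_logistic(x, u, sigma):
    # Logistic unit: equals 1/2 at the center x = u and decays away from it.
    return 1.0 / (1.0 + math.exp(((x - u) ** 2) / sigma ** 2))
```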
Radial Basis Function Networks

d(x) = ∑_{i=1}^{H} (a_i^T x + b_i) w_i(x) = ∑_{i=1}^{H} (u_i^T x + v_i)

where u_i = [u_i^1, u_i^2, …, u_i^m]^T and x = [x_1, x_2, …, x_m]^T.
Radial Basis Function Networks
- Functional Equivalence to FIS
d(x) = ∑_{i=1}^{n} c_i exp[−(x − x_i)² / (2σ_i²)]
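A direct evaluation of this single-output Gaussian RBF sum (plain-Python sketch; the argument names are illustrative):

```python
import math

def rbf_sum(x, centers, sigmas, coeffs):
    # d(x) = sum_i c_i * exp(-(x - x_i)^2 / (2 * sigma_i^2))
    return sum(c * math.exp(-((x - xi) ** 2) / (2 * s ** 2))
               for c, xi, s in zip(coeffs, centers, sigmas))
```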
Radial Basis Function Networks
- Interpolation and Approximation
First pattern:   d(x¹) = ∑_{i=1}^{n} c_i exp[−(x¹ − x_i)² / (2σ_i²)] = d_1
Second pattern:  d(x²) = ∑_{i=1}^{n} c_i exp[−(x² − x_i)² / (2σ_i²)] = d_2
⋮
nth pattern:     d(xⁿ) = ∑_{i=1}^{n} c_i exp[−(xⁿ − x_i)² / (2σ_i²)] = d_n
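Stacking the n pattern equations gives an n×n linear system G·C = D in the weights c_i, which can be solved directly. A plain-Python sketch with made-up training data (xs, ds, and sigma are illustrative; the centers are placed at the training inputs):

```python
import math

def solve(A, b):
    # Gaussian elimination with partial pivoting; adequate for small systems.
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Illustrative data: the RBF centers x_i coincide with the n training inputs.
xs = [0.0, 1.0, 2.0]
ds = [1.0, 3.0, 2.0]
sigma = 1.0
# Design matrix G[p][i] = exp(-(x^p - x_i)^2 / (2 sigma^2)).
G = [[math.exp(-((xp - xi) ** 2) / (2 * sigma ** 2)) for xi in xs] for xp in xs]
C = solve(G, ds)  # weights c_i; the network now interpolates every pattern
```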
D = [d_1, d_2, …, d_n]^T,   C = [c_1, c_2, …, c_n]^T,
where