Ch2 Pres

The document describes neural network models, including: 1) single-input neurons, which scale an input by a weight, add a bias, and apply a transfer function to produce an output; 2) layers of neurons that can be cascaded, so that the output of one layer is the input to the next, with a different transfer function possible at each layer; 3) recurrent networks, which incorporate delays and integrators so that the output at one time step influences the next, creating dynamical systems.


2

Neuron Model and Network Architectures

Single-Input Neuron

General Neuron
[Figure: a scalar input p is multiplied by a weight w; a bias b (driven by the constant input 1) is added to form the net input n = wp + b, which passes through the transfer function f to give the output a.]

a = f(wp + b)

Transfer Functions

Hard Limit Transfer Function
a = hardlim(n): a = 0 for n < 0, a = 1 for n ≥ 0.

Single-Input hardlim Neuron
a = hardlim(wp + b): the output steps from 0 to 1 at p = -b/w.

Linear Transfer Function
a = purelin(n) = n

Single-Input purelin Neuron
a = purelin(wp + b) = wp + b: a line with slope w, value b at p = 0, crossing a = 0 at p = -b/w.

Transfer Functions

Log-Sigmoid Transfer Function
a = logsig(n) = 1 / (1 + e^(-n)), which squashes any n into the range (0, 1).

Single-Input logsig Neuron
a = logsig(wp + b), with a = 0.5 at p = -b/w.
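The three transfer functions above can be sketched directly in Python (a minimal illustration, not from the slides; the example weight, bias, and input values are arbitrary):

```python
import math

def hardlim(n):
    """Hard limit: 0 if n < 0, else 1."""
    return 0 if n < 0 else 1

def purelin(n):
    """Linear: output equals net input."""
    return n

def logsig(n):
    """Log-sigmoid: squashes n into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-n))

def single_input_neuron(p, w, b, f):
    """Single-input neuron: a = f(wp + b)."""
    return f(w * p + b)

# Example: p = 2, w = 3, b = -1.5, so n = 4.5.
print(single_input_neuron(2.0, 3.0, -1.5, hardlim))  # -> 1
print(single_input_neuron(2.0, 3.0, -1.5, purelin))  # -> 4.5
print(logsig(0.0))                                   # -> 0.5
```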

Multiple-Input Neuron

Inputs p1, p2, p3, …, pR enter with weights w1,1, …, w1,R, and a bias b is added:

n = w1,1 p1 + w1,2 p2 + … + w1,R pR + b

In matrix form, with weight matrix W (here a single row, 1 × R) and input vector p (R × 1):

a = f(Wp + b)

Abbreviated Notation
Input p (R × 1), weights W (1 × R); bias b, net input n, and output a are scalars (1 × 1):

a = f(Wp + b)
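A minimal Python sketch of the multiple-input neuron (the weight, bias, and input values are illustrative, not from the text):

```python
import math

def logsig(n):
    """Log-sigmoid transfer function."""
    return 1.0 / (1.0 + math.exp(-n))

def multi_input_neuron(p, w, b, f):
    """Multiple-input neuron: n = sum_j w[j]*p[j] + b, a = f(n)."""
    n = sum(wj * pj for wj, pj in zip(w, p)) + b
    return f(n)

p = [1.0, 2.0, -1.0]    # R = 3 inputs
w = [0.5, -0.25, 1.0]   # weight row W (1 x R)
b = 0.5
a = multi_input_neuron(p, w, b, logsig)
# n = 0.5 - 0.5 - 1.0 + 0.5 = -0.5, so a = logsig(-0.5)
```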

Layer of S Neurons

Inputs p1, p2, p3, …, pR feed every neuron in the layer. Neuron i has weights wi,1, …, wi,R and bias bi:

ni = wi,1 p1 + … + wi,R pR + bi,   ai = f(ni),   i = 1, …, S

a = f(Wp + b)

Abbreviated Notation

Input p (R × 1), weight matrix W (S × R), bias b (S × 1), net input n (S × 1), output a (S × 1):

a = f(Wp + b)

W = [ w1,1  w1,2  …  w1,R
      w2,1  w2,2  …  w2,R
        ⋮      ⋮          ⋮
      wS,1  wS,2  …  wS,R ]

p = [ p1  p2  …  pR ]ᵀ,   b = [ b1  b2  …  bS ]ᵀ,   a = [ a1  a2  …  aS ]ᵀ
Multilayer Network

Inputs p1, p2, p3, …, pR feed the first layer; each layer's output is the next layer's input, and each layer can use a different transfer function.

First Layer (S1 neurons; weight matrix W1, size S1 × R):
a1 = f1(W1p + b1)

Second Layer (S2 neurons; W2, size S2 × S1):
a2 = f2(W2a1 + b2)

Third Layer (S3 neurons; W3, size S3 × S2):
a3 = f3(W3a2 + b3)

Composing all three layers:
a3 = f3(W3f2(W2f1(W1p + b1) + b2) + b3)

Abbreviated Notation

Hidden Layers
Layers whose outputs are not the network's output (here the first and second layers) are called hidden layers; the third layer is the output layer.

Input: p (R × 1)
First Layer: W1 (S1 × R), b1 (S1 × 1), n1 (S1 × 1), a1 = f1(W1p + b1) (S1 × 1)
Second Layer: W2 (S2 × S1), b2 (S2 × 1), n2 (S2 × 1), a2 = f2(W2a1 + b2) (S2 × 1)
Third Layer (Output Layer): W3 (S3 × S2), b3 (S3 × 1), n3 (S3 × 1), a3 = f3(W3a2 + b3) (S3 × 1)

a3 = f3(W3f2(W2f1(W1p + b1) + b2) + b3)
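The composed equation above is just three layer evaluations in sequence; a small Python sketch (network sizes and all weight values are arbitrary, chosen only for illustration):

```python
import math

def logsig(n):
    return 1.0 / (1.0 + math.exp(-n))

def purelin(n):
    return n

def layer(p, W, b, f):
    """One layer: a = f(Wp + b), with W given as a list of rows."""
    return [f(sum(w * x for w, x in zip(row, p)) + bi)
            for row, bi in zip(W, b)]

def three_layer(p, params):
    """a3 = f3(W3 f2(W2 f1(W1 p + b1) + b2) + b3)."""
    a = p
    for W, b, f in params:
        a = layer(a, W, b, f)
    return a

# R = 1 input, S1 = 2, S2 = 2, S3 = 1.
params = [
    ([[1.0], [-1.0]],               [0.0, 0.0], logsig),   # W1 is 2x1
    ([[1.0, 1.0], [0.5, -0.5]],     [0.0, 0.0], logsig),   # W2 is 2x2
    ([[2.0, -1.0]],                 [0.5],      purelin),  # W3 is 1x2
]
a3 = three_layer([0.0], params)
print(a3)  # a single output, 2*logsig(1) here
```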

Delays and Integrators

Delay
Input u(t), initial condition a(0):
a(t) = u(t − 1)

Integrator
Input u(t), initial condition a(0):
a(t) = ∫₀ᵗ u(τ) dτ + a(0)
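A quick Python sketch of both building blocks (the function names and the rectangle-rule step for the integrator are our own choices, not from the text):

```python
def delay(u_seq, a0):
    """Discrete delay: a(t) = u(t-1), with a(0) = a0."""
    return [a0] + u_seq[:-1]

def integrator(u_func, a0, t_end, dt=0.001):
    """a(t) = integral_0^t u(tau) dtau + a(0), approximated with step dt."""
    a, t = a0, 0.0
    while t < t_end:
        a += u_func(t) * dt
        t += dt
    return a

print(delay([1, 2, 3], a0=0))  # -> [0, 1, 2]
# Integrating u(t) = 1 from 0 to 2 with a(0) = 0.5 gives roughly 2.5:
print(integrator(lambda t: 1.0, a0=0.5, t_end=2.0))
```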

Recurrent Network

A symmetric saturating linear (satlins) layer whose output is fed back through a delay. Initial condition p (S × 1), weight matrix W (S × S), bias b (S × 1):

a(0) = p
a(t + 1) = satlins(Wa(t) + b)

so

a(1) = satlins(Wa(0) + b) = satlins(Wp + b)
a(2) = satlins(Wa(1) + b)
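The recurrence can be iterated in a few lines of Python (a sketch with arbitrary illustrative W, b, and p; this particular choice happens to reach a fixed point immediately):

```python
def satlins(n):
    """Symmetric saturating linear: clip n to [-1, 1]."""
    return max(-1.0, min(1.0, n))

def recurrent_step(a, W, b):
    """One step of the recurrence: a(t+1) = satlins(W a(t) + b)."""
    return [satlins(sum(w * x for w, x in zip(row, a)) + bi)
            for row, bi in zip(W, b)]

W = [[0.5, 0.0],
     [0.0, 0.5]]        # S = 2 neurons, W is S x S
b = [0.5, -0.5]
a = [1.0, -1.0]          # a(0) = p
for _ in range(3):       # compute a(1), a(2), a(3)
    a = recurrent_step(a, W, b)
print(a)  # -> [1.0, -1.0], a fixed point of this network
```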