AML TB3 CH6 Highlighted
CHAPTER 6/ARTIFICIAL NEURAL NETWORKS

Solved Problem 6.2
Implement the ANDNOT function using a McCulloch-Pitts neuron (use binary data representation).

Solution: In the case of the ANDNOT function, the response is true if the first input is true and the second input is false; for all other input variations, the response is false. The truth table for the ANDNOT function is given as:

x1  x2  y
0   0   0
0   1   0
1   0   1
1   1   0

The neuron takes x1 and x2 as inputs, and the net input is calculated as

y_in = x1 w1 + x2 w2

A McCulloch-Pitts neuron for this function is shown in Figure 6.15. The weights are decided by analyzing the four input variations.

Case 1: Assume that both weights are excitatory, i.e., w1 = w2 = 1. Then for the four input patterns (0,0), (0,1), (1,0) and (1,1), the net input is calculated as:

(0, 0): y_in = 0 x 1 + 0 x 1 = 0
(0, 1): y_in = 0 x 1 + 1 x 1 = 1
(1, 0): y_in = 1 x 1 + 0 x 1 = 1
(1, 1): y_in = 1 x 1 + 1 x 1 = 2

From the calculated net inputs, it is not possible to fire the neuron for the input (1, 0) only. Hence, these weights are not suitable.

Case 2: Assume one weight as excitatory and the other as inhibitory, i.e., w1 = 1 and w2 = -1. Now the net inputs are:

(0, 0): y_in = 0 x 1 + 0 x (-1) = 0
(0, 1): y_in = 0 x 1 + 1 x (-1) = -1
(1, 0): y_in = 1 x 1 + 0 x (-1) = 1
(1, 1): y_in = 1 x 1 + 1 x (-1) = 0

By fixing a threshold of 1, it is now possible to fire the neuron for the input (1, 0) only. Thus, with w1 = 1, w2 = -1 and threshold 1, the output of the neuron can be written as

y = f(y_in) = 1 if y_in >= 1; 0 if y_in < 1
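The Case 2 neuron can be checked quickly in code. The minimal sketch below (Python; the names `mp_neuron` and `andnot` are illustrative, not from the text) evaluates y = f(x1 w1 + x2 w2) with w1 = 1, w2 = -1 and threshold 1 over all four binary inputs:

```python
# McCulloch-Pitts neuron for ANDNOT (Solved Problem 6.2, Case 2):
# w1 = 1 (excitatory), w2 = -1 (inhibitory), threshold theta = 1.

def mp_neuron(inputs, weights, theta):
    """Fire (return 1) when the net input reaches the threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= theta else 0

def andnot(x1, x2):
    return mp_neuron((x1, x2), weights=(1, -1), theta=1)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, andnot(x1, x2))
# prints the ANDNOT truth table: only (1, 0) gives 1
```

The same `mp_neuron` helper works for any of the case analyses in these problems; only the weight pair and threshold change.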
McCULLOCH-PITTS NEURON MODEL

Solved Problem 6.3
Implement the XOR function using a McCulloch-Pitts neuron (use binary data representation).

Solution: In the case of the XOR function, the response is true if either of the two inputs is true, and false when both inputs are true or both are false. The truth table for the XOR function is given as:

x1  x2  y
0   0   0
0   1   1
1   0   1
1   1   0

The XOR function cannot be represented by a simple and single logic function; it cannot be represented by a single McCulloch-Pitts neuron. A single-layer network is not sufficient to represent the function, so an intermediate layer is necessary. The function is written as

y = x1 x̄2 + x̄1 x2 = z1 + z2

where z1 = x1 x̄2 (first function), z2 = x̄1 x2 (second function), and y = z1 + z2 (third function, the OR of z1 and z2).

First function (z1 = x1 x̄2): The truth table for z1 is:

x1  x2  z1
0   0   0
0   1   0
1   0   1
1   1   0

The net input is calculated as z1_in = x1 w11 + x2 w21.
Case 1: Assume that both weights are excitatory, i.e., w11 = w21 = 1. We calculate the net inputs:

(0, 0): z1_in = 0 x 1 + 0 x 1 = 0
(0, 1): z1_in = 0 x 1 + 1 x 1 = 1
(1, 0): z1_in = 1 x 1 + 0 x 1 = 1
(1, 1): z1_in = 1 x 1 + 1 x 1 = 2

It is not possible to fire the neuron for the input (1, 0) only, so these weights are not suitable.

Case 2: Assume one weight as inhibitory and the other as excitatory, i.e., w11 = -1 and w21 = 1. We calculate the net inputs:

(0, 0): z1_in = 0 x (-1) + 0 x 1 = 0
(0, 1): z1_in = 0 x (-1) + 1 x 1 = 1
(1, 0): z1_in = 1 x (-1) + 0 x 1 = -1
(1, 1): z1_in = 1 x (-1) + 1 x 1 = 0

With these weights the neuron would fire for (0, 1) rather than (1, 0), so these weights are also not suitable.

Case 3: Assume one weight as excitatory and the other as inhibitory, i.e., w11 = 1 and w21 = -1. We calculate the net inputs:

(0, 0): z1_in = 0 x 1 + 0 x (-1) = 0
(0, 1): z1_in = 0 x 1 + 1 x (-1) = -1
(1, 0): z1_in = 1 x 1 + 0 x (-1) = 1
(1, 1): z1_in = 1 x 1 + 1 x (-1) = 0

By fixing a threshold of 1, the neuron fires for the input (1, 0) only. Thus, for the first function, w11 = 1 and w21 = -1.
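The case-by-case analysis above amounts to searching over the excitatory/inhibitory weight assignments. A small sketch (Python; `fires` and `target` are illustrative names) enumerates all four (+1/-1) weight pairs and reports which one reproduces the truth table of z1:

```python
from itertools import product

# Enumerate excitatory (+1) / inhibitory (-1) weight assignments for the
# first hidden neuron z1 = x1 AND NOT x2, firing threshold theta = 1,
# mirroring Cases 1-3 of Solved Problem 6.3.
def fires(x, w, theta=1):
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) >= theta else 0

target = {(0, 0): 0, (0, 1): 0, (1, 0): 1, (1, 1): 0}  # truth table of z1

for w in product((1, -1), repeat=2):
    ok = all(fires(x, w) == z for x, z in target.items())
    print(f"w11={w[0]:+d}, w21={w[1]:+d} -> {'suitable' if ok else 'not suitable'}")
# only w11=+1, w21=-1 is reported as suitable
```

Swapping `target` for the truth table of z2 would single out the mirrored pair (-1, +1), which is exactly the result of the second-function case analysis below.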
Second function (z2 = x̄1 x2): The truth table for z2 is:

x1  x2  z2
0   0   0
0   1   1
1   0   0
1   1   0

The net input is calculated as z2_in = x1 w12 + x2 w22.

Case 1: Assume that both weights are excitatory, i.e., w12 = w22 = 1. We calculate the net inputs:

(0, 0): z2_in = 0 x 1 + 0 x 1 = 0
(0, 1): z2_in = 0 x 1 + 1 x 1 = 1
(1, 0): z2_in = 1 x 1 + 0 x 1 = 1
(1, 1): z2_in = 1 x 1 + 1 x 1 = 2

It is not possible to fire the neuron for the input (0, 1) only, so these weights are not suitable.

Case 2: Assume one weight as excitatory and the other as inhibitory, i.e., w12 = 1 and w22 = -1. We calculate the net inputs:

(0, 0): z2_in = 0 x 1 + 0 x (-1) = 0
(0, 1): z2_in = 0 x 1 + 1 x (-1) = -1
(1, 0): z2_in = 1 x 1 + 0 x (-1) = 1
(1, 1): z2_in = 1 x 1 + 1 x (-1) = 0

With these weights the neuron would fire for (1, 0) rather than (0, 1), so these weights are also not suitable.

Case 3: Assume one weight as inhibitory and the other as excitatory, i.e., w12 = -1 and w22 = 1. We calculate the net inputs:

(0, 0): z2_in = 0 x (-1) + 0 x 1 = 0
(0, 1): z2_in = 0 x (-1) + 1 x 1 = 1
(1, 0): z2_in = 1 x (-1) + 0 x 1 = -1
(1, 1): z2_in = 1 x (-1) + 1 x 1 = 0

By fixing a threshold of 1, it is possible to get the desired output based on this net input: the neuron fires for the input (0, 1) only. Thus, for the second function, w12 = -1 and w22 = 1.
Third function (y = z1 OR z2): The truth table, with z1 and z2 calculated for the four input patterns, is:

x1  x2  z1  z2  y
0   0   0   0   0
0   1   0   1   1
1   0   1   0   1
1   1   0   0   0

Here the net input to the output neuron is calculated as y_in = z1 v1 + z2 v2. Assume both weights to be excitatory, i.e., v1 = v2 = 1. Then the net inputs are:

(0, 0): y_in = 0 x 1 + 0 x 1 = 0
(0, 1): y_in = 0 x 1 + 1 x 1 = 1
(1, 0): y_in = 1 x 1 + 0 x 1 = 1
(1, 1): y_in = 0 x 1 + 0 x 1 = 0

By setting a threshold of 1 for the output neuron (y = 1 if y_in >= 1), it is possible to get the desired output for all input patterns. Thus, the XOR function is implemented using the weights w11 = 1, w21 = -1, w12 = -1, w22 = 1 and v1 = v2 = 1, with a threshold of 1 for each neuron.
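The complete two-layer network derived in this problem can be sketched directly (Python; `fire` and `xor` are illustrative names). Each neuron uses the weights fixed above and fires when its net input reaches the threshold of 1:

```python
# Two-layer McCulloch-Pitts network for XOR (Solved Problem 6.3):
# z1 = x1 AND NOT x2 with (w11, w21) = (1, -1)
# z2 = x2 AND NOT x1 with (w12, w22) = (-1, 1)
# y  = z1 OR z2      with (v1, v2)   = (1, 1)
# Every neuron fires when its net input >= 1.

def fire(net, theta=1):
    return 1 if net >= theta else 0

def xor(x1, x2):
    z1 = fire(x1 * 1 + x2 * -1)   # first hidden neuron
    z2 = fire(x1 * -1 + x2 * 1)   # second hidden neuron
    return fire(z1 * 1 + z2 * 1)  # output neuron (OR of z1, z2)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor(x1, x2))
# prints the XOR truth table: (0,1) and (1,0) give 1
```

Note that no single threshold on x1 w1 + x2 w2 can produce this table, which is why the hidden layer of z1 and z2 is needed.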
SUMMARY
1. An artificial neural network (ANN) is an information processing system that is inspired by biological nervous systems.
2. A neuron collects its inputs using a structure known as dendrites; the neuron effectively sums all of these inputs, and if the resulting value is greater than its firing threshold, the neuron fires.
3. A trained neural network can be thought of as an "expert" in the category of information it has been given to analyze.
4. ANNs have features such as adaptive learning, self-organization, real-time operation and fault tolerance, and they have various practical and commercial applications.
5. Learning in an ANN is achieved by manipulating the weight vector; the learning may be either supervised or unsupervised.
6. Basic logic operations such as AND, OR and NOT can be implemented using the McCulloch-Pitts neuron model with suitable weights and thresholds.
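As a sketch of the last summary point (Python; the helper name `mp` and the particular threshold values are chosen here for illustration), AND, OR and NOT each reduce to a single McCulloch-Pitts neuron with a suitable weight/threshold pair:

```python
# Basic logic operations as single McCulloch-Pitts neurons (binary inputs):
# the operation is fixed entirely by the weights and the threshold.

def mp(inputs, weights, theta):
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= theta else 0

AND = lambda x1, x2: mp((x1, x2), (1, 1), theta=2)   # fires only for (1, 1)
OR  = lambda x1, x2: mp((x1, x2), (1, 1), theta=1)   # fires unless (0, 0)
NOT = lambda x:      mp((x,),     (-1,),  theta=0)   # inhibitory weight

print([AND(a, b) for a, b in ((0, 0), (0, 1), (1, 0), (1, 1))])  # [0, 0, 0, 1]
print([OR(a, b)  for a, b in ((0, 0), (0, 1), (1, 0), (1, 1))])  # [0, 1, 1, 1]
print([NOT(a)    for a in (0, 1)])                               # [1, 0]
```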
VERY SHORT ANSWER QUESTIONS

Multiple-Choice Questions
1. Signal transmission at a synapse is a
(a) physical process
(b) chemical process
(c) both (a) and (b)
(d) none of the above
2. A neural signal is carried across a synapse by a
(a) transmitter
(b) receptor
(c) both (a) and (b)
(d) none of the above
3. Learning is a
(a) slow process
(b) fast process
(c) both (a) and (b)
(d) none of the above
4. Which of the following statements are true?
(i) On average, neural networks have higher computational rates than conventional computers.
(ii) Neural networks learn by example.
(iii) Neural networks replicate the way biological neurons operate.
(a) all of the above
(b) (i) and (ii) only
(c) (ii) and (iii) only
(d) none of the above
5. A perceptron is a
(a) single-layer feed-forward neural network
(b) multilayer feed-forward neural network
(c) feedback network
(d) none of the above
6. A neural network that contains feedback paths is known as a
(a) feed-forward network
(b) recurrent network
(c) both (a) and (b)
(d) none of the above
SHORT ANSWER QUESTIONS
1. What are the advantages of artificial neural networks over conventional computers?
2. What is supervised and unsupervised learning?
3. What are the requirements of learning rules?
4. What are linearly separable and linearly inseparable problems?
5. Why are artificial neural networks of interest to researchers?
6. What activations are used in artificial neural networks?

REVIEW QUESTIONS
1. Explain the working of a biological neuron with the help of a neat diagram.
2. What are the similarities between biological and artificial neurons?
3. What are the different categories of learning algorithms?
4. Demonstrate how the EXOR problem can be implemented using the McCulloch-Pitts neural network model.
5. Why do linearly inseparable problems such as EXOR require a network with two layers?

Answers to Multiple-Choice Questions
1. (d) 2. (b) 3. (a) 4. (a) 5. (d) 6. (c)