AML TB3 CH6 Highlighted

The document discusses artificial neural networks and their similarities and differences compared to biological neurons. It aims to introduce students to the concept of neural networks and their use in solving classification problems. Students will learn about different types of artificial neurons and neural network architectures.

CHAPTER 6: ARTIFICIAL NEURAL NETWORKS

LEARNING OBJECTIVES

- To introduce the concept of artificial neural networks and compare the workings of biological neurons with artificial neurons.
- To introduce the McCulloch-Pitts neuron model and its use in solving basic logic problems.
- To introduce different types of artificial neurons and neural network architectures.
- To compare supervised and unsupervised learning.

OUTCOMES

After studying this chapter, students will be able to:

- compare biological neurons with artificial neurons;
- use the McCulloch-Pitts model to solve basic logic problems such as AND, OR and NOT;
- compare different types of artificial neurons and neural network architectures;
- use neural networks for solving classification problems.

6.1 INTRODUCTION

Modern digital computers can perform millions of operations in a matter of milliseconds. Humans cannot compete with computers in computation speed. However, even the most powerful computer cannot truly reproduce some of the most astounding capabilities possessed by human beings, such as storytelling and narration. This flexibility comes from our nervous system, which consists of millions of interconnected biological neurons.

An artificial neural network (ANN) is a mathematical model that tries to mimic the workings of the human brain, taking its inspiration from the biological nervous system. The hope is that, if the structure of the nervous system can be reproduced by mathematical means, the power of mathematics can be turned to good use for solving problems such as classification. The McCulloch-Pitts model, for example, can perform basic logic operations such as AND, OR and NOT, although this basic model cannot solve nonlinear problems such as EXOR.
Neural networks, with their remarkable ability to derive meaning from complicated or imprecise data, can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. Unlike conventional computers, which can perform only what they are explicitly programmed to do, neural networks learn from experience: they are trained with data rather than programmed, and they can work even with imprecise or incomplete data. Their processing is carried out by many simple interconnected cells operating in parallel.

6.1.1 Advantages of Neural Networks

A trained neural network can be thought of as an "expert" in the category of information it has been given to analyze. The advantages of neural networks include the following:

1. Adaptive Learning: An ANN has the ability to learn how to perform tasks based on the data given for training or from initial experience.
2. Self-Organization: An ANN can create its own organization or representation of the information it receives during learning.
3. Fault Tolerance: Partial destruction of a network leads to degradation of performance, but some network capabilities are retained even after major damage.
4. Real-Time Operation: ANN computations can be carried out in parallel, and special hardware devices have been designed and manufactured to take advantage of this capability.

6.1.2 Evolution of Neural Networks

The study of the nervous system dates back to the 1870s, when the first theories representing its structure were proposed. Table 6.1 gives the evolution of neural networks, from the early theories of the biological nervous system to modern deep networks.

Table 6.1 Evolution of neural networks

Year | Name | Inventor | Features
1871-73 | Reticular theory | Joseph von Gerlach | Proposed that the nervous system is a single continuous network
1888-91 | Neuron doctrine | Santiago Ramon y Cajal | Using Golgi's staining technique, showed that the nervous system consists of discrete individual cells (neurons)
1891 | Neuron | Wilhelm von Waldeyer-Hartz | Coined the term "neuron"; consolidated the neuron doctrine
1950 | Acceptance of the neuron doctrine | - | Synaptic connections between individual nerve cells were visualized using the electron microscope, and the neuron doctrine became widely accepted
1943 | McCulloch-Pitts neuron | McCulloch and Pitts | A highly simplified mathematical model of the biological neuron, capable of basic logic operations
1957-58 | Perceptron | Frank Rosenblatt | A single-layer neural network whose weights can be learned from training data
Table 6.1 Evolution of neural networks (Continued)

Year | Name | Inventor | Features
1960-70 | Backpropagation theory | - | The theory of training multilayer networks by propagating errors backward was developed
1965-68 | Multilayer perceptron | Ivakhnenko et al. | Networks with hidden layers between input and output; an early deep learner
1986 | Backpropagation | Rumelhart et al. | Backpropagation became the most popular method for training multilayer networks
2006 | Deep networks | Hinton and Salakhutdinov | Deep networks trained using unsupervised pre-training of the hidden layers

6.2 BIOLOGICAL NEURON

Our nervous system is a network of specialized nerve cells called neurons. The human brain contains about one hundred billion (100,000,000,000) neurons, connected to one another through a dense mesh of junctions called synapses. Figure 6.1 shows the structure of a biological neuron. A neuron receives information from other neurons through its dendrites, and the cell body, which contains the nucleus, sums up the collected inputs. When the total input exceeds a certain threshold, the neuron fires and sends an electrical impulse along its axon to other neurons; otherwise, firing is inhibited. The axon ends in small structures called boutons, which form synaptic junctions with the dendrites of other neurons; a single neuron typically has about 1000 boutons. The effectiveness of these synapses changes as we learn, so learning in the brain takes place by changing the effectiveness of the synaptic connections.

Figure 6.1 Structure of a biological neuron.

Neurons operate in parallel. Although an individual neuron is slow compared to the switching elements of a digital computer, the massive parallelism and dense interconnections of the brain together give it a computing power that computers fail to emulate for many tasks.

6.3 ARTIFICIAL NEURON

Artificial neurons are simply idealizations of biological neurons: they capture only some gross features of real neurons, chiefly the summing of weighted inputs and the comparison of the sum with a threshold. Figure 6.2 shows the model of an artificial neuron, which collects its inputs, computes their weighted sum, and produces an output through a threshold (activation) function.

Figure 6.2 Model of an artificial neuron.
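The behavior of the idealized artificial neuron described above, a weighted summation followed by a threshold, can be sketched in a few lines of Python (the function name and the example weights and threshold below are illustrative choices, not values prescribed by the text):

```python
def artificial_neuron(inputs, weights, threshold):
    """Weighted summation followed by a hard threshold: the neuron
    'fires' (outputs 1) only when the total input reaches the
    threshold; otherwise firing is inhibited (output 0)."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

# Example with two excitatory (positive-weight) inputs
print(artificial_neuron([1, 1], [1.0, 1.0], threshold=2.0))  # fires -> 1
print(artificial_neuron([1, 0], [1.0, 1.0], threshold=2.0))  # inhibited -> 0
```

With both inputs active the weighted sum reaches the threshold and the neuron fires; with only one input active it stays inhibited, mirroring the biological description above.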
Artificial neural networks do not come close to duplicating the brain; the biological metaphor is loose, and ANNs mimic only some gross features of the behavior of biological networks. The computing speeds also differ: electronic elements respond almost instantaneously, while biological neurons are comparatively slow, yet the brain compensates with its vast parallelism. Even as limited idealizations, however, ANNs have been successful in a wide variety of practical applications, and a wide variety of network architectures has been developed for different classes of problems.

6.4 BASICS OF ARTIFICIAL NEURAL NETWORKS

The models of ANN are specified by three basic entities, namely:

1. the model's synaptic interconnections;
2. the training or learning rules adopted for adjusting the connection weights;
3. their activation functions.

6.4.1 Network Architecture

The arrangement of neurons to form layers, and the connection pattern formed within and between layers, is called the network architecture. The neurons, also called nodes or processing elements, of one layer are linked to those of adjacent layers through weighted connections. Based on the architecture, neural networks can be classified as:

1. single-layer feed-forward networks;
2. multilayer feed-forward networks;
3. feedback networks;
4. recurrent networks.

6.4.1.1 Single-Layer Feed-Forward Network

A single-layer feed-forward network is formed when an input layer is connected directly to an output layer (Fig. 6.3). The network is called feed-forward because the signal flows only in the forward direction, from the input layer toward the output layer; no feedback connections are allowed. The input layer simply buffers the input signal, and only the output layer performs computation, which is why such a network is called single-layer.

6.4.1.2 Multilayer Feed-Forward Network

A multilayer feed-forward network contains one or more layers between the input layer and the output layer, called intermediate or hidden layers (Fig. 6.4).
A layer that is formed between the input and output layers is called a hidden layer. The hidden layer does not have direct contact with the external environment. There can be zero to several hidden layers in a network. The more the number of hidden layers, the more complex the network: this may improve the efficiency of the network, but it requires more time to train. In a fully connected network, every output from one layer is connected to each and every node in the next layer. In a feed-forward network, no neuron of the output layer is connected as an input to a node in the same layer or in any of the preceding layers. This is illustrated in Fig. 6.4.

Figure 6.3 Single-layer feed-forward network (input neurons connected to output neurons through the weights w11, w12, ..., wnm).

Figure 6.4 Multilayer feed-forward network (input layer, hidden layers and output layer).

6.4.1.3 Feedback Network

When outputs are directed back as inputs to nodes in the same layer or in a preceding layer, the result is a feedback network. When the output of a layer is directed back to the input of the same layer, it is called lateral feedback.
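A forward pass through a fully connected multilayer feed-forward network of the kind shown in Fig. 6.4 can be sketched as follows; the layer sizes, the particular weight values and the use of a hard threshold activation are illustrative assumptions, not taken from the text:

```python
def forward(x, layers, threshold=0.0):
    """Propagate an input vector through fully connected layers.
    `layers` is a list of weight matrices; layers[k][j][i] is the
    weight from node i of one layer to node j of the next layer.
    Each node applies a hard threshold activation."""
    signal = x
    for weights in layers:
        signal = [
            1 if sum(w * s for w, s in zip(row, signal)) >= threshold else 0
            for row in weights
        ]
    return signal

# Illustrative 2-2-1 network: one hidden layer of two nodes, one output node
hidden = [[1.0, -1.0], [-1.0, 1.0]]  # 2 hidden nodes, 2 inputs each
output = [[1.0, 1.0]]                # 1 output node, 2 hidden inputs
print(forward([1, 0], [hidden, output], threshold=1.0))  # -> [1]
```

Because the signal only ever moves from one layer to the next, the loop body never revisits an earlier layer, which is exactly the feed-forward restriction described above.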
6.4.1.4 Recurrent Network

Recurrent networks are feedback networks with a closed loop. In a recurrent network, the output of a processing element can be directed back as an input to the same processing element, to another processing element, or to both. Recurrent networks can be single-layer or multilayer. In a single-layer recurrent network, the output of each processing element is directed back to itself or to the other processing elements of the same layer. In a multilayer recurrent network, the output of a processing element can also be directed back to nodes in a preceding layer. A single-layer recurrent network is shown in Fig. 6.5.

Figure 6.5 Recurrent network.

6.4.2 Learning

The learning process in a neural network can be classified into three categories: supervised learning, unsupervised learning and reinforcement learning.

6.4.2.1 Supervised Learning

In supervised learning, learning takes place with the help of a teacher. Consider the example of a child learning to sing a song. The child does not know how to sing, so he/she first listens to a singer who knows the song. The child then tries to sing, checks how his/her singing differs from the original, and tries again and again until he/she is able to produce the song in the proper manner.

Supervised learning in a neural network proceeds in the same way. Each input vector used for training is associated with a desired (target) output vector, and an input vector together with its target is called a training pair. During training, an input vector is presented to the network, which produces an actual output vector. The actual output is compared with the desired output, and the difference between the two is the error signal. Based on the error signal, the network parameters (weights) are adjusted until the actual output matches the desired output. The block diagram of supervised learning is shown in Fig. 6.6.
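The supervised loop described above, present an input, compare the actual output Y with the desired output D, and use the error signal (D - Y) to adjust the weights, can be sketched with a simple perceptron-style update. The learning rate, the fixed firing threshold and the update rule itself are illustrative assumptions, not rules prescribed by the text:

```python
def train_supervised(samples, weights, rate=0.1, epochs=20):
    """samples: list of (input_vector, desired_output) training pairs.
    A single threshold neuron; each weight is nudged in proportion
    to the error signal D - Y and the corresponding input."""
    for _ in range(epochs):
        for x, desired in samples:
            net = sum(w * xi for w, xi in zip(weights, x))
            actual = 1 if net >= 1.0 else 0
            error = desired - actual          # the error signal (D - Y)
            weights = [w + rate * error * xi for w, xi in zip(weights, x)]
    return weights

# Learn the OR function from its four training pairs
pairs = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w = train_supervised(pairs, [0.0, 0.0])
outputs = [1 if sum(wi * xi for wi, xi in zip(w, x)) >= 1.0 else 0
           for x, _ in pairs]
print(outputs)  # matches the desired OR outputs: [0, 1, 1, 1]
```

After training, the actual outputs agree with the desired outputs for every training pair, which is the stopping condition described in the text.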
Figure 6.6 Block diagram of supervised learning (input X, neural network, actual output Y, desired output D, and the error signal D - Y produced by the error signal generator).

6.4.2.2 Unsupervised Learning

Just as the name suggests, in unsupervised learning there is no teacher, and the learning process is independent. Just as a fish learns to swim by itself, without being taught by anyone, the network must discover the structure of the data on its own. In unsupervised learning, inputs of similar type are grouped together to form clusters. When a new input pattern is applied, the neural network gives an output response indicating the class (cluster) to which the input pattern belongs. If a cluster does not exist for the new input pattern, a new cluster is formed. There is no feedback from the environment indicating what the output should be or whether it is correct; the network itself must discover patterns and features from the input data and develop its own representation by changing its parameters. This process of extracting regularities from the data without a teacher is termed self-organization. The block diagram of unsupervised learning is shown in Fig. 6.7.

Figure 6.7 Block diagram of unsupervised learning.

6.4.2.3 Reinforcement Learning

Reinforcement learning is similar to supervised learning. In supervised learning, the exact desired output is known for each input pattern; in reinforcement learning, less information is available. Instead of the exact target, the network receives only critic information: a feedback signal from the environment indicating whether the computed output is right or wrong, without saying what the correct output is. This feedback is termed the reinforcement signal, and learning based on such critic information is termed reinforcement learning. The network uses the reinforcement signal to adjust its parameters so that better feedback is obtained in the future. Reinforcement learning is shown in Fig. 6.8. However, obtaining the exact correct output for each input may not be possible in this training process, so the network must improve from the critic information alone.

Figure 6.8 Block diagram of reinforcement learning (the error signal generator receives a reinforcement signal from the environment).
6.5 ACTIVATION FUNCTIONS

The processing node (neuron) performs two operations. First, it computes the weighted summation of its inputs:

net = sum(i = 1 to n) w_i x_i

that is, net is the scalar product of the weight vector and the input vector. Subsequently, the node performs the nonlinear operation f(net) through its activation function to obtain the output. Activation functions are of two broad types, binary (hard-limiting) and continuous (soft-limiting), and each comes in a bipolar and a unipolar form.

6.5.1 Bipolar Activation Functions

The bipolar continuous (sigmoid) activation function is defined as

f(net) = 2 / (1 + exp(-lambda * net)) - 1    (6.2)

where lambda > 0 is the gain, determining the steepness of the continuous function near net = 0. The bipolar binary function is defined as

f(net) = sgn(net) = +1 if net > 0; -1 if net < 0    (6.3)

The bipolar continuous function takes values in the interval (-1, +1). Typical bipolar activation functions, for different values of the gain lambda (e.g., lambda = 1, 2, 4, 8, 16), are shown in Fig. 6.9.

Figure 6.9 Bipolar activation functions for different values of the gain lambda.
6.5.2 Unipolar Activation Functions

By shifting and scaling the bipolar activation functions, unipolar continuous and unipolar binary functions can be obtained. The unipolar continuous function is defined as

f(net) = 1 / (1 + exp(-lambda * net))    (6.4)

Again, lambda > 0 is the gain. Notice that as lambda tends to infinity, the unipolar continuous function becomes the unipolar binary function, defined as

f(net) = 1 if net > 0; 0 if net < 0    (6.5)

Note that the word "bipolar" points out that both positive and negative responses of neurons are produced with that definition of the activation function, whereas the unipolar functions take only non-negative values.

6.5.3 Identity Function

The identity function is a linear function defined as

f(x) = x for all x

The output remains the same as the input. The input layer of a neural network uses the identity activation function. The identity function is shown in Fig. 6.10.

Figure 6.10 Identity function.

6.5.4 Ramp Function

The ramp function is defined as

f(x) = 1 if x > 1; x if 0 <= x <= 1; 0 if x < 0    (6.6)

The ramp function is shown in Fig. 6.11.

Figure 6.11 Ramp function.
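The activation functions of Eqs (6.2)-(6.6) can be written directly in Python; the spot-checks in the comments follow from the definitions:

```python
import math

def bipolar_continuous(net, lam=1.0):
    """Bipolar sigmoid, Eq. (6.2): values in (-1, +1)."""
    return 2.0 / (1.0 + math.exp(-lam * net)) - 1.0

def bipolar_binary(net):
    """Bipolar binary (sign) function, Eq. (6.3)."""
    return 1 if net > 0 else -1

def unipolar_continuous(net, lam=1.0):
    """Unipolar sigmoid, Eq. (6.4): values in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-lam * net))

def unipolar_binary(net):
    """Unipolar binary function, Eq. (6.5)."""
    return 1 if net > 0 else 0

def identity(x):
    """Identity function: used by the input layer."""
    return x

def ramp(x):
    """Ramp function, Eq. (6.6): clips x to the interval [0, 1]."""
    return min(1.0, max(0.0, x))

print(bipolar_continuous(0.0))   # 0.0 at net = 0
print(unipolar_continuous(0.0))  # 0.5 at net = 0
print(ramp(2.5), ramp(0.25), ramp(-1.0))  # 1.0 0.25 0.0
```

Increasing the gain `lam` makes the two sigmoids steeper, so they approach the corresponding binary functions, as noted in the text.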
6.6 McCULLOCH-PITTS NEURON MODEL

The first formal definition of a synthetic neuron model, based on the highly simplified biological neuron, was formulated by McCulloch and Pitts (1943). The McCulloch-Pitts model of the neuron is shown in Fig. 6.12. The inputs x_i, for i = 1, 2, ..., n, are 0 or 1, depending on the absence or presence of the input impulse at instant k. The neuron's output signal is denoted as o. The firing rule for this model is defined as follows:

o at instant (k + 1) = 1 if sum(i = 1 to n) w_i x_i at instant k >= T
                       0 if sum(i = 1 to n) w_i x_i at instant k < T    (6.7)

where k = 0, 1, 2, ... denotes the discrete-time instant, and w_i is the multiplicative weight connecting the ith input with the neuron's membrane. Note that a unity delay elapses between the instants k and k + 1. T is the neuron's threshold value, which needs to be exceeded by the weighted sum of its input signals for the neuron to fire. In this model, w_i = +1 for excitatory synapses and w_i = -1 for inhibitory synapses; the neuron acts as a basic linear threshold unit (Fig. 6.13).

Although this model is very simplistic, it has substantial computing potential. It can perform the basic logic operations NOT, OR and AND, provided its weights and threshold are appropriately selected. For example, the NOR gate can be realized with the McCulloch-Pitts model as shown in Fig. 6.14.

Figure 6.12 McCulloch-Pitts neuron model.

Figure 6.13 Basic linear threshold function.

Figure 6.14 McCulloch-Pitts model for the NOR gate.
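The firing rule of Eq. (6.7) translates directly into code. A minimal sketch (the function name is ours, and the NOT-gate weight and threshold below are one standard choice, assumed for illustration):

```python
def mcculloch_pitts(x, w, T):
    """McCulloch-Pitts firing rule, Eq. (6.7): output 1 when the
    weighted sum of the binary inputs reaches the threshold T.
    Weights are +1 (excitatory) or -1 (inhibitory)."""
    net = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if net >= T else 0

# NOT gate: a single inhibitory input with threshold T = 0
for x in (0, 1):
    print(x, "->", mcculloch_pitts([x], [-1], T=0))  # prints 0 -> 1, then 1 -> 0
```

With the inhibitory weight, an active input drives the sum below the threshold and suppresses firing, which is exactly the NOT operation.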
6.6.1 Solved Problems

Solved Problem 6.1
Implement the AND function using a McCulloch-Pitts neuron (use the binary data representation).

Solution: The truth table for the AND function is:

x1 x2 y
0  0  0
0  1  0
1  0  0
1  1  1

As already mentioned, in the McCulloch-Pitts model only analysis can be performed, not learning. Hence, we need to assume the weights ourselves; if the assumed weights turn out not to be suitable, we try other values. Let us assume both weights as excitatory, i.e., w1 = w2 = 1. The network architecture is shown in Fig. 6.15.

With these assumed weights, the net input y_in = x1 w1 + x2 w2 is calculated for all four input variations:

(0, 0): y_in = 0 x 1 + 0 x 1 = 0
(0, 1): y_in = 0 x 1 + 1 x 1 = 1
(1, 0): y_in = 1 x 1 + 0 x 1 = 1
(1, 1): y_in = 1 x 1 + 1 x 1 = 2

For the AND function, the output is true only for the input (1, 1). Hence, by fixing the threshold value at 2, i.e.,

y = f(y_in) = 1 if y_in >= 2; 0 if y_in < 2

the neuron fires for the input (1, 1) only. Thus, the AND function is implemented with w1 = w2 = 1 and threshold T = 2.

Figure 6.15 McCulloch-Pitts neuron for the AND function.

Solved Problem 6.2
Implement the ANDNOT function using a McCulloch-Pitts neuron (use the binary data representation).

Solution: The ANDNOT function y = x1 AND (NOT x2) is true if the first input is true and the second input is false. The truth table is:

x1 x2 y
0  0  0
0  1  0
1  0  1
1  1  0

Case 1: Assume both weights as excitatory, i.e., w1 = w2 = 1. The net inputs are calculated as follows:

(0, 0): y_in = 0 x 1 + 0 x 1 = 0
(0, 1): y_in = 0 x 1 + 1 x 1 = 1
(1, 0): y_in = 1 x 1 + 0 x 1 = 1
(1, 1): y_in = 1 x 1 + 1 x 1 = 2

With these weights it is not possible to fire the neuron for the input (1, 0) only. Hence, these weights are not suitable.

Case 2: Assume one weight as excitatory and the other as inhibitory, i.e., w1 = 1 and w2 = -1. The net inputs are now:

(0, 0): y_in = 0 x 1 + 0 x (-1) = 0
(0, 1): y_in = 0 x 1 + 1 x (-1) = -1
(1, 0): y_in = 1 x 1 + 0 x (-1) = 1
(1, 1): y_in = 1 x 1 + 1 x (-1) = 0

By fixing the threshold at 1, i.e., y = 1 if y_in >= 1 and y = 0 otherwise, the neuron fires for the input (1, 0) only. Thus, the ANDNOT function is implemented with w1 = 1, w2 = -1 and threshold T = 1.

Solved Problem 6.3
Implement the XOR function using McCulloch-Pitts neurons.

Solution: The truth table for the XOR function is:

x1 x2 y
0  0  0
0  1  1
1  0  1
1  1  0

The XOR function cannot be represented by a simple and single logic function. It can be written as

y = z1 + z2 = x1 (NOT x2) + (NOT x1) x2

where z1 = x1 (NOT x2) is the first function, z2 = (NOT x1) x2 is the second function, and y = z1 OR z2 is the third function. A single-layer network is not sufficient to represent the XOR function: an intermediate layer is necessary.
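The weight and threshold choices derived in Solved Problems 6.1 and 6.2 can be checked exhaustively in code (the firing rule of Eq. (6.7) is rewritten inline here as a helper called `fire`):

```python
def fire(x, w, T):
    """McCulloch-Pitts neuron: fires (1) when the weighted sum reaches T."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= T else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Solved Problem 6.1: AND with w1 = w2 = 1 and threshold T = 2
print([fire(x, (1, 1), T=2) for x in inputs])    # [0, 0, 0, 1]

# Solved Problem 6.2: ANDNOT with w1 = 1, w2 = -1 and threshold T = 1
print([fire(x, (1, -1), T=1) for x in inputs])   # [0, 0, 1, 0]
```

Both printed lists reproduce the y columns of the corresponding truth tables, confirming the weight assumptions made in the solutions.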
First function (z1 = x1 (NOT x2)): The truth table for z1 is:

x1 x2 z1
0  0  0
0  1  0
1  0  1
1  1  0

Case 1: Assume both weights as excitatory, i.e., w11 = w21 = 1. We calculate the net inputs:

(0, 0): z1_in = 0 x 1 + 0 x 1 = 0
(0, 1): z1_in = 0 x 1 + 1 x 1 = 1
(1, 0): z1_in = 1 x 1 + 0 x 1 = 1
(1, 1): z1_in = 1 x 1 + 1 x 1 = 2

It is not possible to fire the neuron for the input (1, 0) only. Hence, these weights are not suitable.

Case 2: Assume one weight as excitatory and the other as inhibitory, i.e., w11 = 1 and w21 = -1. We calculate the net inputs:

(0, 0): z1_in = 0 x 1 + 0 x (-1) = 0
(0, 1): z1_in = 0 x 1 + 1 x (-1) = -1
(1, 0): z1_in = 1 x 1 + 0 x (-1) = 1
(1, 1): z1_in = 1 x 1 + 1 x (-1) = 0

It is possible to get the desired output for the z1 neuron based on this net input, with threshold T >= 1. Thus, w11 = 1 and w21 = -1.

Second function (z2 = (NOT x1) x2): The truth table for z2 is:

x1 x2 z2
0  0  0
0  1  1
1  0  0
1  1  0

Case 1: Assume both weights as excitatory, i.e., w12 = w22 = 1. We calculate the net inputs:

(0, 0): z2_in = 0 x 1 + 0 x 1 = 0
(0, 1): z2_in = 0 x 1 + 1 x 1 = 1
(1, 0): z2_in = 1 x 1 + 0 x 1 = 1
(1, 1): z2_in = 1 x 1 + 1 x 1 = 2

It is not possible to fire the neuron for the input (0, 1) only.

Case 2: Assume one weight as excitatory and the other as inhibitory, i.e., w12 = 1 and w22 = -1. We calculate the net inputs:

(0, 0): z2_in = 0 x 1 + 0 x (-1) = 0
(0, 1): z2_in = 0 x 1 + 1 x (-1) = -1
(1, 0): z2_in = 1 x 1 + 0 x (-1) = 1
(1, 1): z2_in = 1 x 1 + 1 x (-1) = 0

With these weights the neuron would fire for the input (1, 0), not (0, 1). Hence, these weights are also not suitable.

Case 3: Assume one weight as inhibitory and the other as excitatory, i.e., w12 = -1 and w22 = 1. We calculate the net inputs:

(0, 0): z2_in = 0 x (-1) + 0 x 1 = 0
(0, 1): z2_in = 0 x (-1) + 1 x 1 = 1
(1, 0): z2_in = 1 x (-1) + 0 x 1 = -1
(1, 1): z2_in = 1 x (-1) + 1 x 1 = 0

It is possible to get the desired output for the z2 neuron based on this net input, with threshold T >= 1. Thus, w12 = -1 and w22 = 1.
Third function (y = z1 OR z2): The truth table for y in terms of z1 and z2 is:

z1 z2 y
0  0  0
0  1  1
1  0  1

(The combination z1 = z2 = 1 never occurs, since z1 and z2 cannot be true simultaneously.) Here, assume both weights as excitatory, i.e., v1 = v2 = 1. The net inputs are calculated as follows:

(0, 0): y_in = 0 x 1 + 0 x 1 = 0
(0, 1): y_in = 0 x 1 + 1 x 1 = 1
(1, 0): y_in = 1 x 1 + 0 x 1 = 1

By setting the threshold to 1 for the y neuron, i.e., y = 1 if y_in >= 1, the desired output is obtained. Thus, the XOR function is implemented with the hidden-layer weights w11 = 1, w21 = -1, w12 = -1, w22 = 1 and the output-layer weights v1 = v2 = 1.

Summary

An artificial neural network (ANN) is an information-processing system inspired by the biological nervous system. A trained neural network can be thought of as an "expert" in the category of information it has been given to analyze.
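The two-layer network constructed in Solved Problem 6.3, with z1 and z2 in an intermediate layer and y as the OR of the two, can be verified exhaustively in code:

```python
def fire(x, w, T):
    """McCulloch-Pitts neuron: fires (1) when the weighted sum reaches T."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= T else 0

def xor_network(x1, x2):
    """Two-layer McCulloch-Pitts network from Solved Problem 6.3."""
    z1 = fire((x1, x2), (1, -1), T=1)   # z1 = x1 AND (NOT x2)
    z2 = fire((x1, x2), (-1, 1), T=1)   # z2 = (NOT x1) AND x2
    return fire((z1, z2), (1, 1), T=1)  # y = z1 OR z2

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", xor_network(x1, x2))  # reproduces the XOR truth table
```

No choice of weights and threshold for a single neuron can reproduce this table, which is why the intermediate layer of the solved problem is necessary.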
Multiple-Choice Questions

1. Which of the following is a feature of an artificial neural network?
(a) Adaptive learning (b) Self-organization (c) Fault tolerance and real-time operation (d) All of the above

2. Signal transmission at a synapse is a:
(a) Physical process (b) Chemical process (c) Both (a) and (b) (d) None of the above

3. A dendrite acts as a:
(a) Receptor (b) Transmitter (c) Both (a) and (b) (d) None of the above

4. Learning in a biological neuron is a:
(a) Slow process (b) Fast process (c) Both (a) and (b) (d) None of the above

5. Which of the following statements is/are true?
(i) The EXOR problem is linearly inseparable and cannot be solved by a single-layer network.
(ii) A multilayer network can solve the EXOR problem.
(a) Only (i) (b) Only (ii) (c) Both (i) and (ii) (d) None of the above

6. Learning in a neural network is done by manipulating which parameter?
(a) Learning rate (b) Input vector (c) Weight vector (d) None of the above

Very Short Answer Questions

1. What is a neuron?
2. List some features of artificial neural networks.
3. List the various network architectures of an ANN.
4. What are the three categories of learning?
5. Which basic logic operations can the McCulloch-Pitts model perform?
6. What is a perceptron?

Short Answer Questions

1. What are the advantages of a neural network over a conventional computer?
2. What are activations? Why do neural networks require activation functions?
3. What are linearly separable and linearly inseparable problems? Give an example of each.
4. Why is the EXOR problem of interest to researchers?
5. Demonstrate the EXOR problem using the McCulloch-Pitts model. Why does the network require two layers?

Review Questions

1. Explain the working of a biological neuron with a neat diagram.
2. What are the similarities and differences between the biological neuron and artificial neuron models?
3. Explain the different categories of learning algorithms with the help of block diagrams.
4. Explain the working of an ANN with a neat diagram.
5. Explain why the EXOR function cannot be represented by a single-layer network.

Answers to Multiple-Choice Questions

1. (d) 2. (b) 3. (a) 4. (a) 5. (c) 6. (c)