Lecture 2.1.3 INTRODUCTION TO ANN
Artificial neural networks (ANNs) draw much of their inspiration from the biological nervous system. It is
therefore very useful to have some knowledge of the way this system is organized.
Most living creatures that are able to adapt to a changing environment need a
controlling unit which is able to learn. More highly developed animals and humans use very
complex networks of highly specialized neurons to perform this task.
The control unit - or brain - can be divided into different anatomical and functional sub-units,
each having certain tasks such as vision, hearing, motor and sensory control. The brain is
connected by nerves to the sensors and actuators in the rest of the body.
The brain consists of a very large number of neurons, about 10^11 on average. These can be
seen as the basic building blocks of the central nervous system (CNS). The neurons are
interconnected at points called synapses. The complexity of the brain is due to the massive
number of highly interconnected simple units working in parallel, with an individual neuron
receiving input from up to 10,000 others.
The neuron contains all structures of an animal cell. The complexity of the structure and of
the processes in a simple cell is enormous. Even the most sophisticated neuron models in
artificial neural networks seem comparatively toy-like.
Structurally the neuron can be divided into three major parts: the cell body (soma), the
dendrites, and the axon; see Figure 1.1 for an illustration.
Figure 1.1: Simplified Biological Neurons.
Dendrites – In a biological neuron, the dendrites act as the input vector. These
dendrites allow the cell to receive signals from a large number (more than 1,000) of neighboring
neurons. As in the mathematical model of an artificial neuron, each dendrite is able to perform
a "multiplication" by that dendrite's "weight value." The multiplication is accomplished
by increasing or decreasing the ratio of synaptic neurotransmitters to signal chemicals
introduced into the dendrite in response to the synaptic neurotransmitter. A negative
multiplication effect can be achieved by transmitting signal inhibitors (i.e. oppositely
charged ions) along the dendrite in response to the reception of synaptic
neurotransmitters.
Soma – In a biological neuron, the soma acts as the summation function of the
mathematical model. As positive and negative signals (exciting and
inhibiting, respectively) arrive in the soma from the dendrites, the positive and
negative ions are effectively added in summation, simply by virtue of being mixed
together in the solution inside the cell body.
Axon – The axon gets its signal from the summation behavior which occurs inside the
soma. The opening to the axon essentially samples the electrical potential of the
solution inside the soma. Once the soma reaches a certain potential, the axon
transmits an all-or-nothing signal pulse down its length. In this regard, the axon
corresponds to the output connection that links an artificial neuron to other artificial neurons.
The cell body contains the organelles of the neuron, and the dendrites also originate
there. These are thin and widely branching fibers, reaching out in different directions to make
connections to a large number of cells within the cluster.
Input connections are made from the axons of other cells to the dendrites or directly to the
body of the cell. These are known as axodendritic and axosomatic synapses.
There is only one axon per neuron. It is a single, long fiber, which transports the output
signal of the cell as electrical impulses (action potentials) along its length. The end of the axon
may divide into many branches, which are then connected to other cells. These branches serve
to fan the signal out to many other inputs.
There are many different types of neuron cells found in the nervous system. The differences
are due to their location and function.
The neurons perform basically the following function: all the inputs to the cell, which may
vary in the strength of the connection or in the frequency of the incoming signal, are summed
up. The input sum is processed by a threshold function, which produces the output signal. The
processing time of about 1 ms per cycle and the transmission speed of the neurons of about 0.6 to
120 m/s are comparatively slow by the standards of a modern computer.
The brain works in both a parallel and a serial way. The parallel and serial nature of the brain is
readily apparent from the physical anatomy of the nervous system. That both serial and
parallel processing are involved can also be seen from the time needed to perform tasks. For
example, a human can recognize the picture of another person in about 100 ms. Given the
processing time of about 1 ms for an individual neuron, this implies that at most about
100 ms / 1 ms = 100 neurons can be involved in series; the complexity of the task, however, is
evidence of parallel processing, because such a difficult recognition task cannot be performed
by so small a number of neurons. This phenomenon is known as the 100-step rule.
Biological neural systems usually have a very high fault tolerance. Experiments with people
with brain injuries have shown that damage to neurons up to a certain level does not
necessarily impair the performance of the system, though tasks such as writing or speaking
may have to be learned again. This can be regarded as re-training the network.
Functions of BNN
The figure shows a simple artificial neural net with two input neurons (X1, X2) and one output
neuron (Y). The weights of the interconnections are given by W1 and W2.
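In the usual formulation (stated here for completeness; the figure itself is not reproduced), the net input to Y is the weighted sum X1·W1 + X2·W2, and the output of Y is obtained by applying an activation function to this net input.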
An Artificial Neuron
• The basic function of a neuron is to sum its inputs and to produce an output when that sum is
greater than a threshold. In particular, the neuron:
1. Multiplies each component of the input pattern by the weight of its connection
2. Sums all weighted inputs and subtracts the threshold value => total weighted input
3. Transforms the total weighted input into the output using the activation function
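As an illustration with arbitrarily chosen values: for the input pattern (1, 0), connection weights (0.5, 0.8) and a threshold of 0.3, the total weighted input is 1·0.5 + 0·0.8 − 0.3 = 0.2, and a binary step activation function transforms this into the output 1, because the total weighted input is positive.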
The neuron is the basic information-processing unit of a neural network. It consists of:
1. A set of links, describing the neuron inputs, with weights W1, W2, …, Wm.
2. An adder function (linear combiner) for computing the weighted sum of the (real-valued)
inputs:
u = W1X1 + W2X2 + … + WmXm
y = φ(u + b)
where φ is the activation function and b is the bias of the neuron.
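For concreteness, the following is a minimal Python sketch of the neuron just described, computing u as the weighted sum of the inputs and y = φ(u + b); the binary step activation and the particular weight, bias and input values used here are illustrative assumptions, not part of the lecture material.

def step(v):
    # Binary step activation: outputs 1 when its argument is non-negative, else 0.
    return 1 if v >= 0 else 0

def artificial_neuron(x, w, b, phi=step):
    # Adder / linear combiner: u = sum over j of w[j] * x[j]
    u = sum(wj * xj for wj, xj in zip(w, x))
    # Output: y = phi(u + b), where b is the bias
    return phi(u + b)

# Example with arbitrary illustrative values:
# u = 1.0*0.5 + 0.0*0.8 = 0.5, so y = step(0.5 - 0.3) = 1
print(artificial_neuron(x=[1.0, 0.0], w=[0.5, 0.8], b=-0.3))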
Functions of ANN
The artificial neuron receives inputs analogous to the electrochemical impulses that the dendrites
of biological neurons receive from other neurons.
- The artificial signals can be changed by weights in a manner similar to the physical changes
that occur in the synapses.
- The output of the artificial neuron corresponds to the signals a biological neuron sends out
over its axon.
- In connectionism, neurons are connected randomly or uniformly, and all neurons perform
the same computation. Each connection has a numerical weight associated with it. Each
neuron’s output is a single numerical activity, which is computed as a monotonic function of
the sum of the products of the activities of the input neurons with their corresponding
connection weights.
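As a small illustration of this connectionist view (a sketch using assumed values, not taken from the lecture), the Python fragment below computes the outputs of a fully connected layer in which every neuron performs the same computation: a monotonic function (here the logistic sigmoid) of the weighted sum of its inputs.

import math

def sigmoid(v):
    # Logistic sigmoid: a monotonic activation function.
    return 1.0 / (1.0 + math.exp(-v))

def layer_output(inputs, weights):
    # weights[i][j] is the weight of the connection from input j to neuron i;
    # every neuron applies the same computation to its own weighted sum.
    return [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in weights]

# Two input activities feeding three neurons (all values are illustrative).
print(layer_output([1.0, 0.5], [[0.2, -0.4], [0.7, 0.1], [-0.3, 0.9]]))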
ANN Definition
An Artificial Neural Network (ANN) is modeled on the brain where neurons are connected in
complex patterns to process data from the senses, establish memories and control the body.
An Artificial Neural Network (ANN) is a system based on the operation of biological neural
networks; it can also be defined as an emulation of a biological neural system.
Artificial Neural Networks are a part of Artificial Intelligence (AI), the area
of computer science concerned with making computers behave more intelligently.
ANNs process data and exhibit some intelligence, behaving intelligently in ways such as
pattern recognition, learning and generalization.
An artificial neural network is a programmed computational model that aims to replicate the
neural structure and functioning of the human brain.
Before studying Artificial Neural Networks in detail, we first need to understand what neural
networks are and to look at the structure of a neuron.
Structure of Neuron
Artificial Neural Networks are computational tools modeled after the brain. An ANN is
made up of an interconnected structure of artificially produced neurons that function as
pathways for data transfer. Researchers design artificial neural networks (ANNs) to
solve a variety of problems in pattern recognition, prediction, optimization, associative
memory, and control.
Artificial neural networks have been described as the second best way of forming interconnected
neurons. They are used to model brains and also to perform specific computational tasks;
character recognition, for example, is a typical successful ANN application.