Summary
INTRODUCTION
BRAIN NEURON
The neuron receives signals from other neurons through its dendrites. When the combined strength of these signals exceeds a certain threshold, the neuron fires a signal of its own, which travels along the axon and is passed to the next neurons through synapses. The signals delivered through the synapses trigger those neurons in turn, and the process continues. A huge number of such neurons work simultaneously, and the brain has the capacity to store a large amount of data.
ARTIFICIAL NEURON
X0 ... Xn are the inputs to the neuron. A bias is also added to the neuron along with the inputs; usually the bias value is initialised to 1. W0 ... Wn are the weights. A weight represents the strength of a connection, and the product of an input and its weight gives the strength of that signal. A neuron receives multiple inputs from different sources and produces a single output.
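As a rough illustration of this description, the sketch below implements a single artificial neuron in Python with NumPy. The input values, weights, and bias are made-up example numbers, and the sigmoid activation discussed below is assumed.

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through the activation."""
    z = np.dot(weights, inputs) + bias   # strength of the combined signal
    return sigmoid(z)

# Example with made-up values: three inputs, three weights, bias fixed at 1.
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.4, 0.7, -0.2])
print(neuron_output(x, w, bias=1.0))    # single output between 0 and 1
```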
Various functions can be used for activation. Besides the sigmoid, commonly used functions include the step function, linear function, ramp function, and hyperbolic tangent function. The hyperbolic tangent is similar in shape to the sigmoid, but its limits are -1 to +1, unlike the sigmoid, whose range is 0 to 1.
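For reference, the two functions compared here can be written out as follows; these are the standard definitions rather than formulas taken from the text.

```latex
\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \sigma(x) \in (0, 1)

\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}} = 2\sigma(2x) - 1, \qquad \tanh(x) \in (-1, 1)
```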
The sum is the weighted sum of the inputs, each multiplied by the weight on the connection between one layer and the next. The activation function used here is the sigmoid, a continuous and differentiable approximation of a step function. An interconnection of many such individual neurons forms the neural network.
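Written out, the computation each neuron in one layer performs on the outputs of the previous layer takes the following form; the symbols a, w, b, and the layer index l are notation introduced here for illustration, not taken from the text.

```latex
a_j^{(l+1)} = \sigma\!\left( \sum_{i} w_{ij}^{(l)}\, a_i^{(l)} + b_j^{(l)} \right),
\qquad \sigma(z) = \frac{1}{1 + e^{-z}}
```

Here w_{ij}^{(l)} is the weight on the connection from neuron i of layer l to neuron j of layer l+1, and b_j^{(l)} is that neuron's bias.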
The ANN architecture comprises:
a. Input layer: receives the input values.
b. Hidden layer: a set of neurons between the input and output layers; there can be a single hidden layer or multiple ones.
c. Output layer: usually it has one neuron, whose output lies between 0 and 1 (greater than 0 and less than 1), but multiple outputs can also be present.
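A minimal sketch of this three-layer arrangement, again in Python with NumPy and with made-up weights, shows the input layer feeding a hidden layer that in turn feeds a single output neuron whose value stays between 0 and 1.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """One forward pass: input layer -> hidden layer -> single output neuron."""
    hidden = sigmoid(w_hidden @ x + b_hidden)   # activations of the hidden layer
    output = sigmoid(w_out @ hidden + b_out)    # single value in (0, 1)
    return output

# Made-up example: 3 inputs, 4 hidden neurons, 1 output neuron.
rng = np.random.default_rng(0)
x        = np.array([0.2, 0.9, -0.4])
w_hidden = rng.normal(size=(4, 3))
b_hidden = np.ones(4)                 # biases initialised to 1, as above
w_out    = rng.normal(size=(1, 4))
b_out    = np.ones(1)

print(forward(x, w_hidden, b_hidden, w_out, b_out))  # single output in (0, 1)
```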
a. Adaptive learning: An ANN, like the human brain, learns how to do a task from the data it is given. A normal program cannot adapt to other types of inputs.
b. Self-organization: An ANN can create its own organization of the information it receives while learning. A normal program is fixed for its task and will not do anything other than what it is intended to do.
c. Parallel operation: An ANN works in parallel, like the human brain. This is unlike a conventional computer program, which works serially.
d. Fault tolerance: One of the most interesting properties of neural networks is their ability to work even with incomplete, noisy, and fuzzy data. A normal program cannot handle incomplete or unclear data and will stop working once it encounters even a small error in the data.
e. In comparison with the human brain, an ANN is quite fast, since biological neurons process signals much more slowly.
f. In comparison with a normal program, the way an ANN arrives at its output is not transparent, and the time taken can vary between sets of inputs even when those inputs are similar.
g. An ANN can be used for data classification, pattern recognition, and applications where the data is unclear.
h. An ANN is not suitable when the nature of the input and output is exactly known and what needs to be done is clearly defined.