Unit5-ANN
The parts of the biological neuron map onto the artificial neuron as follows:
Dendrites -> Inputs
Synapse -> Weights
Axon -> Output
There are around 1000 billion neurons in the human brain, and each neuron is
connected to somewhere between 1,000 and 100,000 others. In the human
brain, data is stored in a distributed manner, and more than one piece of it
can be retrieved from memory in parallel when necessary. The human brain
can therefore be regarded as an incredibly powerful parallel processor.
Hidden Layer:
The hidden layer lies between the input and output layers. It performs
all the calculations needed to find hidden features and patterns.
Output Layer:
The input goes through a series of transformations in the hidden layer,
which finally produces the result that is conveyed through this layer.
The artificial neural network takes the inputs, computes the weighted sum
of those inputs, and adds a bias. This computation is represented in the
form of a transfer function.
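As a minimal sketch of the computation described above, the following function computes the weighted sum of the inputs plus a bias; the input values, weights, and bias below are illustrative, not taken from the text:

```python
def weighted_sum(inputs, weights, bias):
    """Compute the weighted sum of the inputs, then add the bias term."""
    return sum(w * x for w, x in zip(weights, inputs)) + bias

# Hypothetical example: two inputs with assumed weights and bias
z = weighted_sum([0.5, 0.8], [0.4, 0.6], bias=0.1)
# z = 0.5*0.4 + 0.8*0.6 + 0.1 = 0.78
```

The value z is what the transfer (activation) function is then applied to.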
After training, an ANN may produce output even from incomplete data; the
loss of performance depends on the significance of the missing data.
Likewise, the failure of one or more cells of an ANN does not prevent it
from generating output, and this feature makes the network fault-tolerant.
Dependence on numerical data:
ANNs can work only with numerical data, so problems must be converted into
numerical values before being presented to the ANN. The representation
mechanism chosen here directly impacts the performance of the network,
and it relies on the user's abilities.
If the weighted sum is equal to zero, a bias is added to make the output
non-zero, or otherwise to scale up the system's response. The bias has a
fixed input of 1 with its own weight. The total of the weighted inputs can
range from 0 to positive infinity, so to keep the response within the
limits of the desired value, a certain maximum value is benchmarked and
the total weighted input is passed through an activation function.
Binary:
In a binary activation function, the output is either a 1 or a 0. To
accomplish this, a threshold value is set up: if the net weighted input of
the neuron exceeds the threshold, the activation function returns 1;
otherwise it returns 0.
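The binary (step) activation just described can be sketched as follows; the threshold value here is a hypothetical parameter, not one fixed by the text:

```python
def binary_step(net_input, threshold=0.0):
    """Return 1 if the net weighted input exceeds the threshold, else 0."""
    return 1 if net_input > threshold else 0

# With an assumed threshold of 1.0:
binary_step(1.5, threshold=1.0)  # -> 1
binary_step(0.5, threshold=1.0)  # -> 0
```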
Sigmoidal Hyperbolic:
The sigmoidal hyperbolic function is generally seen as an "S"-shaped
curve. Here the tan hyperbolic (tanh) function is used to approximate the
output from the actual net input. The function is defined as:
F(x) = tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
Feedback ANN:
In this type of ANN, the output is fed back into the network to internally
achieve the best-evolved results. According to the University of
Massachusetts Lowell Center for Atmospheric Research, feedback networks
feed information back into themselves and are well suited to solving
optimization problems. Internal system error corrections utilize
feedback ANNs.
Feed-Forward ANN:
A feed-forward network is a basic neural network comprising an input
layer, an output layer, and at least one hidden layer of neurons. By
assessing its output against its input, the strength of the network can be
observed from the group behavior of the associated neurons, and the
output is decided. The primary advantage of this network is that it learns
to evaluate and recognize input patterns.
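The feed-forward pass described above can be sketched as one layer of hidden neurons (each a weighted sum plus bias, passed through tanh) feeding a single output neuron. All weights, biases, and layer sizes below are hypothetical, chosen only for illustration:

```python
import math

def forward(inputs, hidden_weights, hidden_biases, output_weights, output_bias):
    """One forward pass: inputs -> tanh hidden layer -> single linear output."""
    hidden = [
        math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
        for ws, b in zip(hidden_weights, hidden_biases)
    ]
    return sum(w * h for w, h in zip(output_weights, hidden)) + output_bias

# Hypothetical 2-input, 2-hidden-neuron network
y = forward([1.0, 0.5],
            hidden_weights=[[0.2, -0.1], [0.4, 0.3]],
            hidden_biases=[0.0, 0.1],
            output_weights=[0.5, -0.5],
            output_bias=0.2)
```

Information only flows forward here; contrast this with the feedback ANN above, where outputs are routed back into the network.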
Marketing and Sales: When you log onto e-commerce sites like Amazon and
Flipkart, they recommend products to buy based on your previous browsing
history. Similarly, if you love pasta, then Zomato, Swiggy, etc. will show
you restaurant recommendations based on your tastes and previous order
history. This is true across all new-age marketing segments like book
sites, movie services, hospitality sites, etc., and it is done through
personalized marketing, which uses artificial neural networks to identify
customer likes, dislikes, previous shopping history, etc., and then tailor
the marketing campaigns accordingly.