
11-Multi-layer Perceptron, Feed-forward Network, Feedback Network-05-08-2024

The document outlines key concepts in machine learning, specifically focusing on artificial neural networks (ANNs) including perceptrons, multi-layer perceptrons, and feedback networks such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs). It describes the architectures of these networks, their functionalities, and their applications in various fields such as image and sequence data processing. Additionally, it provides a comparative analysis of CNNs and RNNs, highlighting their differences in architecture, data types, use cases, and drawbacks.


BEEE410L MACHINE LEARNING

Dr.S.ALBERT ALEXANDER
SCHOOL OF ELECTRICAL ENGINEERING
[email protected]

Module 2
Artificial Neural Networks
❖ Perceptron Learning Algorithm
❖ Multi-layer Perceptron, Feed-forward Network, Feedback Network
❖ Backpropagation Algorithm
❖ Recurrent Neural Network (RNN)
❖ Convolutional Neural Network (CNN)

2.2 Architectures
❖ An ANN is a computational system consisting of many interconnected units called artificial neurons
❖ The connections between artificial neurons can transmit signals from one neuron to another
❖ There are multiple ways of connecting the neurons, and this choice determines the architecture adopted for a specific solution
Some permutations and combinations are as follows:
❖ There may be just two layers of neurons in the network – the input and the output layer
❖ There can be one or more intermediate 'hidden' layers of neurons
❖ The neurons may be connected with all neurons in the next layer
Single layer perceptron
❖ It is the simplest and most basic architecture of ANNs
❖ It consists of only two layers – the input layer and the output layer
❖ The input layer consists of 'm' input neurons connected to each of the 'n' output neurons
❖ The connections carry weights w11 and so on
❖ The input-layer neurons do not perform any processing – they simply pass the input signals to the output neurons
❖ The computations are performed in the output layer
❖ So, though it has two layers of neurons, only one layer performs the computation
❖ The network is therefore known as a SINGLE-layer network
Single layer perceptron
❖ The signals always flow from the input layer to the output layer
❖ The network is therefore known as FEED FORWARD
❖ The net signal input to output neuron k is given by:

$y_{in,k} = x_1 w_{1k} + x_2 w_{2k} + \dots + x_m w_{mk} = \sum_{i=1}^{m} x_i w_{ik}$
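A minimal NumPy sketch of this net-input computation (the sizes and random values here are illustrative, not from the slides):

```python
import numpy as np

m, n = 4, 3                    # m input neurons, n output neurons (example sizes)
x = np.random.rand(m)          # input signals x_1 .. x_m
W = np.random.rand(m, n)       # weight w_ik connects input neuron i to output neuron k

y_in = x @ W                   # net input y_in_k = sum_i x_i * w_ik, for each output k
print(y_in.shape)              # (3,) -- one net input per output neuron
```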
Multi layer perceptron
❖ The multi-layer feed-forward network is quite similar to the single-layer feed-forward network
❖ The difference is that there are one or more intermediate layers of neurons between the input and the output layer
❖ Hence, the network is termed multi-layer
❖ Each of the layers may have a varying number of neurons
❖ For example, the network shown in the next slide has 'm' neurons in the input layer, 'r' neurons in the output layer, and a single hidden layer with 'n' neurons
❖ The net signal input to neuron k in the output layer is given by:

$z_{in,k} = y_{out,1} w'_{1k} + y_{out,2} w'_{2k} + \dots + y_{out,n} w'_{nk} = \sum_{i=1}^{n} y_{out,i} w'_{ik}$
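As an illustration, a short NumPy sketch of the forward pass through such a network, assuming a sigmoid activation for the hidden and output layers (the activation function is not specified on this slide, and the sizes are arbitrary):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

m, n, r = 4, 5, 2              # example sizes: m inputs, n hidden neurons, r outputs
x = np.random.rand(m)          # input signals
W1 = np.random.rand(m, n)      # weights w_ik  (input -> hidden)
W2 = np.random.rand(n, r)      # weights w'_ik (hidden -> output)

y_in = x @ W1                  # net input to the hidden layer
y_out = sigmoid(y_in)          # hidden-layer outputs y_out_1 .. y_out_n
z_in = y_out @ W2              # net input z_in_k to each output neuron
z_out = sigmoid(z_in)          # network outputs
print(z_out.shape)             # (2,)
```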

Multi layer perceptron

(Figure: a multi-layer perceptron with 'm' input neurons, one hidden layer of 'n' neurons, and 'r' output neurons)
Feedback Network
❖ A feed-back network, such as a recurrent neural network (RNN), features feed-back paths, which allow signals to travel in both directions using loops
❖ Neuronal connections can be made in any way
❖ Since this kind of network contains loops, it behaves as a non-linear dynamic system that evolves continually during training until it reaches an equilibrium state
❖ In research, RNNs are the most prominent type of feed-back network
❖ They are artificial neural networks that form connections between nodes into a directed or undirected graph along a temporal sequence
❖ This allows them to display temporal dynamic behavior
❖ RNNs may process input sequences of different lengths by using their internal state, which can represent a form of memory
❖ They can therefore be used for applications like speech recognition or handwriting recognition
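A minimal sketch of the recurrence that gives an RNN this internal state (the tanh activation, the layer sizes, and the random values are illustrative assumptions, not from the slides):

```python
import numpy as np

input_size, hidden_size = 3, 5
W_xh = np.random.rand(input_size, hidden_size)    # input-to-hidden weights
W_hh = np.random.rand(hidden_size, hidden_size)   # hidden-to-hidden (feedback) weights

def rnn_step(x_t, h_prev):
    # The new hidden state depends on the current input and the previous state
    return np.tanh(x_t @ W_xh + h_prev @ W_hh)

sequence = [np.random.rand(input_size) for _ in range(7)]   # any sequence length works
h = np.zeros(hidden_size)
for x_t in sequence:
    h = rnn_step(x_t, h)        # the loop carries state (memory) across time steps
print(h.shape)                  # (5,)
```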

Comparative Analysis
| Description  | Convolutional Neural Networks (CNNs) | Recurrent Neural Networks (RNNs) |
|--------------|--------------------------------------|----------------------------------|
| Architecture | Feed-forward neural network | Feed-back neural network |
| Layout | Multiple layers of nodes, including convolutional layers | Information flows in different directions, simulating a memory effect |
| Data type | Image data | Sequence data |
| Input/Output | The size of the input and output is fixed (e.g. an input image of fixed size, with the classification as output) | The size of the input and output may vary (e.g. receiving different texts and generating different translations) |
| Use cases | Image classification, recognition, medical imagery, image analysis, face detection | Text translation, natural language processing, language translation, sentiment analysis |
| Drawbacks | Requires large training data | Slow and complex training procedures |
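As an illustrative contrast of the fixed versus variable input/output sizes in the table above, a short PyTorch sketch (PyTorch and the layer sizes are assumptions, not part of the slides):

```python
import torch
import torch.nn as nn

# A convolutional layer expects a fixed-size image tensor
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3)
image = torch.rand(1, 3, 32, 32)           # batch x channels x height x width
print(conv(image).shape)                   # torch.Size([1, 8, 30, 30])

# A recurrent layer can process sequences of different lengths with the same weights
rnn = nn.RNN(input_size=10, hidden_size=16, batch_first=True)
short_seq = torch.rand(1, 5, 10)           # sequence of length 5
long_seq = torch.rand(1, 20, 10)           # sequence of length 20 -- same layer handles both
out_short, _ = rnn(short_seq)
out_long, _ = rnn(long_seq)
print(out_short.shape, out_long.shape)     # torch.Size([1, 5, 16]) torch.Size([1, 20, 16])
```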
