Soft Computing Perceptron Neural Network in MATLAB
08
AIM: To study and implement perceptron network for AND function using NN tool by
MATLAB.
THEORY:
Perceptron Networks
In machine learning, the perceptron is an algorithm for supervised learning of binary
classifiers (functions that can decide whether an input, represented by a vector of numbers,
belongs or not to some specific class). It is a type of linear classifier, i.e. a classification
algorithm that makes its predictions based on a linear predictor function combining a set of
weights with the feature vector. The algorithm allows for online learning, in that it processes
elements in the training set one at a time.
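The decision rule described above can be sketched in a few lines. This is an illustrative sketch in plain Python (not the MATLAB NN tool used in this experiment); the weights and bias chosen here are one hand-picked example that happens to realize the AND function.

```python
# Sketch of the linear predictor a perceptron uses: the output is
# decided by a weighted sum of the inputs plus a bias term.

def perceptron_output(x, w, b):
    """Return 1 if the net input w.x + b is positive, else 0."""
    net = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if net > 0 else 0

# Example: hand-picked weights that realize AND on binary inputs.
w = [1.0, 1.0]
b = -1.5
print(perceptron_output([1, 1], w, b))  # -> 1
print(perceptron_output([1, 0], w, b))  # -> 0
```

Because the prediction depends only on the sign of a linear function of the inputs, the perceptron is a linear classifier: it can separate only classes that a straight line (or hyperplane) can divide.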
The figure shows the architecture of a single-layer perceptron. Only the
associator unit and the response unit are shown; the sensory unit is hidden,
because only the weights between the associator and the response unit are
adjusted. The input layer consists of the inputs X1, ..., Xi, ..., Xn, together
with a common bias of 1. The input neurons are connected to the output neurons
through weighted interconnections. This is a single-layer network because it
has only one layer of interconnections, between the input and the output
neurons. The network perceives the received input signal and performs the
classification.
Algorithm:
To start the training process, the weights and the bias are initially set to
zero. Alternatively, the initial weights of the network can be obtained from
other techniques such as fuzzy systems, genetic algorithms, etc. It is also
essential to set the learning rate parameter, which lies between 0 and 1. An
input is then presented, and the net input is calculated by multiplying the
weights with the inputs and adding the bias to the result. The output is
compared with the target; if they differ, the weights are updated according to
the perceptron learning rule, otherwise the network training is stopped.
The training algorithm is as follows:
Step 1: Initialize the weights and bias (initially they may be zero). Set the
learning rate alpha (0 < alpha <= 1).
Step 2: While the stopping condition is false, do Steps 3-7.
Step 3: For each training pair s:t, do Steps 4-6.
Step 4: Set the activations of the input units:
xi = si for i = 1 to n
Step 5: Compute the output unit response. First the net input:
y_in = b + sum(xi * wi for i = 1 to n)
then the output y = f(y_in), where f is the step activation with threshold
theta: f(y_in) = 1 if y_in > theta, 0 if -theta <= y_in <= theta, and -1 if
y_in < -theta.
Step 6: Update the weights and bias if the target t is not equal to the output
response y:
wi(new) = wi(old) + alpha * t * xi
b(new) = b(old) + alpha * t
otherwise leave them unchanged. For example, if t = 1 but the computed output
is y = 0, then since t != y the weights are updated by this rule.
Step 7: Test the stopping condition: if no weight changed in Step 6 for any
training pair, stop; otherwise continue.
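The steps above can be sketched as a runnable program. This is an illustrative sketch in plain Python rather than the MATLAB NN tool; binary inputs, bipolar targets, learning rate alpha = 1, and threshold theta = 0 are assumptions made for this example.

```python
# Sketch of the perceptron training algorithm (Steps 1-7) applied to
# the AND function. Assumptions: binary inputs, bipolar targets (+1/-1),
# alpha = 1, theta = 0.

def activation(net, theta=0.0):
    """Step 5: threshold activation f(net) -> 1, 0, or -1."""
    if net > theta:
        return 1
    if net < -theta:
        return -1
    return 0

def train_perceptron(samples, alpha=1.0, theta=0.0, max_epochs=100):
    n = len(samples[0][0])
    w = [0.0] * n          # Step 1: weights start at zero
    b = 0.0                #         bias starts at zero
    for _ in range(max_epochs):          # Step 2: repeat until converged
        changed = False
        for x, t in samples:             # Step 3: each training pair s:t
            net = b + sum(wi * xi for wi, xi in zip(w, x))  # Steps 4-5
            y = activation(net, theta)
            if y != t:                   # Step 6: update on a mismatch
                w = [wi + alpha * t * xi for wi, xi in zip(w, x)]
                b = b + alpha * t
                changed = True
        if not changed:                  # Step 7: stop when no updates
            break
    return w, b

# AND function: output 1 only for input (1, 1); targets are bipolar.
and_samples = [([1, 1], 1), ([1, 0], -1), ([0, 1], -1), ([0, 0], -1)]
w, b = train_perceptron(and_samples)
for x, t in and_samples:
    y = activation(b + sum(wi * xi for wi, xi in zip(w, x)))
    print(x, '->', y)   # matches the target for every pair
```

Since the AND function is linearly separable, this loop is guaranteed to converge; running it, the network classifies all four input patterns correctly within a few epochs.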
CONCLUSION: Thus we studied and implemented a perceptron network for the AND
function using the NN tool in MATLAB.
OUTPUT SCREENSHOTS: