Test 2 Lab Simulation: Perceptron Using OR Gate, Back Propagation AND Gate

1. The document describes a lab report on designing a perceptron neural network model to simulate an OR gate and a backpropagation model to simulate an AND gate using a sigmoid activation function.
2. The objectives are to apply formulas to determine outputs for the perceptron and backpropagation models and to run simulations in MATLAB.
3. The models are tested on various input patterns and the outputs are evaluated to verify the correct functioning of the OR and AND gates.


MECH633 - INTRODUCTION TO FUZZY/NEURAL SYSTEM

Test 2 lab simulation

Perceptron using OR gate

Back propagation AND gate

DONE BY:

NAME: Majd Mohammed Anas

BH: 17500440

SECTION: MQ

SUBMITTED TO:

Dr. Satheees
Introduction:

The aim of this lab report is to design a perceptron neural network model for the OR gate, and to design a back propagation model for the AND gate using the binary sigmoidal activation function.
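The binary sigmoidal activation function mentioned above maps any real input into the open interval (0, 1). As a minimal illustration (sketched in Python/NumPy rather than MATLAB, since the formula itself is language-independent):

```python
import numpy as np

def binary_sigmoid(x):
    """Binary sigmoid: f(x) = 1 / (1 + e^(-x)), output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# The function is 0.5 at the origin and saturates toward 0 and 1
print(binary_sigmoid(0.0))   # → 0.5
```

Large positive inputs drive the output toward 1 and large negative inputs toward 0, which is what lets the network's outputs be rounded to the binary gate values.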

Objective:

At the completion of this lab, the student will be able to:

1. Apply the formulas to determine the output using the Perceptron neural network and the Back propagation algorithm.

2. Run a neural network simulation using MATLAB and determine the outputs using various input parameters.

Equipment and Materials:

Computer with MATLAB software
Program codes:

Perceptron

x = [-1 -1 -1; -1 1 -1; 1 -1 -1; 1 1 1];   % training patterns: columns are x1, x2, target

[w, wb] = PERCEPTRON(x)

disp('Enter inputs');
x1 = input('Input x1 = ');
x2 = input('Input x2 = ');
R = ([x1 x2]*w' + wb) >= 1

function [w, wb] = PERCEPTRON(x)
% initialization of weights
m = size(x,1)        % number of training samples (rows)
n = size(x,2)        % number of columns (inputs plus target)
% disp('=====================')
w = zeros(1, n-1);
wb = 0;
for i = 1:m
    for j = 1:n-1
        % disp([' i= ' num2str(i) ' j= ' num2str(j)])
        % a = 0.33;
        % w(j) = w(j) + a*x(i,j)*x(i,n);   % update with learning rate a
        w(j) = w(j) + x(i,j)*x(i,n);       % weight update (learning rate 1)
    end
    % wb = wb + a*x(i,n);
    wb = wb + x(i,n);                      % bias update
    % disp([' w,b= ' num2str(w) ' ' num2str(wb)])
end
% disp('============')
end
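The training loop above reduces to a single Hebb-style pass over the bipolar patterns: each weight accumulates the product of an input and its target, w(j) = w(j) + x(i,j)*x(i,n), with a learning rate of 1. A Python/NumPy sketch of the same computation (for illustration only, mirroring the pattern matrix from the MATLAB script):

```python
import numpy as np

# Bipolar training patterns, as in the MATLAB script: columns x1, x2, target
x = np.array([[-1, -1, -1],
              [-1,  1, -1],
              [ 1, -1, -1],
              [ 1,  1,  1]])

w = np.zeros(2)
wb = 0.0
for row in x:
    w += row[:2] * row[2]   # w(j) = w(j) + x(i,j)*x(i,n)
    wb += row[2]            # wb = wb + x(i,n)

print(w, wb)   # → [2. 2.] -2.0, matching the MATLAB run below
```

Because the update is a plain sum of products, the final weights do not depend on the order in which the training rows are visited.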
Simulation output:

m =
     4

n =
     3

w =
     2     2

wb =
    -2

Enter inputs
Input x1 = -1
Input x2 = -1
R =
  logical
   0

Enter inputs
Input x1 = -1
Input x2 = 1
R =
  logical
   0

Enter inputs
Input x1 = 1
Input x2 = 1
R =
  logical
   1

Enter inputs
Input x1 = 1
Input x2 = -1
R =
  logical
   0
Back propagation

close all;
clear all;
clc;

p = [0 1 0 1; 0 0 1 1];       % input patterns (one per column)
t = [1 0 0 0];                % targets

net = feedforwardnet([4,3],'trainlm');   % multilayer feedforward network
net.trainParam.goal = 0.0001*var(t',1);
net.layers{1}.transferFcn = 'logsig'
net.layers{2}.transferFcn = 'tansig'
net.divideFcn = 'dividetrain';
net.trainParam.goal = 0.01;              % overrides the goal set above
net.trainParam.show = 1;
% net.trainParam.min_grad = 1e-20;
% net.trainParam.mu_max = 1e20;
net.trainParam.lr = 0.15;
net.trainParam.mc = 0.9;

net = train(net, p, t);                  % net - trained model
a = net(p)

save net net;

p1 = [0 1 0 1; 0 0 1 1];                 % test patterns
[m, n] = size(p1);
y = sim(net, p1);
y = round(y)
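The feedforwardnet/train calls hide the actual backpropagation updates inside the toolbox. Purely as an illustration, here is a from-scratch sketch in Python/NumPy on the same p/t patterns, using a single hidden layer of logistic sigmoid units and plain gradient descent (the learning rate, epoch count, and seed are arbitrary choices, not the Levenberg-Marquardt training the script uses):

```python
import numpy as np

rng = np.random.default_rng(0)

# Same patterns as the MATLAB script: inputs as columns, targets as a row
p = np.array([[0, 1, 0, 1],
              [0, 0, 1, 1]], dtype=float)
t = np.array([[1, 0, 0, 0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 units (the toolbox net above uses two layers, [4, 3])
W1 = rng.normal(0, 1, (4, 2)); b1 = np.zeros((4, 1))
W2 = rng.normal(0, 1, (1, 4)); b2 = np.zeros((1, 1))
lr = 0.5

for _ in range(20000):
    # forward pass
    h = sigmoid(W1 @ p + b1)
    y = sigmoid(W2 @ h + b2)
    # backward pass: mean-squared-error gradients through the sigmoids
    dy = (y - t) * y * (1 - y)
    dh = (W2.T @ dy) * h * (1 - h)
    W2 -= lr * dy @ h.T;  b2 -= lr * dy.sum(axis=1, keepdims=True)
    W1 -= lr * dh @ p.T;  b1 -= lr * dh.sum(axis=1, keepdims=True)

# Rounding the trained outputs should recover the target row t
print(np.round(sigmoid(W2 @ sigmoid(W1 @ p + b1) + b2)))
```

The rounding at the end plays the same role as `y = round(y)` in the MATLAB script: the sigmoid outputs are continuous, and thresholding them recovers the binary gate values.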
Conclusion:

Using the procedural steps, training and testing algorithms were created to test the OR gate with a perceptron model, and a back propagation network was used to test the AND gate, as required by the lab report. The simulation outputs confirm that the procedures work as intended.

