The document outlines various experiments implementing logic gates and neural network models using the McCulloch-Pitts neuron model and perceptrons. It includes code for AND, OR, and XOR functions, as well as K-means clustering and Radial Basis Function Neural Network implementations. Each section provides a clear structure for inputting weights, thresholds, and learning rates to achieve the desired outputs.


Exp-1 Implement logic AND using McCulloch-Pitts Neuron Model

Program to implement AND gate:

%AND function using McCulloch-Pitts neuron
clear; clc;
% Getting weights and threshold value
disp('Enter the weights');
w1=input('Weight w1=');
w2=input('Weight w2=');
disp('Enter threshold value');
theta=input('theta=');
y=[0 0 0 0];
x1=[1 1 0 0];
x2=[1 0 1 0];
z=[1 0 0 0];
con=1;
while con
    zin = x1*w1 + x2*w2;
    for i=1:4
        if zin(i)>=theta
            y(i)=1;
        else
            y(i)=0;
        end
    end
    disp('Output of net=');
    disp(y);
    if y==z
        con=0;
    else
        disp('Net is not learning. Enter another set of weights and threshold value');
        w1=input('Weight w1=');
        w2=input('Weight w2=');
        theta=input('theta=');
    end
end
disp('McCulloch Pitts Net for AND function');
disp('Weights of neuron');
disp(w1); disp(w2);
disp('Threshold value=');
disp(theta);

Output:
Enter the weights
Weight w1=1
Weight w2=1
Enter threshold value
theta=2
Output of net=
1 0 0 0
McCulloch Pitts Net for AND function
Weights of neuron
1
1
Threshold value=
2


Exp-2 Implement logic OR using McCulloch-Pitts Neuron Model

Code:

%OR function using McCulloch-Pitts neuron
clear; clc;
% Getting weights and threshold value
disp('Enter the weights');
w1=input('Weight w1=');
w2=input('Weight w2=');
disp('Enter threshold value');
theta=input('theta=');
y=[0 0 0 0];
x1=[1 1 0 0];
x2=[1 0 1 0];
z=[1 1 1 0];
con=1;
while con
    zin = x1*w1 + x2*w2;
    for i=1:4
        if zin(i)>=theta
            y(i)=1;
        else
            y(i)=0;
        end
    end
    disp('Output of net=');
    disp(y);
    if y==z
        con=0;
    else
        disp('Net is not learning. Enter another set of weights and threshold value');
        w1=input('Weight w1=');
        w2=input('Weight w2=');
        theta=input('theta=');
    end
end
disp('McCulloch Pitts Net for OR function');
disp('Weights of neuron');
disp(w1); disp(w2);
disp('Threshold value=');
disp(theta);
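
A hedged sample check is sketched below; the weights w1 = w2 = 1 with threshold theta = 1 are one standard choice for OR and are not prescribed by the experiment.

% Hedged sample check (assumption: w1=w2=1 and theta=1; these values are
% illustrative, not fixed by the experiment).
x1=[1 1 0 0]; x2=[1 0 1 0];
y = (x1*1 + x2*1) >= 1;   % neuron fires when at least one input is 1
disp(double(y));          % prints 1 1 1 0, the OR truth table
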
Exp-3 Implement perceptron network for emulating the behavior of AND gate

%Perceptron for AND function
clear; clc;
x=[1 1 -1 -1; 1 -1 1 -1];
t=[1 -1 -1 -1];
w=[0 0]; b=0;
alpha=input('Enter Learning rate=');
theta=input('Enter Threshold value=');
con=1; epoch=0;
while con
    con=0;
    for i=1:4
        yin=b + x(1,i)*w(1) + x(2,i)*w(2);
        if yin>theta
            y=1;
        end
        if yin<=theta && yin>=-theta
            y=0;
        end
        if yin<-theta
            y=-1;
        end
        if y~=t(i)
            con=1;
            for j=1:2
                w(j)=w(j) + alpha*t(i)*x(j,i);
            end
            b=b + alpha*t(i);
        end
    end
    epoch=epoch+1;
end
disp('Perceptron for AND function');
disp('Final Weight matrix');
disp(w);
disp('Final Bias');
disp(b);
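
A hedged sample run of the listing above is sketched below; the learning rate and threshold (alpha = 1, theta = 0.5) are illustrative choices, not values fixed by the manual, and with them the loop converges in two epochs to w = [1 1] and b = -1.

% Hedged check of the learned parameters (assumption: w=[1 1], b=-1 from a
% sample run with alpha=1, theta=0.5; these values are not prescribed here).
x=[1 1 -1 -1; 1 -1 1 -1];
yin = -1 + [1 1]*x;    % b + w1*x1 + w2*x2 for all four bipolar patterns
disp(sign(yin));       % prints 1 -1 -1 -1, matching the AND targets
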
Exp-4 Implement perceptron network for emulating the behavior of OR gate

%Perceptron for OR function
clear; clc;
x=[1 1 -1 -1; 1 -1 1 -1];
t=[1 1 1 -1];
w=[0 0]; b=0;
alpha=input('Enter Learning rate=');
theta=input('Enter Threshold value=');
con=1; epoch=0;
while con
    con=0;
    for i=1:4
        yin=b + x(1,i)*w(1) + x(2,i)*w(2);
        if yin>theta
            y=1;
        end
        if yin<=theta && yin>=-theta
            y=0;
        end
        if yin<-theta
            y=-1;
        end
        if y~=t(i)
            con=1;
            for j=1:2
                w(j)=w(j) + alpha*t(i)*x(j,i);
            end
            b=b + alpha*t(i);
        end
    end
    epoch=epoch+1;
end
disp('Perceptron for OR function');
disp('Final Weight matrix');
disp(w);
disp('Final Bias');
disp(b);
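
As with Exp-3, a hedged sample run is sketched below; with the illustrative choices alpha = 1 and theta = 0.5 the loop converges to w = [1 1] and b = 1.

% Hedged check (assumption: w=[1 1], b=1 from a sample run with alpha=1,
% theta=0.5; these values are illustrative only).
x=[1 1 -1 -1; 1 -1 1 -1];
disp(sign(1 + [1 1]*x));   % prints 1 1 1 -1, matching the OR targets
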
Exp-5 Implement XOR function using McCulloch-Pitts neuron
% XOR function using McCulloch-Pitts neuron
clear; clc;
% Getting weights and threshold value
disp('Enter the weights');
w11=input('Weight w11=');
w12=input('Weight w12=');
w21=input('Weight w21=');
w22=input('Weight w22=');
v1=input('Weight v1=');
v2=input('Weight v2=');
disp('Enter threshold value');
theta=input('theta=');
x1=[0 0 1 1];
x2=[0 1 0 1];
z=[0 1 1 0];
y1=[0 0 0 0]; y2=[0 0 0 0]; y=[0 0 0 0];
con=1;
while con
    zin1 = x1*w11 + x2*w21;
    zin2 = x1*w12 + x2*w22;
    for i=1:4
        if zin1(i)>=theta
            y1(i)=1;
        else
            y1(i)=0;
        end
        if zin2(i)>=theta
            y2(i)=1;
        else
            y2(i)=0;
        end
    end
    yin = y1*v1 + y2*v2;
    for i=1:4
        if yin(i)>=theta
            y(i)=1;
        else
            y(i)=0;
        end
    end
    disp('Output of net=');
    disp(y);
    if y==z
        con=0;
    else
        disp('Net is not learning. Enter another set of weights and threshold value');
        w11=input('Weight w11=');
        w12=input('Weight w12=');
        w21=input('Weight w21=');
        w22=input('Weight w22=');
        v1=input('Weight v1=');
        v2=input('Weight v2=');
        theta=input('theta=');
    end
end
disp('McCulloch Pitts Net for XOR function');
disp('Weights of neuron Z1');
disp(w11); disp(w21);
disp('Weights of neuron Z2');
disp(w12); disp(w22);
disp('Weights of neuron Y');
disp(v1); disp(v2);
disp('Threshold value=');
disp(theta);
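
For reference, a hedged check with one classical weight assignment is sketched below: Z1 computes x1 AND NOT x2, Z2 computes x2 AND NOT x1, and Y ORs them, all with threshold 1. These are a standard textbook choice, not values prescribed by the listing above.

% Hedged check (assumption: w11=1, w21=-1, w12=-1, w22=1, v1=v2=1, theta=1;
% one standard choice, not fixed by the experiment).
x1=[0 0 1 1]; x2=[0 1 0 1]; theta=1;
z1 = (x1*1 + x2*(-1)) >= theta;   % Z1 = x1 AND NOT x2
z2 = (x1*(-1) + x2*1) >= theta;   % Z2 = x2 AND NOT x1
y  = (z1*1 + z2*1) >= theta;      % Y  = Z1 OR Z2
disp(double(y));                  % prints 0 1 1 0, the XOR truth table
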
Exp-6 Implement K-means Clustering

% Generate a random dataset with two clusters
rng(1);                                   % Set random seed for reproducibility
cluster1 = randn(100, 2) + [5, 5];
cluster2 = randn(100, 2) + [7, 8];
data = [cluster1; cluster2];

% Specify the number of clusters (K)
K = 2;

% Perform K-means clustering
[idx, centroids] = kmeans(data, K);

% Plot the data points and cluster centroids
figure;
scatter(data(:, 1), data(:, 2), 10, idx, 'filled');
hold on;
scatter(centroids(:, 1), centroids(:, 2), 100, 'k', 'x');
title('K-means Clustering');
legend('Cluster 1', 'Cluster 2', 'Centroids');
hold off;
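
A hedged follow-up is sketched below: a silhouette plot (assuming the Statistics and Machine Learning Toolbox, which also provides kmeans) gives a quick view of how well separated the two clusters are; it is an addition, not part of the original experiment.

% Optional cluster-quality check (assumption: silhouette is available in the
% same toolbox as kmeans). Values near 1 indicate well-separated clusters.
figure;
silhouette(data, idx);
title('Silhouette values for K = 2');
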
Exp-7 Implement Radial Basis Function Neural Network.
clc;
clear all;
x = linspace(0, 4*pi, 3000);
y = (sin(x) + cos(x) + tanh(x)) + rand(1, 3000);   % noisy target signal
net = newrb(x, y, 0.1, 1, 100, 10);   % error goal, spread, max neurons, display step
view(net);
results = sim(net, x);
scatter(x, y);
hold on;
plot(x, results, 'LineWidth', 2, 'MarkerSize', 10);
R = regression(y, results);
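
A hedged follow-up is sketched below: besides the regression coefficient R, a plain mean squared error gives another quick measure of the fit. This metric is an addition, not part of the original listing.

% Hedged fit check (assumption: plain MSE is an acceptable extra metric here).
mse_fit = mean((y - results).^2);
fprintf('Mean squared error of the RBF fit: %.4f\n', mse_fit);
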
Exp-8 Implement a two-input, one-output FIS
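
The manual gives no listing for this experiment, so a minimal sketch is included below, assuming the Fuzzy Logic Toolbox and the classic two-input (service, food), one-output (tip) Mamdani example; the variable ranges, membership functions, and rules are illustrative assumptions, not values taken from the lab. If available, gensurf(fis) then plots the resulting two-input, one-output surface.

% Minimal sketch of a two-input, one-output Mamdani FIS (assumption:
% Fuzzy Logic Toolbox; all parameters below are illustrative choices).
fis = newfis('tipper');
fis = addvar(fis, 'input', 'service', [0 10]);
fis = addmf(fis, 'input', 1, 'poor', 'gaussmf', [1.5 0]);
fis = addmf(fis, 'input', 1, 'good', 'gaussmf', [1.5 5]);
fis = addmf(fis, 'input', 1, 'excellent', 'gaussmf', [1.5 10]);
fis = addvar(fis, 'input', 'food', [0 10]);
fis = addmf(fis, 'input', 2, 'rancid', 'trapmf', [0 0 1 3]);
fis = addmf(fis, 'input', 2, 'delicious', 'trapmf', [7 9 10 10]);
fis = addvar(fis, 'output', 'tip', [0 30]);
fis = addmf(fis, 'output', 1, 'cheap', 'trimf', [0 5 10]);
fis = addmf(fis, 'output', 1, 'average', 'trimf', [10 15 20]);
fis = addmf(fis, 'output', 1, 'generous', 'trimf', [20 25 30]);
% Rule format: [input1 input2 output weight operator] (1 = AND, 2 = OR)
ruleList = [1 1 1 1 2;
            2 0 2 1 1;
            3 2 3 1 2];
fis = addrule(fis, ruleList);
out = evalfis([3 8], fis);   % output tip for service = 3, food = 8
disp(out);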

Output:
