MECA406 Homework2

This document outlines two problems for a homework assignment on soft computing techniques. The first problem involves implementing a binary Hopfield network to store patterns and visualize the convergence of distorted patterns to the stored patterns. The second problem involves using a perceptron model to classify 3-D points into two classes, plotting the error over epochs and the decision plane. Students are instructed to complete the problems, take screenshots, and submit both the PDF and the m-file.


MECA 406 SOFT COMPUTING

Homework Assignment 2

Due Date: 23/05/2022, Monday, 11:59 pm

Problem 1) Binary Hopfield Model (50 points)

 Step 1: Given a set of literals (for example, you can select A, B, C and D), build a
Binary Hopfield Model such that these sample patterns are stored as equilibrium points
of the network.
(Hint: Build an 8x8 grid to represent each literal. Then convert each 8x8 matrix to a
64-dimensional vector.)
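
The storage step above can be sketched with the standard Hebbian outer-product rule. The assignment asks for an m-file, so this Python/NumPy fragment is only an illustration of the idea; the 4-element patterns stand in for the full 64-dimensional letter vectors.

```python
import numpy as np

def store_patterns(patterns):
    """Build a Hopfield weight matrix from bipolar (+1/-1) patterns
    using the Hebbian outer-product rule, with zero self-connections."""
    n = patterns[0].size
    W = np.zeros((n, n))
    for p in patterns:
        v = p.reshape(-1, 1)      # column vector of length n
        W += v @ v.T              # outer product accumulates correlations
    np.fill_diagonal(W, 0)        # no neuron feeds back onto itself
    return W

# Two toy 4-element patterns instead of full 64-element letter grids
p1 = np.array([+1, -1, +1, -1])
p2 = np.array([+1, +1, -1, -1])
W = store_patterns([p1, p2])

# A stored pattern is an equilibrium point: sign(W @ p) reproduces p
assert np.array_equal(np.sign(W @ p1), p1)
```

In MATLAB the same update is `W = W + v*v';` followed by zeroing the diagonal with `W(logical(eye(n))) = 0`.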

 Step 2: Show each sample pattern graphically on this 8x8 grid.

 Step 3: Start the system by initializing the outputs of the neurons with an arbitrary
pattern. Choose the arbitrary patterns such that the original patterns are distorted by
zero-mean white noise of varying variance. Show each starting pattern graphically,
followed by the iterative steps.

 Step 4: Show that if there is too much noise, the network does not converge to the
desired state.
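
Steps 3 and 4 can be sketched as follows, again in Python for illustration only (the submission must be an m-file). The distortion adds zero-mean Gaussian noise and re-binarizes; the recall loop runs asynchronous updates until a full sweep changes nothing. The 16-element "T" pattern from the example below is used as the stored pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store one small bipolar pattern (stand-in for a 64-cell letter grid):
# the 4x4 "T" from the example in this handout
p = np.array([+1, +1, +1, -1, -1, +1, -1, -1,
              -1, +1, -1, -1, -1, +1, -1, -1])
W = np.outer(p, p).astype(float)
np.fill_diagonal(W, 0)

def distort(pattern, sigma):
    """Add zero-mean white noise to the +/-1 cells, then re-binarize.
    Larger sigma flips more cells (step 3 of the assignment)."""
    return np.where(pattern + rng.normal(0, sigma, pattern.size) >= 0, 1, -1)

def recall(W, state, max_sweeps=20):
    """Asynchronous updates in random order until a full sweep is stable."""
    s = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(s.size):
            new = 1 if W[i] @ s >= 0 else -1
            changed |= new != s[i]
            s[i] = new
        if not changed:
            break
    return s

# Mild noise: the network converges back to the stored pattern.
mild = distort(p, sigma=0.5)
assert np.array_equal(recall(W, mild), p)
# With a much larger sigma, the overlap with p can become negative and
# the network settles into an inverted or spurious state instead (step 4).
```

Plotting each intermediate state of the sweep as an 8x8 image (MATLAB `imagesc(reshape(s, 8, 8))`) gives the required graphical iteration history.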

Ex: a 4x4 grid representing the literal "T" (+1 = filled cell, -1 = empty cell), shown
together with a distorted version of the image:

pattern_T = [+1, +1, +1, -1, -1, +1, -1, -1, -1, +1, -1, -1, -1, +1, -1, -1]^T

Take screenshots for each step and insert them into an MS Word file. Save the
MS Word file as a PDF with the following naming convention: firstname_lastname_HW2_p1.pdf,
and upload it to BilgiLearn. Upload the m-file as well.
Problem 2) Perceptron Model (50 points)

 Step 1: Assume a set of example patterns in 3-D space distributed in a cube as shown
in the figure below:

[Figure: 3-D scatter plot over axes x1, x2, x3; "x" marks class A and "o" marks
class B, with the two classes placed in different octants of the cube.]

Plot the sample patterns in a 3-D graph. Choose approximately 25 points for each
class.
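
One way to generate such a data set is to draw each class uniformly from an opposite corner of the unit cube. The corner ranges below are illustrative choices, not given by the assignment, which only fixes roughly 25 points per class; the sketch is in Python, while the submission itself should be an m-file.

```python
import numpy as np

rng = np.random.default_rng(1)

# ~25 points per class in opposite octants of the unit cube
# (the exact ranges 0.55-1.0 and 0.0-0.45 are illustrative assumptions)
class_A = rng.uniform(0.55, 1.0, size=(25, 3))   # "x" points, (+,+,+) corner
class_B = rng.uniform(0.0, 0.45, size=(25, 3))   # "o" points, origin corner

X = np.vstack([class_A, class_B])                # 50 x 3 design matrix
y = np.hstack([np.ones(25), -np.ones(25)])       # +1 for class A, -1 for B

assert X.shape == (50, 3)
```

In MATLAB the 3-D scatter is `scatter3(class_A(:,1), class_A(:,2), class_A(:,3), 'x')` with `hold on` for class B.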

 Step 2: Implement the perceptron algorithm. Compute the total error over the example
set (i.e., over all 50 samples) and plot this error vs. the number of epochs (iterations).
Observe the decaying error.
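
The training loop in step 2 can be sketched with the classic perceptron update rule, w <- w + eta * y * x on each misclassified sample, recording the misclassification count per epoch. This Python fragment is illustrative (the submission must be an m-file); the data-generation ranges are assumptions carried over from the sketch of step 1.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linearly separable toy data: two opposite corners of the unit cube
# (ranges are illustrative assumptions, as in step 1)
A = rng.uniform(0.55, 1.0, size=(25, 3))
B = rng.uniform(0.0, 0.45, size=(25, 3))
X = np.vstack([A, B])
y = np.hstack([np.ones(25), -np.ones(25)])

def train_perceptron(X, y, eta=0.1, max_epochs=1000):
    """Classic perceptron rule: update w on each misclassified sample
    and record the number of misclassifications per epoch."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias input of 1
    w = np.zeros(Xb.shape[1])
    errors = []
    for _ in range(max_epochs):
        n_err = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:              # wrong side of the plane
                w += eta * yi * xi
                n_err += 1
        errors.append(n_err)
        if n_err == 0:                          # converged: data is separable
            break
    return w, errors

w, errors = train_perceptron(X, y)
assert errors[-1] == 0   # separable data -> total error decays to zero
```

Plotting `errors` against the epoch index (MATLAB `plot(errors)`) gives the required decaying error curve.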

 Step 3: Having found the final weights with the perceptron algorithm, plot the decision
plane in 3-D space, showing that this plane can be used to classify the two classes.
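
Given trained weights w = [w1, w2, w3, b], the decision plane is w1*x1 + w2*x2 + w3*x3 + b = 0; solving for x3 over an (x1, x2) grid gives the surface to draw. The weight values below are illustrative placeholders, not results from the assignment, and the sketch is in Python rather than the required MATLAB.

```python
import numpy as np

# Illustrative weights [w1, w2, w3, b]; in practice use the trained values
w = np.array([1.0, 1.0, 1.0, -1.5])

# Decision plane w1*x1 + w2*x2 + w3*x3 + b = 0, solved for x3
g1, g2 = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
x3 = -(w[0] * g1 + w[1] * g2 + w[3]) / w[2]   # requires w[2] != 0

# Classification is the sign of the activation relative to the plane
point = np.array([0.9, 0.8, 0.9])             # deep in class A's corner
activation = w[:3] @ point + w[3]
assert activation > 0                         # plane puts it on the A side
```

In MATLAB the equivalent is `[g1, g2] = meshgrid(linspace(0,1,20)); surf(g1, g2, x3)` over the same scatter plot; in Python, `Axes3D.plot_surface` from matplotlib.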

Take screenshots for each step and insert them into an MS Word file. Save the
MS Word file as a PDF with the following naming convention:
firstname_lastname_HW2_p2.pdf, and upload it to BilgiLearn. Upload the m-file as
well.
