MECA406 Homework2
Homework Assignment 2
Problem 1) Binary Hopfield Model
Step 1: Given a set of literals (for example, you can select A, B, C, and D), build a
Binary Hopfield Model such that these sample patterns are stored as equilibrium points
of the network.
(Hint: Build an 8x8 grid to represent each literal, then convert each 8x8 matrix to a
64-dimensional vector.)
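Storing the patterns can be sketched as follows (shown in Python/NumPy for illustration, though the assignment expects a MATLAB m-file). The standard Hebbian outer-product rule builds a symmetric weight matrix with zero diagonal:

```python
import numpy as np

def store_patterns(patterns):
    """Hebbian outer-product rule: builds a symmetric Hopfield weight
    matrix with zero diagonal from bipolar (+1/-1) patterns."""
    n = patterns[0].size
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)            # accumulate outer products
    np.fill_diagonal(W, 0)             # no self-connections
    return W / len(patterns)

# Example with the 16-dimensional pattern_T given in the assignment;
# for the 8x8 literals, flatten each grid to a 64-dimensional vector.
pattern_T = np.array([+1, +1, +1, -1, -1, +1, -1, -1,
                      -1, +1, -1, -1, -1, +1, -1, -1])
W = store_patterns([pattern_T])

# A stored pattern is an equilibrium point: sign(W @ p) reproduces p.
print(np.array_equal(np.sign(W @ pattern_T), pattern_T))
```

With this rule each stored pattern satisfies sign(W p) = p, which is exactly the equilibrium condition asked for in Step 1.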
Step 3: Start the system by initializing the output of each neuron with an arbitrary
pattern. Choose the arbitrary patterns so that the original patterns are distorted by
zero-mean white noise of varying variance. Show each starting pattern graphically,
followed by the iterative steps.
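One way to sketch Step 3 (Python/NumPy for illustration; the noise level `sigma` and the synchronous update scheme are assumptions):

```python
import numpy as np

def recall(W, x0, max_iters=50):
    """Iterate the Hopfield update x <- sign(W x) until a fixed point
    (synchronous updates; display x at each step as needed)."""
    x = x0.copy()
    for _ in range(max_iters):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1          # break ties toward +1
        if np.array_equal(x_new, x):   # equilibrium reached
            break
        x = x_new
    return x

# Distort a stored pattern with zero-mean white Gaussian noise of
# standard deviation sigma, then threshold back to a bipolar vector.
rng = np.random.default_rng(0)
sigma = 0.5
pattern = np.array([+1, +1, +1, -1, -1, +1, -1, -1,
                    -1, +1, -1, -1, -1, +1, -1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0)
x0 = np.sign(pattern + rng.normal(0.0, sigma, size=pattern.size))
print(np.array_equal(recall(W, x0), pattern))
```

At moderate noise the distorted start flips only a few components and the iteration settles back onto the stored pattern within a step or two.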
Step 4: Show that if there is too much noise, the network does not converge to the
desired state.
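Step 4 can be demonstrated by sweeping the noise variance and measuring how often recall returns the correct pattern. The sketch below uses two hypothetical orthogonal 16-dimensional patterns; at large sigma the network settles into wrong or inverted attractors instead of the desired state:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical orthogonal bipolar patterns stored with the Hebbian rule
p1 = np.array([+1] * 8 + [-1] * 8)
p2 = np.array([+1, -1] * 8)
W = (np.outer(p1, p1) + np.outer(p2, p2)).astype(float)
np.fill_diagonal(W, 0)

def recall(W, x, max_iters=50):
    """Synchronous Hopfield iteration until a fixed point."""
    for _ in range(max_iters):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

def success_rate(sigma, trials=200):
    """Fraction of noisy starts that converge back to p1."""
    hits = 0
    for _ in range(trials):
        x0 = np.sign(p1 + rng.normal(0.0, sigma, size=p1.size))
        x0[x0 == 0] = 1
        hits += np.array_equal(recall(W, x0), p1)
    return hits / trials

for sigma in (0.2, 1.0, 3.0):
    print(f"sigma={sigma}: recall success rate = {success_rate(sigma):.2f}")
```

The success rate drops as sigma grows, which is exactly the failure mode Step 4 asks you to exhibit graphically.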
Example pattern (a 4x4 grid flattened to a 16-dimensional bipolar vector):
pattern_T = [+1, +1, +1, -1, -1, +1, -1, -1, -1, +1, -1, -1, -1, +1, -1, -1]^T
[Figure: the corresponding distorted image]
Take screenshots of each step and insert them into an MS Word file. Save the MS Word
file as a PDF with the following naming convention: firstname_lastname_HW2_p1.pdf,
and upload it to BilgiLearn. Upload the m-file as well.
Problem 2) Perceptron Model (50 points)
Step 1: Assume a set of example patterns in 3-D space, distributed in a cube as shown
in the figure below:
[Figure: 3-D scatter of the sample points over axes x1, x2, and x3. Points marked "x"
belong to class A and points marked "o" to class B; the two classes lie in different
quadrants of the cube.]
Plot the sample patterns in a 3-D graph. Choose approximately 25 points for each
class.
Step 2: Implement the Perceptron Algorithm. Compute the total error over the example
set (i.e., over all 50 samples) and plot this error vs. the number of epochs (iterations).
Observe the decaying error.
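A minimal sketch of Steps 1-2 (Python/NumPy for illustration, though the assignment expects an m-file; the sampling ranges, learning rate, and epoch limit are assumptions chosen so the two classes are linearly separable):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: ~25 points per class in opposite octants of the
# cube, so a single plane can separate them.
A = rng.uniform(0.2, 1.0, size=(25, 3))    # class A, label +1
B = rng.uniform(-1.0, -0.2, size=(25, 3))  # class B, label -1
X = np.vstack([A, B])
y = np.hstack([np.ones(25), -np.ones(25)])

# Perceptron: augment inputs with a bias term and update on mistakes.
Xa = np.hstack([X, np.ones((50, 1))])      # [x1, x2, x3, 1]
w = np.zeros(4)
eta = 0.1                                  # learning rate (assumed)
errors_per_epoch = []
for epoch in range(100):
    errors = 0
    for xi, yi in zip(Xa, y):
        if yi * (w @ xi) <= 0:             # misclassified (or on the plane)
            w += eta * yi * xi             # perceptron update rule
            errors += 1
    errors_per_epoch.append(errors)
    if errors == 0:                        # converged: all samples correct
        break

print("final weights:", w)
print("errors per epoch:", errors_per_epoch)
```

Plotting `errors_per_epoch` against the epoch index gives the decaying error curve asked for in Step 2.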
Step 3: Using the final weights found by the Perceptron algorithm, plot the decision
plane in 3-D space, showing that this plane can be used to classify the two classes.
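For Step 3, the trained weights define the plane w1*x1 + w2*x2 + w3*x3 + b = 0, which can be drawn as a surface by solving for x3 over an (x1, x2) grid. A sketch with placeholder weights (with matplotlib, `plot_surface(x1, x2, x3)` on a 3-D axis draws this plane next to the scattered samples):

```python
import numpy as np

# Hypothetical weights from a trained perceptron on [x1, x2, x3, 1];
# these are placeholder values for illustration only.
w = np.array([1.0, 1.0, 1.0, 0.1])

# Solve w1*x1 + w2*x2 + w3*x3 + b = 0 for x3 over a grid, so the
# decision plane can be rendered as a surface.
x1, x2 = np.meshgrid(np.linspace(-1, 1, 20), np.linspace(-1, 1, 20))
x3 = -(w[0] * x1 + w[1] * x2 + w[3]) / w[2]

# Every grid point lies exactly on the decision boundary:
vals = w[0] * x1 + w[1] * x2 + w[2] * x3 + w[3]
print(np.allclose(vals, 0.0))
```

Samples with w1*x1 + w2*x2 + w3*x3 + b > 0 fall on the class A side of this surface and the rest on the class B side, which is what the 3-D plot should make visible.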
Take screenshots of each step and insert them into an MS Word file. Save the MS Word
file as a PDF with the following naming convention: firstname_lastname_HW2_p2.pdf,
and upload it to BilgiLearn. Upload the m-file as well.