PRESENTED BY:
Ankita Pandey
ME ECE - 112604
CONTENT

• Introduction
• Properties of Hopfield network
• Hopfield network derivation
• Hopfield network example
• Applications
• References
INTRODUCTION

The Hopfield neural network, proposed by John Hopfield in 1982, can be seen

• as a network with associative memory
• as a network that can be used for different pattern-recognition problems.

It is a fully connected, single-layer auto-associative network,

• meaning it has only one layer, with each neuron connected to every other neuron.

All the neurons act as both inputs and outputs.
INTRODUCTION

The Hopfield network (model) consists of a set of neurons and a corresponding set of unit delays, forming a multiple-loop feedback system, as shown in the figure.
INTRODUCTION

The number of feedback loops is equal to the number of neurons.

Basically, the output of each neuron is fed back, via a unit-delay element, to each of the other neurons in the network.

• There is no self-feedback in the network.
PROPERTIES OF HOPFIELD NETWORK

1. A recurrent network with all nodes connected to all other nodes.
2. Nodes have binary outputs (either 0/1 or -1/+1).
3. Weights between the nodes are symmetric.
4. No connection from a node to itself is allowed.
5. Nodes are updated asynchronously (i.e., nodes are selected at random).
6. The network has no hidden nodes or layers.
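These six properties map directly onto a small data structure and update rule. The following is a minimal Python sketch, assuming bipolar (-1/+1) states and Hebbian outer-product storage; the class and method names (`HopfieldNetwork`, `train`, `recall`) are illustrative, not taken from the slides.

```python
import numpy as np

class HopfieldNetwork:
    """Minimal discrete Hopfield network with bipolar (-1/+1) states."""

    def __init__(self, n):
        self.n = n
        self.W = np.zeros((n, n))          # weight matrix, to stay symmetric with zero diagonal

    def train(self, patterns):
        """Hebbian (outer-product) storage of bipolar patterns."""
        for p in patterns:
            p = np.asarray(p, dtype=float)
            self.W += np.outer(p, p)       # keeps W symmetric (property 3)
        np.fill_diagonal(self.W, 0.0)      # no self-connections (property 4)

    def recall(self, state, steps=100, rng=None):
        """Asynchronous updates: one randomly chosen node at a time (property 5)."""
        rng = rng or np.random.default_rng(0)
        x = np.asarray(state, dtype=float).copy()
        for _ in range(steps):
            i = rng.integers(self.n)       # pick a node at random
            h = self.W[i] @ x              # net input to node i from all other nodes
            x[i] = 1.0 if h >= 0 else -1.0 # bipolar output (property 2)
        return x
```

All nodes feed all other nodes (property 1) and there are no hidden units (property 6): the state vector `x` is both the input and the output of the network.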
HOPFIELD NETWORK

Consider the noiseless dynamical model of the neuron shown in fig. 1.

The synaptic weights $w_{j1}, w_{j2}, \ldots, w_{jN}$ represent conductances.

The respective inputs $x_1(t), x_2(t), \ldots, x_N(t)$ represent potentials; $N$ is the number of inputs.

These inputs are applied to a current-summing junction characterized as follows:
• Low input resistance.
• Unity current gain.
• High output resistance.
ADDITIVE MODEL OF A NEURON

[Figure: additive model of a neuron.]
HOPFIELD NETWORK

The total current flowing toward the input node of the nonlinear element (activation function) is:

$$\sum_{i=1}^{N} w_{ji} x_i(t) + I_j$$

The total current flowing away from the input node of the nonlinear element is:

$$\frac{v_j(t)}{R_j} + C_j \frac{dv_j(t)}{dt}$$

• where the first term is due to the leakage resistance,
• and the second term is due to the leakage capacitance.
HOPFIELD NETWORK

• Applying KCL to the input node of the nonlinearity, we get

$$C_j \frac{dv_j(t)}{dt} + \frac{v_j(t)}{R_j} = \sum_{i=1}^{N} w_{ji} x_i(t) + I_j \qquad \ldots(1)$$

• The capacitive term $C_j \, dv_j(t)/dt$ adds dynamics to the model of the neuron.
• The output of neuron $j$ is determined from the nonlinear relation

$$x_j(t) = \varphi_j\big(v_j(t)\big)$$

• The RC model described by eq. (1) is referred to as the additive model.
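To make eq. (1) concrete, the sketch below integrates the additive model with a simple forward-Euler step. The function name, the parameter values, and the choice of a tanh activation are assumptions made for illustration, not something specified on the slides.

```python
import numpy as np

def simulate_additive_model(W, I, R, C, a=1.0, dt=1e-3, steps=5000):
    """Forward-Euler integration of eq. (1):
    C_j dv_j/dt = -v_j/R_j + sum_i w_ji x_i(t) + I_j, with x_j = tanh(a*v_j/2)."""
    v = np.zeros(W.shape[0])               # membrane potentials, v_j(0) = 0
    for _ in range(steps):
        x = np.tanh(a * v / 2.0)           # neuron outputs x_j = phi_j(v_j)
        dv = (-v / R + W @ x + I) / C      # right-hand side of eq. (1), divided by C_j
        v = v + dt * dv
    return v, np.tanh(a * v / 2.0)

# Example with a small symmetric weight matrix (hypothetical values):
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
v_final, x_final = simulate_additive_model(W, I=np.zeros(2), R=1.0, C=1.0)
```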
HOPFIELD NETWORK

A feature of the additive model is that the signal $x_i(t)$, applied to neuron $j$ by the adjoining neuron $i$,

• is a slowly varying function of time $t$.

Thus, consider a recurrent network consisting of an interconnection of $N$ neurons,

• each of which is assumed to have the same mathematical model, described by the equation:

$$C_j \frac{dv_j(t)}{dt} + \frac{v_j(t)}{R_j} = \sum_{i=1}^{N} w_{ji} x_i(t) + I_j, \qquad j = 1, 2, \ldots, N \qquad \ldots(2)$$
HOPFIELD NETWORK

Now we use eq. (2), which is based on the additive model of the neuron.

Assumptions:

• The matrix of synaptic weights is symmetric, as shown by $w_{ji} = w_{ij}$ for all $i$ and $j$.
• Each neuron has a nonlinear activation of its own, hence the use of $\varphi_i(\cdot)$ in eq. (2).
• The inverse of the nonlinear activation function exists, so we can write

$$v = \varphi_i^{-1}(x) \qquad \ldots(3)$$
HOPFIELD NETWORK

• Let the sigmoid function $\varphi_i(v)$ be defined by the hyperbolic tangent function

$$x = \varphi_i(v) = \tanh\!\left(\frac{a_i v}{2}\right) = \frac{1 - \exp(-a_i v)}{1 + \exp(-a_i v)}$$

• which has a slope of $a_i/2$ at the origin.
  • $a_i$ is referred to as the gain of neuron $i$.

The inverse I/O relation of eq. (3) may then be written as

$$v = \varphi_i^{-1}(x) = -\frac{1}{a_i} \log\!\left(\frac{1 - x}{1 + x}\right) \qquad \ldots(4)$$
HOPFIELD NETWORK

• The standard form of the inverse I/O relation for a neuron of unity gain is:

$$\varphi^{-1}(x) = -\log\!\left(\frac{1 - x}{1 + x}\right)$$

• We can rewrite eq. (4) in terms of this standard relation as

$$\varphi_i^{-1}(x) = \frac{1}{a_i}\, \varphi^{-1}(x)$$
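As a quick numerical sanity check (not part of the original derivation), the snippet below verifies that the inverse in eq. (4) really undoes the hyperbolic-tangent activation defined above; the function names are illustrative.

```python
import numpy as np

def phi(v, a):
    """Sigmoid activation x = tanh(a*v/2)."""
    return np.tanh(a * v / 2.0)

def phi_inv(x, a):
    """Inverse I/O relation of eq. (4): v = -(1/a) * log((1 - x) / (1 + x))."""
    return -(1.0 / a) * np.log((1.0 - x) / (1.0 + x))

a = 2.0                                        # gain of the neuron (arbitrary value)
v = np.linspace(-3.0, 3.0, 7)
assert np.allclose(phi_inv(phi(v, a), a), v)   # phi_inv(phi(v)) recovers v
```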
Plot of (a) Sigmoidal Nonlinearity and (b) its Inverse

[Figure: (a) the sigmoidal nonlinearity $x = \varphi(v)$; (b) its inverse $v = \varphi^{-1}(x)$.]
HOPFIELD NETWORK

• The energy function of the Hopfield network is defined by:

$$E = -\frac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N} w_{ji} x_i x_j + \sum_{j=1}^{N} \frac{1}{R_j} \int_{0}^{x_j} \varphi_j^{-1}(x)\, dx - \sum_{j=1}^{N} I_j x_j$$

• Differentiating $E$ with respect to time, we get

$$\frac{dE}{dt} = -\sum_{j=1}^{N} \left( \sum_{i=1}^{N} w_{ji} x_i - \frac{v_j}{R_j} + I_j \right) \frac{dx_j}{dt}$$

• Substituting the quantity in parentheses from eq. (2), we get

$$\frac{dE}{dt} = -\sum_{j=1}^{N} C_j \frac{dv_j}{dt} \frac{dx_j}{dt} \qquad \ldots(5)$$
HOPFIELD NETWORK

• The inverse relation that defines $v_j$ in terms of $x_j$ is

$$v_j = \varphi_j^{-1}(x_j)$$

• Using this relation in eq. (5), we have

$$\frac{dE}{dt} = -\sum_{j=1}^{N} C_j \frac{d}{dt}\Big[\varphi_j^{-1}(x_j)\Big] \frac{dx_j}{dt} = -\sum_{j=1}^{N} C_j \left(\frac{dx_j}{dt}\right)^{2} \frac{d}{dx_j}\Big[\varphi_j^{-1}(x_j)\Big] \qquad \ldots(6)$$

• From fig. (b) we see that the inverse I/O relation $\varphi_j^{-1}(x_j)$ is a monotonically increasing function of the output $x_j$. Therefore,

$$\frac{d}{dx_j}\Big[\varphi_j^{-1}(x_j)\Big] \geq 0 \qquad \text{for all } x_j.$$
HOPFIELD NETWORK

Also,

$$\left(\frac{dx_j}{dt}\right)^{2} \geq 0 \qquad \text{for all } x_j.$$

Hence all the factors that make up the sum on the R.H.S. of eq. (6) are non-negative, and the energy function $E$ satisfies

$$\frac{dE}{dt} \leq 0$$
HOPFIELD NETWORK

We may make the following two statements:

• The energy function $E$ is a Lyapunov function of the continuous Hopfield model, so the model is stable in accordance with Lyapunov's Theorem 1.

• The time evolution of the continuous Hopfield model, described by the system of nonlinear first-order differential equations, represents a trajectory in state space that seeks out the minima of the energy function $E$ and comes to a stop at such fixed points.
HOPFIELD NETWORK

• From eq. (6), the derivative $dE/dt$ vanishes only if

$$\frac{dx_j(t)}{dt} = 0 \qquad \text{for all } j.$$

• Thus we can say

$$\frac{dE}{dt} < 0 \qquad \text{except at a fixed point} \qquad \ldots(7)$$

• Eq. (7) forms the basis for the following theorem:
  • The energy function $E$ of a Hopfield network is a monotonically decreasing function of time.
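The discrete counterpart of this theorem is easy to check numerically: with symmetric weights, a zero diagonal, and asynchronous updates, the quadratic energy $E = -\tfrac{1}{2}\sum_i \sum_j w_{ji} x_i x_j$ (the leakage and bias terms are dropped here) never increases. The snippet below is an illustrative sketch, not part of the presentation.

```python
import numpy as np

def energy(W, x):
    """Discrete Hopfield energy E = -1/2 * x^T W x (leakage and bias terms omitted)."""
    return -0.5 * x @ W @ x

rng = np.random.default_rng(1)
p = np.array([-1.0, 1.0, -1.0, 1.0])       # a stored bipolar pattern
W = np.outer(p, p)
np.fill_diagonal(W, 0.0)                   # symmetric weights, zero diagonal

x = rng.choice([-1.0, 1.0], size=4)        # random initial state
energies = [energy(W, x)]
for _ in range(20):                        # asynchronous (one node at a time) updates
    i = rng.integers(4)
    x[i] = 1.0 if W[i] @ x >= 0 else -1.0
    energies.append(energy(W, x))

# Energy is monotonically non-increasing along the trajectory.
assert all(e_next <= e_prev + 1e-12 for e_prev, e_next in zip(energies, energies[1:]))
```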
HOPFIELD NETWORK EXAMPLE

Connections of a four-neuron Hopfield network (the accompanying figure shows the network diagram):

                 Neuron 1 (N1)   Neuron 2 (N2)   Neuron 3 (N3)   Neuron 4 (N4)
Neuron 1 (N1)    (N/A)           N2->N1          N3->N1          N4->N1
Neuron 2 (N2)    N1->N2          (N/A)           N3->N2          N4->N2
Neuron 3 (N3)    N1->N3          N2->N3          (N/A)           N4->N3
Neuron 4 (N4)    N1->N4          N2->N4          N3->N4          (N/A)
HOPFIELD NETWORK EXAMPLE

• The connection weights, put into this array (also called a weight matrix), allow the neural network to recall certain patterns when they are presented.
• For example, the values shown in the table below are the correct values to use to recall the pattern 0101.

Weight matrix used to recall 0101:

                 Neuron 1 (N1)   Neuron 2 (N2)   Neuron 3 (N3)   Neuron 4 (N4)
Neuron 1 (N1)         0              -1               1              -1
Neuron 2 (N2)        -1               0              -1               1
Neuron 3 (N3)         1              -1               0              -1
Neuron 4 (N4)        -1               1              -1               0
Calculating The Weight Matrix

Step 1: Convert 0101 to bipolar

• Bipolar is nothing more than a way of representing binary values as -1's and 1's rather than 0's and 1's.
• To convert 0101 to bipolar, we convert all of the zeros to -1's. This results in:
  0 = -1
  1 =  1
  0 = -1
  1 =  1
• The final result is the array (-1, 1, -1, 1).
Calculating The Weight Matrix

Step 2: Multiply (-1, 1, -1, 1) by its transpose

For this step we treat (-1, 1, -1, 1) as a 1x4 row matrix.

Taking the transpose of this matrix gives a 4x1 column matrix.

Now, multiply these two matrices (the outer product), element by element:

   -1 x (-1) =  1     1 x (-1) = -1    -1 x (-1) =  1     1 x (-1) = -1
   -1 x   1  = -1     1 x   1  =  1    -1 x   1  = -1     1 x   1  =  1
   -1 x (-1) =  1     1 x (-1) = -1    -1 x (-1) =  1     1 x (-1) = -1
   -1 x   1  = -1     1 x   1  =  1    -1 x   1  = -1     1 x   1  =  1
Calculating The Weight Matrix
 • And the matrix is:




  Step 3: Set the Northwest diagonal to zero




       • The reason behind this is, in Hopfield networks do not have their neurons
       connected to themselves.
       • So positions [1][1], [2][2], [3][3] and [4][4] in our two dimensional array
       or matrix, get set to zero. This results in the weight matrix for the bit pattern
       0101.
10/31/2012                    PRESENTATION ON HOPFIELD NETWORK                     25
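Steps 1-3 collapse into a few lines of NumPy. This is an illustrative sketch; the helper name `weight_matrix` is an assumption, not something from the slides.

```python
import numpy as np

def weight_matrix(bits):
    """Hopfield weight matrix for a single binary pattern.

    Step 1: convert 0/1 bits to bipolar -1/+1.
    Step 2: take the outer product of the bipolar vector with itself.
    Step 3: zero the main diagonal (no self-connections).
    """
    p = np.where(np.asarray(bits) == 0, -1, 1)
    W = np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W

print(weight_matrix([0, 1, 0, 1]))
# [[ 0 -1  1 -1]
#  [-1  0 -1  1]
#  [ 1 -1  0 -1]
#  [-1  1 -1  0]]
```

The printed matrix matches the weight matrix for 0101 given earlier.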
Recalling a Pattern

• To do this we present the pattern to each input neuron. Each neuron then activates based upon the input pattern.
• For example, when neuron 1 is presented with 0101, its activation is the sum of all of its weights whose corresponding input bit is 1.
• The activation of each neuron (the contribution from each input N1-N4 and their total) is:

          from N1   from N2   from N3   from N4   total
   N1        0        -1         0        -1       -2
   N2        0         0         0         1        1
   N3        0        -1         0        -1       -2
   N4        0         1         0         0        1

The resulting activation vector is (-2, 1, -2, 1).
Recalling a Pattern

The threshold value determines what range of activations will cause a neuron to fire.

The threshold usually used for a Hopfield network is any value greater than zero.

So the neurons would fire as follows:

• N1 activation is -2, would not fire (0)
• N2 activation is 1, would fire (1)
• N3 activation is -2, would not fire (0)
• N4 activation is 1, would fire (1)
Recalling a Pattern

We assign a binary 1 to all neurons that fired, and a binary 0 to all neurons that did not fire.

The final binary output from the Hopfield network is therefore 0101, which is the same as the input pattern.

An auto-associative neural network, such as a Hopfield network, will echo a pattern back if the pattern is recognized.
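The recall step of the example (sum the weights selected by the 1-bits, then threshold at zero) can likewise be sketched in a few lines; the function name `recall` and the single synchronous pass are illustrative assumptions.

```python
import numpy as np

def recall(W, bits, threshold=0.0):
    """One recall pass over all neurons: activation = W @ input bits, then threshold."""
    activation = W @ np.asarray(bits)        # e.g. (-2, 1, -2, 1) for the input 0101
    return (activation > threshold).astype(int)

# Weight matrix for the pattern 0101 (from the table above).
W = np.array([[ 0, -1,  1, -1],
              [-1,  0, -1,  1],
              [ 1, -1,  0, -1],
              [-1,  1, -1,  0]])

print(recall(W, [0, 1, 0, 1]))               # -> [0 1 0 1], the pattern is echoed back
```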
APPLICATIONS

• Image detection and recognition
• Enhancing X-ray images
• Medical image restoration
References

• Jacek M. Zurada, Introduction to Artificial Neural Systems (10th edition)
• Simon Haykin, Neural Networks (2nd edition)
• Satish Kumar, Neural Networks: A Classroom Approach (2nd edition)
• http://www.learnartificialneuralnetworks.com/hopfield.html
• http://www.heatonresearch.com/articles/2/page6.html
• http://www.thebigblob.com/hopfield-network/#associative-memory
• http://www.dsi.unive.it/~pelillo/Didattica/RetiNeurali/Introduction_To_ANN_lesson_6.pdf
• http://en.wikipedia.org/wiki/Hopfield_network
