Hopfield

E = -\frac{1}{2} \sum_{i,j} w_{ij} s_i s_j
Here, s_i and s_j are the states of neurons i and j, and w_{ij} is the weight
between them. The network evolves by updating the states of the neurons,
gradually decreasing the energy. The states corresponding to local minima
of the energy function are considered stable or "attractor" states, which
represent stored patterns.
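The energy function and the descent dynamics above can be sketched in NumPy (a minimal illustration, not from the original text; the 4-neuron network and its random symmetric weights are hypothetical):

```python
import numpy as np

def energy(w, s):
    """Hopfield energy E = -1/2 * sum_ij w_ij s_i s_j."""
    return -0.5 * s @ w @ s

def async_update(w, s):
    """One full sweep of asynchronous updates; returns the new state."""
    s = s.copy()
    for i in range(len(s)):
        h = w[i] @ s              # weighted input to neuron i
        s[i] = 1 if h >= 0 else -1
    return s

# Hypothetical 4-neuron network: symmetric weights, zero self-connections.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
w = (w + w.T) / 2
np.fill_diagonal(w, 0)

s = np.array([1, -1, 1, -1])
e0 = energy(w, s)
s = async_update(w, s)
assert energy(w, s) <= e0  # energy never increases under these dynamics
```

With symmetric weights and a zero diagonal, each single-neuron update can only lower (or preserve) the energy, which is why the state settles into a local minimum.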
The weights are set by the Hebbian learning rule, where x^p (p = 1, ..., P) are the patterns to be stored and N is the number of neurons:

w_{ij} = \frac{1}{N} \sum_{p=1}^{P} x_i^p x_j^p
This ensures that the stored patterns become stable attractors of the system.
During recall, each neuron is updated according to

s_i \leftarrow \operatorname{sign}\Big( \sum_j w_{ij} s_j \Big)

This rule ensures that each neuron aligns itself with the weighted sum of the
inputs it receives from other neurons.
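Putting the Hebbian learning rule and the neuron update rule together, storage and recall can be sketched as follows (a minimal sketch with two hypothetical 8-bit patterns; NumPy assumed):

```python
import numpy as np

def store(patterns):
    """Hebbian rule: w_ij = (1/N) * sum_p x_i^p x_j^p, zero diagonal."""
    N = patterns.shape[1]
    w = patterns.T @ patterns / N
    np.fill_diagonal(w, 0)
    return w

def recall(w, s, sweeps=10):
    """Iterate s_i <- sign(sum_j w_ij s_j) until the state stops changing."""
    s = s.copy()
    for _ in range(sweeps):
        prev = s.copy()
        for i in range(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
        if np.array_equal(s, prev):
            break
    return s

# Two hypothetical (orthogonal) 8-bit patterns.
x1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
x2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
w = store(np.stack([x1, x2]))

# Corrupt two bits of x1; the dynamics restore the stored pattern.
probe = x1.copy()
probe[0] = -probe[0]
probe[5] = -probe[5]
print(np.array_equal(recall(w, probe), x1))  # → True
```

This is the associative-memory behaviour the text describes: a partial or corrupted cue falls into the basin of attraction of the nearest stored pattern.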
Optimization

Beyond associative memory, a Hopfield network can be used to search for a
configuration that minimizes some cost function. For example, it has been used
for solving combinatorial optimization problems like the Travelling Salesman
Problem (TSP).
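As a toy illustration of the optimization use (a stand-in for TSP, which needs a far more elaborate encoding): setting w_ij = -A_ij for a graph adjacency matrix A turns energy minimization into a max-cut search, since the energy E = -1/2 Σ w_ij s_i s_j is lowered whenever neighbouring neurons take opposite signs. The graph below is hypothetical:

```python
import numpy as np

# Tiny hypothetical graph; w_ij = -A_ij encodes the max-cut cost.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]])
w = -A.astype(float)

def settle(w, s, sweeps=20):
    """Run asynchronous updates until the state stops changing."""
    s = s.copy()
    for _ in range(sweeps):
        prev = s.copy()
        for i in range(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
        if np.array_equal(s, prev):
            break
    return s

s = settle(w, np.array([1, 1, 1, 1]))
# Count edges whose endpoints ended up in different partitions.
cut = sum(A[i, j] for i in range(4) for j in range(i + 1, 4) if s[i] != s[j])
print(cut)  # → 4
```

Here the dynamics settle into a partition that cuts 4 of the graph's 5 edges; as with any Hopfield optimizer, only a local minimum of the cost is guaranteed.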
Modern Variants
Hopfield networks have inspired more sophisticated models, like Boltzmann
machines and Restricted Boltzmann Machines (RBMs), which introduce
probabilistic elements and address some of the limitations of Hopfield networks
in modern machine learning contexts.
Conclusion
The Hopfield network is a foundational model in neural networks and computa-
tional neuroscience, representing how systems might store and recall information.
It plays a role in understanding associative memory and optimization, though it
has limitations in capacity and efficiency. Its energy-based dynamics and ability
to retrieve patterns from incomplete information make it an important stepping
stone in neural network research.