Nondirect Convergence Radius and Number of Iterations of the Hopfield Associative Memory

Research output: Contribution to journal › Article › peer-review

Abstract

We consider a Hopfield associative memory consisting of n neurons, designed to store an m-set of n-dimensional ±1 statistically independent, uniformly distributed random vectors (fundamental memories), using a connection matrix constructed by the usual Hebbian rule. Previous results have indicated that the maximal value of m, such that almost all m vectors are stable points of the memory, in probability (i.e., with probability approaching one as n approaches infinity), is n / (2 log n) (n / (4 log n) if all m vectors must be stable simultaneously, in probability). Previous work further analyzed the direct convergence (i.e., convergence in one iteration) error-correcting power of the Hopfield memory. We rigorously analyze the general case of nondirect convergence, and prove that in the m = n / (2 log n) case, independently of the operation mode used (synchronous or asynchronous), almost all memories have an attraction radius of size n / 2 around them (in the n / (4 log n) case, all memories have such an attraction radius, in probability). This result, which was conjectured in the past but never proved rigorously, combined with an old converse result that the network cannot store more than n / (2 log n) (respectively, n / (4 log n)) fundamental memories, gives a full picture of the error-correcting power of the Hebbian Hopfield network. We also upper bound the number of iterations required to achieve convergence.
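The setting in the abstract — Hebbian storage of random ±1 vectors and iterative convergence back to a stored memory from a corrupted probe — can be illustrated with a minimal sketch. This is not the paper's analysis, only a toy demonstration of the model it studies; the parameters n, m and the 10% corruption level are illustrative choices well inside the stated capacity and attraction-radius regime.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 5  # n neurons; m fundamental memories, far below n / (2 log n)

# Fundamental memories: i.i.d. uniform ±1 vectors
X = rng.choice([-1, 1], size=(m, n))

# Hebbian outer-product connection matrix, zero self-connections
W = X.T @ X / n
np.fill_diagonal(W, 0.0)

def recall(probe, max_iters=50):
    """Synchronous ±1 threshold updates until a fixed point or iteration cap."""
    s = probe.copy()
    for _ in range(max_iters):
        s_new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            return s, True  # reached a stable point of the memory
        s = s_new
    return s, False

# Corrupt one memory by flipping 10% of its bits (well within radius n/2)
x = X[0].copy()
flips = rng.choice(n, size=n // 10, replace=False)
probe = x.copy()
probe[flips] *= -1

recovered, converged = recall(probe)
```

At this low loading the crosstalk term is small, so the corrupted probe converges back to the fundamental memory in very few iterations; the paper's contribution is the rigorous analysis of this convergence at the critical loading m = n / (2 log n).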

Original language: English
Pages (from-to): 838-847
Number of pages: 10
Journal: IEEE Transactions on Information Theory
Volume: 40
Issue number: 3
DOIs
State: Published - May 1994

Keywords

  • Neural networks
  • associative memory
  • memory capacity

