TY - JOUR

T1 - Nondirect Convergence Radius and Number of Iterations of the Hopfield Associative Memory

AU - Burshtein, David

PY - 1994/5

Y1 - 1994/5

N2 - We consider a Hopfield associative memory consisting of n neurons, designed to store an m-set of n-dimensional ± 1 statistically independent uniformly distributed random vectors (fundamental memories), using a connection matrix constructed by the usual Hebbian rule. Previous results have indicated that the maximal value of m, such that almost all m vectors are stable points of the memory, in probability (i.e., with probability approaching one as n approaches infinity), is n / (2 log n) (n / (4 log n) if all m vectors must be stable simultaneously, in probability). Previous work further analyzed the direct convergence (i.e., convergence in one iteration) error-correcting power of the Hopfield memory. We rigorously analyze the general case of nondirect convergence, and prove that in the m = n / (2 log n) case, independently of the operation mode used (synchronous or asynchronous), almost all memories have an attraction radius of size n / 2 around them (in the n / (4 log n) case, all memories have such an attraction radius, in probability). This result, which was conjectured in the past but was never proved rigorously, combined with an old converse result that the network cannot store more than n / (2 log n) (n / (4 log n)) fundamental memories, gives a full picture of the error-correcting power of the Hebbian Hopfield network. We also upper bound the number of iterations required to achieve convergence.

AB - We consider a Hopfield associative memory consisting of n neurons, designed to store an m-set of n-dimensional ± 1 statistically independent uniformly distributed random vectors (fundamental memories), using a connection matrix constructed by the usual Hebbian rule. Previous results have indicated that the maximal value of m, such that almost all m vectors are stable points of the memory, in probability (i.e., with probability approaching one as n approaches infinity), is n / (2 log n) (n / (4 log n) if all m vectors must be stable simultaneously, in probability). Previous work further analyzed the direct convergence (i.e., convergence in one iteration) error-correcting power of the Hopfield memory. We rigorously analyze the general case of nondirect convergence, and prove that in the m = n / (2 log n) case, independently of the operation mode used (synchronous or asynchronous), almost all memories have an attraction radius of size n / 2 around them (in the n / (4 log n) case, all memories have such an attraction radius, in probability). This result, which was conjectured in the past but was never proved rigorously, combined with an old converse result that the network cannot store more than n / (2 log n) (n / (4 log n)) fundamental memories, gives a full picture of the error-correcting power of the Hebbian Hopfield network. We also upper bound the number of iterations required to achieve convergence.

KW - Neural networks

KW - associative memory

KW - memory capacity

UR - http://www.scopus.com/inward/record.url?scp=0028425357&partnerID=8YFLogxK

U2 - 10.1109/18.335894

DO - 10.1109/18.335894

M3 - Article

AN - SCOPUS:0028425357

VL - 40

SP - 838

EP - 847

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

SN - 0018-9448

IS - 3

ER -