Optimal firing in sparsely-connected low-activity attractor networks

Isaac Meilijson*, Eytan Ruppin

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

We examine the performance of Hebbian-like attractor neural networks that recall stored memory patterns from distorted versions of them. Searching for an activation (firing-rate) function that maximizes performance in sparsely connected, low-activity networks, we show that the optimal activation function is a threshold-sigmoid of the neuron's input field. This function corresponds closely to the dependence of the firing rate of cortical neurons on their integrated input current, as described by neurophysiological recordings and conductance-based models. It also accounts for the decreasing-density shape of the distribution of firing rates reported in the literature.
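The abstract states only the qualitative form of the result: a threshold-sigmoid activation applied to the input field in a sparse, low-activity Hebbian network. The Python sketch below is therefore purely illustrative, not the paper's actual model: the piecewise activation (zero below a threshold, a logistic ramp above it), the covariance-style Hebbian rule, and all parameter values (N, P, p, c, theta, beta) are assumptions chosen to make the recall dynamics concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000      # neurons
P = 20        # stored memory patterns
p = 0.1       # coding level: fraction of active neurons ("low activity")
c = 0.2       # connection probability ("sparse connectivity")
theta = 0.5   # firing threshold (assumed value)
beta = 4.0    # sigmoid gain (assumed value)

# Low-activity binary memory patterns.
patterns = (rng.random((P, N)) < p).astype(float)

# Hebbian-like covariance weights on a sparse random connectivity mask.
mask = rng.random((N, N)) < c
W = ((patterns - p).T @ (patterns - p)) * mask / (c * N * p * (1 - p))
np.fill_diagonal(W, 0.0)

def threshold_sigmoid(h):
    """Zero below the threshold, a logistic ramp above it (assumed form)."""
    z = np.clip(beta * (h - theta), -50.0, 50.0)
    return np.where(h < theta, 0.0, 1.0 / (1.0 + np.exp(-z)))

# Recall: start from a distorted version of a stored pattern.
state = patterns[0].copy()
flip = rng.random(N) < 0.05          # distort 5% of the neurons
state[flip] = 1.0 - state[flip]

for _ in range(10):                  # synchronous update steps
    h = W @ state                    # each neuron's input field
    state = threshold_sigmoid(h)

# Normalized overlap with the original memory (1.0 = perfect recall).
overlap = (state - p) @ (patterns[0] - p) / (N * p * (1 - p))
print(f"overlap with stored pattern: {overlap:.3f}")
```

The final overlap measures how well the network's attractor state matches the stored pattern after the threshold-sigmoid dynamics; under these assumed parameters it should approach 1.0 for mildly distorted cues.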

Original language: English
Pages (from-to): 479-485
Number of pages: 7
Journal: Biological Cybernetics
Volume: 74
Issue number: 6
DOIs:
State: Published - 1996

