Abstract
We examine the performance of Hebbian-like attractor neural networks in recalling stored memory patterns from distorted versions of them. Searching for an activation (firing-rate) function that maximizes recall performance in sparsely connected low-activity networks, we show that the optimal activation function is a threshold-sigmoid of the neuron's input field. This function is shown to be in close correspondence with the dependence of the firing rate of cortical neurons on their integrated input current, as described by neurophysiological recordings and conductance-based models. It also accounts for the decreasing-density shape of firing-rate distributions that has been reported in the literature.
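The threshold-sigmoid activation mentioned above can be illustrated with a minimal sketch: zero output below a firing threshold, and a saturating sigmoid of the suprathreshold input above it. The specific parameterization below (a `tanh` of the suprathreshold field with threshold `theta` and slope `gain`) is a hypothetical choice for illustration, not the exact form derived in the paper.

```python
import math

def threshold_sigmoid(h, theta=0.0, gain=1.0):
    """Illustrative threshold-sigmoid activation.

    h     : input field (integrated input current) of the neuron
    theta : firing threshold (hypothetical parameter name)
    gain  : slope of the sigmoid above threshold (hypothetical)

    Returns 0 below threshold; otherwise a saturating sigmoid
    of the suprathreshold input, bounded in [0, 1).
    """
    if h <= theta:
        return 0.0
    return math.tanh(gain * (h - theta))

# Example: subthreshold inputs are silenced, suprathreshold
# inputs produce a graded, saturating firing rate.
rates = [threshold_sigmoid(h, theta=0.5, gain=2.0)
         for h in (-1.0, 0.25, 0.75, 2.0)]
```

In a low-activity network, the hard threshold keeps most neurons silent (sparse activity), while the saturating upper branch bounds firing rates, which is qualitatively the behavior the abstract attributes to cortical neurons.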
| Original language | English |
|---|---|
| Pages (from-to) | 479-485 |
| Number of pages | 7 |
| Journal | Biological Cybernetics |
| Volume | 74 |
| Issue number | 6 |
| DOIs | |
| State | Published - 1996 |