Abstract
We study the selectivity properties of neurons based on the BCM and kurtosis energy functions in the general case of a noisy, high-dimensional input space. The proposed approach, used to characterize the stable states, can be generalized to a whole class of energy functions. We characterize the critical noise levels beyond which selectivity is destroyed and perform a quantitative analysis of these transitions, which reveals an interesting dependence on data set size. We observe that the robustness to noise of the BCM neuron (Bienenstock, Cooper, & Munro, 1982; Intrator & Cooper, 1992) increases with dimensionality. We explicitly compute the separability limit of the BCM and kurtosis learning rules in the case of a bimodal input distribution. Numerical simulations show that the BCM rule is more robust than kurtosis for practical data set sizes.
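For readers unfamiliar with the two learning rules named in the abstract, the sketch below illustrates a single linear neuron c = m · d trained with (i) the standard BCM modification rule with a sliding threshold, dm ∝ c(c − θ)d with θ tracking E[c²], and (ii) gradient ascent on the kurtosis objective K(m) = E[c⁴] − 3E²[c²], on a synthetic noisy bimodal input distribution. This is a minimal sketch under assumed textbook forms of both rules; the data construction, learning rates, and function names are illustrative choices and not the authors' actual simulations.

```python
# Illustrative sketch (not the authors' code): BCM and kurtosis-based learning
# for a single linear neuron c = m . d on a noisy bimodal input distribution.
import numpy as np

rng = np.random.default_rng(0)

def bimodal_inputs(n_samples, dim, noise_std):
    """Two clusters along fixed orthogonal directions plus isotropic Gaussian noise."""
    d1 = np.zeros(dim); d1[0] = 1.0          # first mode direction
    d2 = np.zeros(dim); d2[1] = 1.0          # second mode direction
    labels = rng.integers(0, 2, n_samples)
    base = np.where(labels[:, None] == 0, d1, d2)
    return base + noise_std * rng.standard_normal((n_samples, dim))

def bcm_step(m, d, theta, lr=0.01, tau=100.0):
    """One BCM update: dm ~ c (c - theta) d, with theta a running estimate of E[c^2]."""
    c = m @ d
    m = m + lr * c * (c - theta) * d
    theta = theta + (c ** 2 - theta) / tau
    return m, theta

def kurtosis_step(m, data, lr=0.001):
    """Batch gradient ascent on K(m) = E[c^4] - 3 E[c^2]^2."""
    c = data @ m
    grad = (4 * (data * (c ** 3)[:, None]).mean(0)
            - 12 * (c ** 2).mean() * (data * c[:, None]).mean(0))
    m = m + lr * grad
    return m / np.linalg.norm(m)              # keep the weight vector bounded

dim, noise_std = 20, 0.2
data = bimodal_inputs(5000, dim, noise_std)

m_bcm, theta = rng.standard_normal(dim) * 0.1, 1.0
for d in data:
    m_bcm, theta = bcm_step(m_bcm, d, theta)

m_kurt = rng.standard_normal(dim) * 0.1
for _ in range(500):
    m_kurt = kurtosis_step(m_kurt, data)

# A selective weight vector responds strongly to one mode and weakly to the other;
# m[0] and m[1] are the (noise-free) responses to the two mode directions.
print("BCM responses to the two modes:     ", m_bcm[0], m_bcm[1])
print("Kurtosis responses to the two modes:", m_kurt[0], m_kurt[1])
```

Raising `noise_std` or the input dimension `dim` in this sketch is one way to probe, informally, the kind of noise- and dimensionality-dependent selectivity loss that the paper analyzes.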
Field | Value
---|---
Original language | English
Pages (from-to) | 1621-1640
Number of pages | 20
Journal | Neural Computation
Volume | 15
Issue number | 7
DOIs | 
State | Published - Jul 2003
Externally published | Yes