Optimal spontaneous activity in neural network modeling

D. Remondini*, N. Intrator, G. Castellani, F. Bersani, L. N. Cooper

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the origin of the high-dimensional input space as a variable that can be optimized before or during neuronal learning. This set of variables acts as a translation of the input space in order to find an optimal origin, and can be seen as adaptive data preprocessing included in a more general learning rule. In this framework, we can give a realistic biological interpretation to the new model. The proposed modification rule achieves the original objective of neuronal learning while keeping the energy consumption required for synaptic modification at a minimal level. This presynaptic bias can be related to the concept of "optimal spontaneous activity". It extends the properties of familiar models such as Kurtosis, PCA, ICA and BCM, resulting in new insight and better solutions for problems such as clustering, feature extraction and data compression. The new learning rule offers an alternative to the classical approach to distinguishing between two clusters: unlike Fisher discriminant analysis, where two (symmetric) clusters are separated by a line that passes through their centers, our separation is achieved by a shift of the coordinate system to a location where one cluster is orthogonal to the separating vector and the other is not.
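The separation-by-shift idea can be illustrated with a minimal sketch on synthetic data. This is an assumption-laden toy example, not the paper's actual learning rule: the shift is taken to be the mean of one cluster and the separating vector the direction between the cluster means, so that after the shift one cluster projects to roughly zero while the other does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic 2-D clusters (hypothetical data, for illustration only)
cluster_a = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(100, 2))
cluster_b = rng.normal(loc=[3.0, 1.0], scale=0.3, size=(100, 2))

# Shift the origin to the centre of cluster A (playing the role of the
# "presynaptic bias"): after the shift, cluster A is roughly orthogonal
# to the separating vector (its projections average to zero), while
# cluster B projects far from the origin.
origin = cluster_a.mean(axis=0)
w = cluster_b.mean(axis=0) - origin          # assumed separating direction
w /= np.linalg.norm(w)

proj_a = (cluster_a - origin) @ w            # averages to ~0 by construction
proj_b = (cluster_b - origin) @ w            # clearly non-zero

print(abs(proj_a.mean()) < 0.1)              # prints True
print(proj_b.mean() > 1.0)                   # prints True
```

Contrast with Fisher discriminant analysis, where the separating line passes through both cluster centers: here the asymmetry of the shifted coordinate system itself distinguishes the clusters.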

Original language: English
Pages (from-to): 591-595
Number of pages: 5
Journal: Neurocomputing
Volume: 44-46
DOIs
State: Published - 2002
Externally published: Yes

Keywords

  • Adaptive preprocessing
  • Cluster analysis
  • Synaptic plasticity
  • Unsupervised learning
