TY - JOUR
T1 - Optimal spontaneous activity in neural network modeling
AU - Remondini, D.
AU - Intrator, N.
AU - Castellani, G.
AU - Bersani, F.
AU - Cooper, L. N.
PY - 2002
Y1 - 2002
AB - We consider the origin of the high-dimensional input space as a variable that can be optimized before or during neuronal learning. This set of variables acts as a translation of the input space toward an optimal origin, and can be seen as an adaptive data preprocessing step included in a more general learning rule. In this framework, we can give a realistic biological interpretation to the new model. The proposed modification rule achieves the original objective of neuronal learning while keeping the energy consumption required for synaptic modification at a minimal level. This presynaptic bias can be related to the concept of "optimal spontaneous activity". It extends the properties of familiar models such as Kurtosis, PCA, ICA and BCM, resulting in new insight and better solutions for problems such as clustering, feature extraction and data compression. The new learning rule is competitive with the fundamental approach of distinguishing between two clusters: unlike Fisher discriminant analysis, where two (symmetric) clusters are separated by a line that passes through their centers, our separation is achieved by a shift of the coordinate system to a location where one cluster is orthogonal to the separating vector and the other is not.
KW - Adaptive preprocessing
KW - Cluster analysis
KW - Synaptic plasticity
KW - Unsupervised learning
UR - http://www.scopus.com/inward/record.url?scp=0036069690&partnerID=8YFLogxK
U2 - 10.1016/S0925-2312(02)00445-9
DO - 10.1016/S0925-2312(02)00445-9
M3 - Article
AN - SCOPUS:0036069690
VL - 44-46
SP - 591
EP - 595
JO - Neurocomputing
JF - Neurocomputing
SN - 0925-2312
ER -