Dynamic proximity of spatio-temporal sequences

David Horn, Gideon Dror, Brigitte Quenet

Research output: Contribution to journal › Article › peer-review

Abstract

Recurrent networks can generate spatio-temporal neural sequences with very large cycles, exhibiting apparently random behavior. Nonetheless, a proximity measure between these sequences may be defined by comparing the synaptic weight matrices that generate them. Following the dynamic neural filter (DNF) formalism, we demonstrate this concept by comparing teacher and student recurrent networks of binary neurons. We show that large sequences, providing a training set well exceeding the Cover limit, allow for good determination of the synaptic matrices. Alternatively, assuming the matrices to be known, very fast determination of the biases can be achieved. Thus, a spatio-temporal sequence may be regarded as a spatio-temporal encoding of the bias vector. We introduce a linear support vector machine (SVM) variant of the DNF in order to specify an optimal weight matrix. This approach allows us to deal with noise. Spatio-temporal sequences generated by different DNFs with the same number of neurons may be compared by calculating correlations between the synaptic matrices of the reconstructed DNFs. Other types of spatio-temporal sequences require the introduction of hidden neurons and/or the use of a kernel variant of the SVM approach. The latter is defined as a recurrent support vector network (RSVN).
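The two ideas in the abstract — a DNF generating a binary spatio-temporal sequence, and the sequence acting as an encoding of the bias vector when the weight matrix is known — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Heaviside update form, network size, and random weights are assumptions made for the example.

```python
import numpy as np

def dnf_step(state, W, b):
    # Assumed DNF update for binary neurons:
    # n_i(t+1) = 1 if sum_j W_ij * n_j(t) + b_i > 0, else 0
    return (W @ state + b > 0).astype(int)

rng = np.random.default_rng(0)
N = 8                              # illustrative network size
W = rng.normal(size=(N, N))        # hypothetical synaptic matrix
b = rng.normal(size=N)             # hypothetical bias vector
state = rng.integers(0, 2, N)

# Generate a spatio-temporal sequence of 21 states.
seq = [state]
for _ in range(20):
    state = dnf_step(state, W, b)
    seq.append(state)
seq = np.array(seq)                # shape (21, N)

# Bias recovery with W known: each transition yields a one-sided
# linear constraint on b_i, so the sequence brackets each bias.
fields = seq[:-1] @ W.T            # local fields W n(t) at every step
lo = np.full(N, -np.inf)
hi = np.full(N, np.inf)
for i in range(N):
    on = seq[1:, i] == 1
    if on.any():
        lo[i] = (-fields[on, i]).max()    # b_i > -W_i . n(t) when i fires
    if (~on).any():
        hi[i] = (-fields[~on, i]).min()   # b_i <= -W_i . n(t) otherwise
# The true bias vector lies componentwise in (lo_i, hi_i]; longer
# sequences tighten these intervals, i.e. the sequence encodes b.
```

With more transitions the intervals `(lo_i, hi_i]` shrink, which is one way to read the abstract's claim that the bias determination is very fast once the matrix is known; reconstructing W itself instead amounts to one linear classification problem per neuron, where the paper's linear SVM variant selects an optimal weight matrix.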

Original language: English
Pages (from-to): 1002-1008
Number of pages: 7
Journal: IEEE Transactions on Neural Networks
Volume: 15
Issue number: 5
DOIs
State: Published - Sep 2004

