Efficient training of recurrent neural network with time delays

Barak Cohen, David Saad, Emanuel Marom

Research output: Contribution to journal › Article › peer-review

Abstract

Training recurrent neural networks to perform certain tasks is known to be difficult, and adding synaptic time delays to the network makes the training task harder still. However, the drawback of a tougher training procedure is offset by improved network performance. In the course of our research on training neural networks with time delays, we found a robust method for accomplishing the training task. The method is based on the adaptive simulated annealing (ASA) algorithm, which was found to be superior to other training algorithms: it requires no tuning and is fast enough for training to be carried out on low-end platforms such as personal computers. The implementation of the algorithm is demonstrated on a set of typical benchmark tests for training recurrent neural networks with time delays.
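The abstract does not give implementation details, but the general idea — annealing the weights of a recurrent unit whose feedback arrives after a fixed synaptic delay — can be sketched as follows. This is a minimal illustration, not the paper's method: it uses a plain simulated-annealing loop with a simple cooling schedule rather than full ASA (which adds per-parameter temperatures and reannealing), and all function names, the toy task, and the step sizes are assumptions made for the example.

```python
import math
import random

def run_delayed_rnn(w, inputs, delay=2):
    """Single recurrent tanh unit whose own output is fed back
    after `delay` time steps (a synaptic time delay)."""
    w_in, w_rec, b = w
    history = [0.0] * delay          # zero-padded past outputs
    outputs = []
    for x in inputs:
        y = math.tanh(w_in * x + w_rec * history[-delay] + b)
        history.append(y)
        outputs.append(y)
    return outputs

def cost(w, inputs, targets):
    """Sum-of-squares error of the delayed network on a sequence."""
    outs = run_delayed_rnn(w, inputs)
    return sum((o - t) ** 2 for o, t in zip(outs, targets))

def anneal(inputs, targets, steps=5000, t0=1.0, seed=0):
    """Simplified simulated annealing over the three weights.
    Accepts uphill moves with probability exp(-dE / T)."""
    rng = random.Random(seed)
    w = [rng.uniform(-1.0, 1.0) for _ in range(3)]
    c = cost(w, inputs, targets)
    best_w, best_c = list(w), c
    for k in range(steps):
        temp = t0 / (1.0 + k)        # simple cooling schedule (not ASA's)
        cand = [wi + rng.gauss(0.0, 0.3) for wi in w]
        cc = cost(cand, inputs, targets)
        if cc < c or rng.random() < math.exp(-(cc - c) / max(temp, 1e-12)):
            w, c = cand, cc
            if c < best_c:
                best_w, best_c = list(w), c
    return best_w, best_c
```

A toy usage, with a target sequence that the delayed unit can represent: `anneal([math.sin(0.3 * i) for i in range(40)], [math.tanh(0.8 * x) for x in [math.sin(0.3 * i) for i in range(40)]])` drives the error down from its random starting value. The appeal of annealing here, as the abstract notes, is that it needs no gradient through the delayed recurrence and little tuning.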

Original language: English
Pages (from-to): 51-59
Number of pages: 9
Journal: Neural Networks
Volume: 10
Issue number: 1
DOIs
State: Published - Jan 1997

Keywords

  • recurrent neural networks
  • synaptic time delays
  • training

