Efficient training of recurrent neural network with time delays

Barak Cohen, David Saad*, Emanuel Marom

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Training recurrent neural networks to perform certain tasks is known to be difficult, and adding synaptic delays to the network makes the training task harder still. However, the cost of a tougher training procedure is offset by improved network performance. In the course of our research on training neural networks with time delays we developed a robust method for accomplishing the training task. The method is based on the adaptive simulated annealing (ASA) algorithm, which we found to be superior to other training algorithms. It requires no tuning and is fast enough to allow training on low-end platforms such as personal computers. The implementation of the algorithm is demonstrated on a set of typical benchmark tests for training recurrent neural networks with time delays.
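The abstract does not give the paper's implementation details, so the following is only a hedged illustration of the general idea: a single recurrent unit whose feedback connection carries a synaptic delay, with its two weights fitted by a plain simulated-annealing loop. All names (`run_network`, `anneal`), the cooling schedule, and the toy task are illustrative assumptions, not the paper's ASA variant (ASA additionally adapts its temperature schedule per parameter).

```python
import math
import random

def run_network(w_in, w_delay, delay, inputs):
    """Simulate one recurrent unit whose feedback arrives `delay` steps late.
    (Toy stand-in for a recurrent network with synaptic time delays.)"""
    outputs = []
    for t, x in enumerate(inputs):
        fb = outputs[t - delay] if t >= delay else 0.0  # delayed self-feedback
        outputs.append(math.tanh(w_in * x + w_delay * fb))
    return outputs

def loss(params, delay, inputs, targets):
    """Sum-of-squares error between network output and target sequence."""
    w_in, w_delay = params
    out = run_network(w_in, w_delay, delay, inputs)
    return sum((o - y) ** 2 for o, y in zip(out, targets))

def anneal(delay, inputs, targets, steps=2000, t0=1.0, seed=0):
    """Plain simulated annealing over the two weights -- a simplification of
    adaptive simulated annealing, which also adapts its cooling schedule."""
    rng = random.Random(seed)
    params = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    e = loss(params, delay, inputs, targets)
    best, e_best = list(params), e
    for k in range(steps):
        temp = t0 / (1 + k)  # simple hyperbolic cooling (illustrative choice)
        cand = [p + rng.gauss(0, 0.3) for p in params]
        e_cand = loss(cand, delay, inputs, targets)
        # Metropolis acceptance: always take improvements, sometimes take worse moves
        if e_cand < e or rng.random() < math.exp(-(e_cand - e) / max(temp, 1e-9)):
            params, e = cand, e_cand
            if e < e_best:
                best, e_best = list(cand), e
    return best, e_best
```

A usage sketch: generate a target sequence from known weights, then check that annealing drives the error below that of the random starting point. Because the objective is evaluated only by forward simulation, the delayed feedback poses no special difficulty for annealing, unlike for gradient-based training.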

Original language: English
Pages (from-to): 51-59
Number of pages: 9
Journal: Neural Networks
Issue number: 1
State: Published - Jan 1997


  • recurrent neural networks
  • synaptic time delays
  • training


