Abstract
Training recurrent neural networks to perform certain tasks is known to be difficult, and adding synaptic delays to the network makes the training task harder still. However, the drawback of a tougher training procedure is offset by the improved network performance. During our research on training neural networks with time delays we arrived at a robust method for accomplishing the training task. The method is based on the adaptive simulated annealing (ASA) algorithm, which we found to be superior to other training algorithms: it requires no tuning and is fast enough for training to be carried out on low-end platforms such as personal computers. The algorithm is demonstrated on a set of typical benchmark tests for training recurrent neural networks with time delays.
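To illustrate the kind of training the abstract describes, the sketch below uses plain simulated annealing to search the weights of a tiny recurrent network whose recurrent connections act through explicit synaptic delay taps. This is only a minimal, assumed example, not the paper's ASA implementation: the toy delayed-recall task, the network sizes, and the geometric cooling schedule are all illustrative choices, and the adaptive (ASA) variant additionally keeps per-parameter temperatures and re-anneals, which is omitted here.

```python
# Minimal sketch (assumed, not the paper's ASA code): simulated annealing
# applied to the weights of a small recurrent network with synaptic delays.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: the output at step t should recall the input seen DELAY steps earlier.
DELAY, T = 3, 40
x = rng.choice([0.0, 1.0], size=T)
y = np.concatenate([np.zeros(DELAY), x[:-DELAY]])

N_HIDDEN, MAX_DELAY = 5, 4          # hidden units, number of delayed recurrent taps

def unpack(theta):
    """Split the flat parameter vector into input, recurrent, and output weights."""
    i = 0
    w_in = theta[i:i + N_HIDDEN]; i += N_HIDDEN
    w_rec = theta[i:i + MAX_DELAY * N_HIDDEN * N_HIDDEN].reshape(
        MAX_DELAY, N_HIDDEN, N_HIDDEN); i += MAX_DELAY * N_HIDDEN * N_HIDDEN
    w_out = theta[i:i + N_HIDDEN]
    return w_in, w_rec, w_out

def loss(theta):
    """Mean squared error of the delayed-recall task."""
    w_in, w_rec, w_out = unpack(theta)
    h_hist = np.zeros((MAX_DELAY, N_HIDDEN))   # ring buffer of past hidden states
    err = 0.0
    for t in range(T):
        # Delay tap d feeds the hidden state from (d + 1) steps in the past.
        rec = sum(w_rec[d] @ h_hist[d] for d in range(MAX_DELAY))
        h = np.tanh(w_in * x[t] + rec)
        err += (w_out @ h - y[t]) ** 2
        h_hist = np.vstack([h, h_hist[:-1]])    # shift the delay line
    return err / T

n_params = N_HIDDEN + MAX_DELAY * N_HIDDEN ** 2 + N_HIDDEN
theta = rng.normal(scale=0.1, size=n_params)
cur_loss = loss(theta)
best, best_loss = theta.copy(), cur_loss
temp = 1.0
for step in range(20000):
    cand = theta + rng.normal(scale=0.05, size=n_params)
    cand_loss = loss(cand)
    # Metropolis acceptance: always take improvements, sometimes accept worse moves.
    if cand_loss < cur_loss or rng.random() < np.exp((cur_loss - cand_loss) / temp):
        theta, cur_loss = cand, cand_loss
        if cur_loss < best_loss:
            best, best_loss = theta.copy(), cur_loss
    temp *= 0.9997                              # geometric cooling schedule

print(f"final training MSE: {best_loss:.4f}")
```

Because the annealing search only ever evaluates the loss, it needs no gradient through the delayed recurrence, which is one reason annealing-style methods are attractive for networks with time delays.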
| Original language | English |
| --- | --- |
| Pages (from-to) | 51-59 |
| Number of pages | 9 |
| Journal | Neural Networks |
| Volume | 10 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 1997 |
Keywords
- recurrent neural networks
- synaptic time delays
- training