Division of Informatics

Forrest Hill & 80 South Bridge

## MSc Thesis #92153

**Title:** Real-Time Recurrent Learning for Neural Networks: Reducing Training Time and Learning Linear Tasks

**Authors:** Tipping, M. E.

**Date:** 1992


**Abstract:** The project work described in this report falls into two main parts, both concerning the training of fully recurrent artificial neural networks via Williams and Zipser's Real-Time Recurrent Learning (RTRL) algorithm. In the first, I examined how methods for accelerating learning in multilayer back-propagation networks might also be applied to RTRL problems, with the aim of similarly minimising training times. None of the techniques I examined proved effective in this respect, with the exception of symmetric inputs and activation values, and then only for certain problems. In most cases, I concluded that this lack of success was due to the dynamical nature of the RTRL algorithm. For the second part, I examined how recurrent networks may learn to perform tasks that require linear computation. By adding extra, linear, "context" units, I was able to train a general recurrent network, using RTRL, to act as a simple digital filter. The non-linear recurrent units, which are essential for learning purposes, appeared to have no effect on the output of the trained network. I then demonstrated that this was possible because the vectors, each composed of the weights into a particular recurrent unit from all other such units, became linearly dependent during training.
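The abstract refers to Williams and Zipser's RTRL algorithm, which trains a fully recurrent network online by carrying a sensitivity tensor p[k, i, j] = dy_k/dw_ij forward in time alongside the network state. As a rough illustration only, here is a minimal NumPy sketch of that update on a one-step-delay echo task; the network size, task, learning rate, and names such as `rtrl_train` are illustrative assumptions, not details taken from the thesis:

```python
import numpy as np

def rtrl_train(seq_in, seq_target, n_units=4, lr=0.1, seed=0):
    """Train a fully recurrent tanh network with RTRL; unit 0 is the output."""
    rng = np.random.default_rng(seed)
    n_in = 2                       # one external input line plus a bias
    n_z = n_units + n_in           # concatenated signal vector z = [y, x, 1]
    W = rng.normal(0.0, 0.3, (n_units, n_z))
    y = np.zeros(n_units)
    # p[k, i, j] = d y_k / d w_ij, the RTRL sensitivity tensor
    p = np.zeros((n_units, n_units, n_z))
    losses = []
    for x, d in zip(seq_in, seq_target):
        z = np.concatenate([y, [x, 1.0]])
        y_new = np.tanh(W @ z)
        fprime = 1.0 - y_new ** 2
        # sensitivity recursion: p_k,ij <- f'(s_k) [ sum_l W_kl p_l,ij + delta_ki z_j ]
        p_new = np.einsum('kl,lij->kij', W[:, :n_units], p)
        for i in range(n_units):
            p_new[i, i, :] += z
        p_new *= fprime[:, None, None]
        # online gradient step on the squared error of the output unit
        e = d - y_new[0]
        W += lr * e * p_new[0]
        y, p = y_new, p_new
        losses.append(e ** 2)
    return losses

# Task: echo the input one time step later (a trivial "memory" problem).
T = 200
x = np.sign(np.sin(np.arange(T) * 0.3))
target = np.roll(x, 1)
losses = rtrl_train(x, target)
```

Note how the sensitivity update costs O(n^4) operations per time step for n units, which is the main practical drawback of RTRL relative to back-propagation through time.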

**Download:** No online copy