Carnegie Mellon University

Dynamic recurrent neural networks

Journal contribution, posted on 2001-06-01, authored by Barak Pearlmutter
Abstract: "We survey learning algorithms for recurrent neural networks with hidden units and attempt to put the various techniques into a common framework. We discuss fixpoint learning algorithms, namely recurrent backpropagation and deterministic Boltzmann Machines, and non-fixpoint algorithms, namely backpropagation through time, Elman's history cutoff nets, and Jordan's output feedback architecture. Forward propagation, an online technique that uses adjoint equations, is also discussed. In many cases, the unified presentation leads to generalizations of various sorts. Some simulations are presented, and at the end, issues of computational complexity are addressed."
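To make the abstract's terminology concrete, the following is a minimal sketch of backpropagation through time (BPTT), one of the non-fixpoint algorithms the survey covers: the recurrence is unrolled over the sequence and error gradients are propagated backwards through the unrolled copies. This is an illustrative example only, not the paper's code; the network shape (a tiny tanh RNN), the squared-error loss, and all variable names are assumptions chosen for brevity.

# Minimal BPTT sketch for a vanilla RNN (illustrative assumption, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

# Tiny RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1}), y_t = W_hy h_t
n_in, n_hid, n_out, T = 3, 5, 2, 10
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))
W_hy = rng.normal(scale=0.1, size=(n_out, n_hid))

xs = rng.normal(size=(T, n_in))        # input sequence
ys = rng.normal(size=(T, n_out))       # target sequence

# Forward pass: unroll the recurrence in time, storing hidden states.
hs = np.zeros((T + 1, n_hid))          # hs[0] is the initial state
preds = np.zeros((T, n_out))
for t in range(T):
    hs[t + 1] = np.tanh(W_xh @ xs[t] + W_hh @ hs[t])
    preds[t] = W_hy @ hs[t + 1]
loss = 0.5 * np.sum((preds - ys) ** 2)

# Backward pass: propagate error gradients backwards through the
# unrolled network, accumulating weight gradients at every time step.
dW_xh = np.zeros_like(W_xh)
dW_hh = np.zeros_like(W_hh)
dW_hy = np.zeros_like(W_hy)
dh_next = np.zeros(n_hid)              # gradient arriving from step t+1
for t in reversed(range(T)):
    dy = preds[t] - ys[t]              # dL/dy_t for squared error
    dW_hy += np.outer(dy, hs[t + 1])
    dh = W_hy.T @ dy + dh_next         # total gradient at h_t
    dz = (1.0 - hs[t + 1] ** 2) * dh   # back through the tanh
    dW_xh += np.outer(dz, xs[t])
    dW_hh += np.outer(dz, hs[t])
    dh_next = W_hh.T @ dz              # pass gradient to the earlier step

print("loss:", loss)
print("||dW_hh||:", np.linalg.norm(dW_hh))

By contrast, the online "forward propagation" technique mentioned in the abstract carries sensitivity information forward in time alongside the network state rather than unrolling and sweeping backwards.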

History

Publisher Statement

© ACM, 2001. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution.

Date

2001-06-01
