Carnegie Mellon University

The meta-generalized delta rule: a new algorithm for learning in connectionist networks

Journal contribution posted on 1994-01-01, authored by Dean A. Pomerleau, Artificial Intelligence and Psychology Project.
Abstract: "Currently the most popular learning algorithm for connectionist networks is the generalized delta rule (GDR) developed by Rumelhart, Hinton & Williams (1986). The GDR learns by performing gradient descent on the error surface in weight space whose height at any point is equal to a measure of the network's error. The GDR is plagued by two major problems. First, the progress towards a solution using the GDR is often quite slow. Second, networks employing the GDR frequently become trapped in local minima on the error surface and hence do not reach good solutions. To solve the problems of the GDR, a new connectionist architecture and learning algorithm is developed in this thesis.The new architectural components are called meta-connections, which are connections from a unit to the connection between two other units. Meta-connections are able to temporarily alter the weight of the connection to which they are connected. In doing this, meta-connections are able to tailor the weights of individual connections for particular input/output patterns. The new learning algorithm, called the meta-generalized delta rule (MGDR), is an extension of the GDR to provide for learning the proper weights for meta-connections. Empirical tests show that the tailoring of weights using meta-connections allows the MGDR to develop solutions more quickly and reliably than the GDR in a wide range of problems."

History

Date: 1994-01-01
