Carnegie Mellon University

A probabilistic computational framework for neural network models

journal contribution
posted on 1987-01-01, authored by Richard M. Golden, Artificial Intelligence and Psychology Project.
Abstract: "Information retrieval in a "connectionist" or neural network is viewed as computing the most probable value of the information to be retrieved with respect to a probability density function, P. with a minimal number of assumptions, the "energy" function that a neural network minimizes during information retrieval is shown to uniquely specify P. Inspection of the form of P indicates the class of probabilistic environments that can be learned.Learning algorithms can be analyzed and designed by using maximum likelihood estimation techniques to estimate the parameters of P. The large class of nonlinear auto-associative networks analyzed by Cohen and Grossberg (1983), nonlinear associative multi-layer back-propagation networks (Rumelhart, Hinton, & Williams, 1986), and certain classes of nonlinear multi-stage networks are analyzed within the proposed computational framework."

History

Publisher Statement

All Rights Reserved

Date

1987-01-01

