Carnegie Mellon University

Scaling properties of coarse-coded symbol memories

journal contribution
Posted on 2000-05-01. Authored by Ronald Rosenfeld, David S. Touretzky, Artificial Intelligence and Psychology Project.
Abstract: "Coarse coded memories have appeared in several neural network symbol processing models, such as Touretzky and Hinton's distributed connectionist production system DCPS, Touretzky's distributed implementation of Lisp S-expressions on a Boltzmann machine, and St. John and McClelland's PDP model of case role defaults. In order to determine how these models would scale, one must first have some understanding of the mathematics of coarse coded representations. For example, the working memory of DCPS, which stores triples of symbols and consists of 2,000 units, can hold roughly 20 items at a time out of a 15,625-symbol alphabet. How would DCPS scale if the alphabet size were raised to 50,000? With the current alphabet size, how many units would have to be added simply to double the working memory capacity to 40 triples? We present some analytical results related to these questions."
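
To make the capacity question concrete, below is a minimal sketch (not the DCPS construction described in the paper) of a coarse-coded symbol memory. It assumes each symbol is assigned a fixed random receptive field of units, a symbol is stored by turning its units on, and a symbol reads as present when all of its units are on; the unit pool size and alphabet size follow the abstract, while FIELD_SIZE is a hypothetical parameter chosen for illustration.

    import random

    NUM_UNITS = 2000        # size of the unit pool (from the abstract)
    ALPHABET_SIZE = 15625   # number of distinct symbols (from the abstract)
    FIELD_SIZE = 28         # units per symbol code (hypothetical value)

    random.seed(0)

    # Assign each symbol a random set of units (its receptive field).
    codes = [frozenset(random.sample(range(NUM_UNITS), FIELD_SIZE))
             for _ in range(ALPHABET_SIZE)]

    def store(memory, symbol):
        """Store a symbol by turning on every unit in its receptive field."""
        return memory | codes[symbol]

    def present(memory, symbol):
        """A symbol reads as present when all of its units are on."""
        return codes[symbol] <= memory

    # Store 20 symbols, then count "ghosts": symbols that read as present
    # even though they were never stored.
    stored = set(random.sample(range(ALPHABET_SIZE), 20))
    memory = frozenset()
    for s in stored:
        memory = store(memory, s)

    ghosts = sum(present(memory, s)
                 for s in range(ALPHABET_SIZE) if s not in stored)
    print(f"units on: {len(memory)} / {NUM_UNITS}, ghosts: {ghosts}")

In a sketch like this, capacity is reached when enough units are on that previously unstored symbols start reading as present; storing more items, enlarging the alphabet, or shrinking the unit pool all raise the ghost count, which is the trade-off the paper analyzes.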

History

Date

2000-05-01
