Posted on 1988-01-01, authored by Dean C. Mumme, Artificial Intelligence and Psychology Project.
Abstract: "This paper presents a characterization of a simple connectionist system, the linear associator, as both a memory and a classifier. Toward this end, a theory of memory based on information theory is devised. The principles of the information theory of memory are then used in conjunction with the dynamics of the linear associator to discern its storage capacity and classification capabilities as they scale with system size.

To determine storage capacity, a set of M vector pairs called 'items' is stored in an associator with N connection weights. The number of bits of information stored by the system is then determined to be about (N/2) log₂ M. The maximum number of items storable is found to be half the number of weights, so that the information capacity of the system is quantified as (N/2) log₂ N.

Classification capability is determined by allowing vectors not stored by the associator to appear at its input. Conditions necessary for the associator to make a correct response are derived from constraints on the information throughput of the associator, the amount of information that must be present in an input vector, and the number of vectors that can be classified by an associator of a given size with a given storage load.

Figures of merit are obtained that allow comparison of the capabilities of general memory/classifier systems. For an associator with a simple non-linearity on its output, the merit figures are evaluated and shown to be suboptimal. Constant attention is devoted to the relative parameter sizes required to obtain the derived performance characteristics. Large systems are shown to perform nearest the optimum performance limits, and suggestions are made concerning the system architecture needed for best results. Finally, avenues for extension of the theory to more general systems are indicated."