Carnegie Mellon University

Chaitin-Kolmogorov Complexity and Generalization in Neural Networks

conference contribution
posted on 2023-02-03, 21:20, authored by Ronald Rosenfeld, Barak Pearlmutter

We present a unified framework for a number of different ways of failing to generalize properly. During learning, sources of random information contaminate the network, effectively augmenting the training data with random information. The complexity of the function computed is therefore increased, and generalization is degraded. We analyze replicated networks, in which a number of identical networks are independently trained on the same data and their results averaged. We conclude that replication almost always results in a decrease in the expected complexity of the network, and that replication therefore increases expected generalization. Simulations confirming the effect are also presented.
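The replication scheme described in the abstract can be illustrated with a short sketch: train R networks that are identical except for their random initialization, each independently on the same data, and average their outputs at query time. The sketch below is a minimal illustration under assumed details (a toy regression task, a small tanh network, full-batch gradient descent); it is not the authors' original simulation setup.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression task: noisy samples of a smooth target function.
    # (Illustrative assumption, not the paper's experimental data.)
    X = rng.uniform(-1.0, 1.0, size=(30, 1))
    y = np.sin(3.0 * X) + 0.1 * rng.standard_normal(X.shape)

    def train_mlp(X, y, hidden=16, lr=0.05, epochs=2000, seed=0):
        """Train one small tanh MLP by full-batch gradient descent."""
        r = np.random.default_rng(seed)
        W1 = r.standard_normal((X.shape[1], hidden)) * 0.5
        b1 = np.zeros(hidden)
        W2 = r.standard_normal((hidden, 1)) * 0.5
        b2 = np.zeros(1)
        for _ in range(epochs):
            h = np.tanh(X @ W1 + b1)          # forward pass
            pred = h @ W2 + b2
            err = pred - y                    # gradient of 0.5 * squared error
            # Backpropagate through both layers.
            gW2 = h.T @ err / len(X)
            gb2 = err.mean(axis=0)
            dh = (err @ W2.T) * (1 - h ** 2)  # tanh derivative
            gW1 = X.T @ dh / len(X)
            gb1 = dh.mean(axis=0)
            W1 -= lr * gW1; b1 -= lr * gb1
            W2 -= lr * gW2; b2 -= lr * gb2
        def predict(Xq):
            return np.tanh(Xq @ W1 + b1) @ W2 + b2
        return predict

    # Replication: R identical networks, differing only in their random
    # initialization, trained independently on the same data.
    R = 10
    nets = [train_mlp(X, y, seed=s) for s in range(R)]

    X_test = np.linspace(-1, 1, 200).reshape(-1, 1)
    y_true = np.sin(3.0 * X_test)

    single = nets[0](X_test)
    averaged = np.mean([net(X_test) for net in nets], axis=0)

    print("single-net test MSE :", float(np.mean((single - y_true) ** 2)))
    print("replicated test MSE :", float(np.mean((averaged - y_true) ** 2)))

On runs of this kind the averaged predictor tends to track the target more closely than a single replica, since averaging smooths out the initialization-dependent component of each network's error, consistent with the abstract's claim that replication decreases expected complexity and improves expected generalization.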

History

Publisher Statement

In Lippmann, R. P., Moody, J. E., and Touretzky, D. S., editors, Advances in Neural Information Processing Systems 3, pages 925-931.

Date

1990-10-01
