Carnegie Mellon University

Microchoice Bounds and Self Bounding Learning Algorithms

Journal contribution authored by John Langford and Avrim Blum
A major topic in machine learning is determining good upper bounds on the true error rates of learned hypotheses based on their empirical performance on training data. In this paper, we demonstrate new adaptive bounds designed for learning algorithms that operate by making a sequence of choices. These bounds, which we call Microchoice bounds, are similar to Occam-style bounds and can be used to make learning algorithms self-bounding in the style of Freund (1998). We then show how to combine these bounds with Freund's query-tree approach, producing a version of Freund's query-tree structure that can be implemented far more efficiently.
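For context, the Microchoice idea can be sketched with a standard Hoeffding/Occam-style bound; the exact statement and constants are the ones given in the paper, so the formula below, and the symbols m, d, delta, and C_i used in it, are only an illustrative reading of the abstract. If the learning algorithm reaches its output hypothesis h by making a sequence of d choices, where the i-th choice is drawn from a (possibly history-dependent) set C_i, then allocating the confidence parameter delta across the tree of possible choice sequences and applying the union bound only along the path actually taken suggests a bound of roughly the form

\[
  \operatorname{err}(h) \;\le\; \widehat{\operatorname{err}}(h)
  \;+\; \sqrt{\frac{\sum_{i=1}^{d} \ln\lvert C_i \rvert \;+\; \ln\tfrac{1}{\delta}}{2m}},
\]

where m is the number of training examples and \(\widehat{\operatorname{err}}(h)\) is the empirical error of h. The sum of log choice-set sizes plays the role that description length plays in an Occam bound, and since it depends only on quantities the algorithm observes during its own run, the algorithm could compute the bound as it goes, in the self-bounding spirit described above.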

History

Publisher Statement

All Rights Reserved

