Posted on 1993-01-01, 00:00. Authored by Oded Maron, Andrew W. Moore
Selecting a good model of a set of input points by cross validation is a computationally
intensive process, especially if the number of possible models or the number of training points is
high. Techniques such as gradient descent are helpful in searching through the space of models,
but problems such as local minima and, more importantly, the lack of a distance metric between
models limit the applicability of these search methods. Hoeffding Races is a technique
for finding a good model of the data by quickly discarding bad models and concentrating
the computational effort on differentiating among the better ones. This paper focuses on the
special case of leave-one-out cross validation applied to memory-based learning algorithms, but
we also argue that it is applicable to any class of model selection problems.
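The racing idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: models are represented as per-point loss functions (a simplifying assumption), each model keeps a running mean of its leave-one-out error, and the Hoeffding bound gives a confidence interval of half-width eps around that mean. A model is discarded as soon as its lower confidence bound exceeds the best model's upper bound, so computation concentrates on the remaining contenders.

```python
import math

def hoeffding_race(models, test_points, delta=0.05, loss_range=1.0):
    """Race candidate models over held-out points, dropping clear losers early.

    models      -- list of callables; models[i](point) returns the loss of
                   model i on one held-out point (assumed in [0, loss_range])
    test_points -- sequence of held-out points (e.g. leave-one-out folds)
    delta       -- per-bound failure probability for the Hoeffding interval
    """
    alive = list(models)
    total_loss = {m: 0.0 for m in alive}  # cumulative loss per survivor
    for n, point in enumerate(test_points, start=1):
        for m in alive:
            total_loss[m] += m(point)
        # Hoeffding bound: with probability >= 1 - delta, the true mean
        # error lies within eps of the empirical mean after n points.
        eps = loss_range * math.sqrt(math.log(2.0 / delta) / (2.0 * n))
        means = {m: total_loss[m] / n for m in alive}
        best_upper = min(means.values()) + eps
        # Discard any model whose lower bound is already worse than the
        # best model's upper bound -- it cannot be the winner.
        alive = [m for m in alive if means[m] - eps <= best_upper]
    # Return the surviving model with the lowest empirical error.
    return min(alive, key=lambda m: total_loss[m] / len(test_points))
```

With a loss range of 1 and delta = 0.05, a model whose error is far from the leader's is eliminated after only a handful of points, while closely matched models keep racing to the end.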