Posted on 2005-03-06. Authored by Kevin T. Kelly.
A finite data set is consistent with infinitely many alternative theories. Scientific realists
recommend that we prefer the simplest one. Anti-realists ask how a fixed simplicity bias
could track the truth when the truth might be complex. It is no solution to impose a prior
probability distribution biased toward simplicity, for such a distribution merely embodies
the bias at issue without explaining its efficacy. In this note, I argue, on the basis of
computational learning theory, that a fixed simplicity bias is necessary if inquiry is to
converge to the right answer efficiently, whatever the right answer might be. Efficiency is
understood in the sense of minimizing the least fixed bound on retractions or errors prior
to convergence.
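
To make the retraction criterion concrete, here is a minimal Python sketch of a hypothetical "count the effects" scenario (the scenario, learner names, and parameters are illustrative assumptions, not drawn from the note itself): the true theory is the total number of effects that will ever appear in the data stream, an Ockham learner conjectures only the effects seen so far, and an Ockham violator gambles on one extra effect before retreating.

```python
# Toy illustration (hypothetical, not from the note): counting retractions
# prior to convergence in a "count the effects" problem, where the truth is
# the total number of effects that will ever appear in the data stream.

def ockham(seen, stage):
    # Simplest theory consistent with the data: no effects beyond those seen.
    return seen

def bold(seen, stage, deadline=10):
    # Ockham violator: gamble on one extra effect, retreat after a deadline.
    return seen + 1 if stage < deadline else seen

def retractions(learner, effect_times, horizon):
    # Count how often the learner's conjecture changes over the presentation.
    count, previous = 0, None
    for t in range(horizon):
        seen = sum(1 for e in effect_times if e <= t)
        conjecture = learner(seen, t)
        if previous is not None and conjecture != previous:
            count += 1
        previous = conjecture
    return count

if __name__ == "__main__":
    # Truth in this run: two effects, appearing at stages 3 and 7.
    for name, learner in [("ockham", ockham), ("bold", bold)]:
        print(name, retractions(learner, effect_times=[3, 7], horizon=20))
    # ockham retracts once per newly observed effect (2 total); bold incurs an
    # extra retraction when its gamble on an unseen effect is not borne out (3).
```

In this toy run the simplicity-biased learner changes its mind only when the data force it to, while the learner that leaps ahead of the data pays an additional retraction whenever its gamble fails, which is the kind of worst-case efficiency comparison the abstract invokes.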