Adam R. Klivans, Ryan O'Donnell, Rocco Servedio
We study the learnability of sets in R^n under the Gaussian distribution, taking Gaussian surface area
as the “complexity measure” of the sets being learned. Let C_S denote the class of all (measurable) sets
with surface area at most S. We first show that the class C_S is learnable to any constant accuracy in
time n^{O(S^2)}, even in the arbitrary-noise (“agnostic”) model. Complementing this, we also show that any
learning algorithm for C_S information-theoretically requires 2^{Ω(S^2)} examples for learning to constant
accuracy. Together, these results show that Gaussian surface area essentially characterizes the
computational complexity of learning under the Gaussian distribution.
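For context, the following is the standard definition of Gaussian surface area; it is background not spelled out in the abstract itself. For a Borel set A ⊆ R^n,

\[
  \Gamma(A) \;=\; \liminf_{\delta \to 0^{+}} \frac{\Pr_{x \sim N(0, I_n)}\!\bigl[\, x \in A_{\delta} \setminus A \,\bigr]}{\delta},
  \qquad A_{\delta} = \{\, x \in \mathbb{R}^{n} : \operatorname{dist}(x, A) \le \delta \,\}.
\]

For sets with sufficiently smooth boundary this coincides with \(\int_{\partial A} \varphi_n(x)\, d\sigma(x)\), the standard Gaussian density integrated over the boundary \(\partial A\). As a well-known example, a halfspace \(\{x : w \cdot x \le \theta\}\) with \(\|w\|_2 = 1\) has Gaussian surface area \(\varphi(\theta) \le 1/\sqrt{2\pi} \approx 0.40\), so by the upper bound stated above halfspaces are agnostically learnable in poly(n) time under the Gaussian distribution.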