Noise-adaptive Margin-based Active Learning and Lower Bounds under Tsybakov Noise Condition
We present a polynomial-time, noise-robust, margin-based active learning algorithm for finding homogeneous (passing through the origin) linear separators, and analyze its statistical rate of error convergence when labels are corrupted by noise. We show that when the noise satisfies the Tsybakov low noise condition [MT+99, Tsy04], the algorithm adapts to the unknown noise level and achieves the optimal statistical rate up to polylogarithmic factors. In addition, the algorithm is simple and requires no prior knowledge of the amount of noise in the label distribution. We also derive lower bounds for margin-based active learning algorithms under the Tsybakov noise condition (TNC) in the membership query synthesis scenario [Ang88]. Our result implies lower bounds for the stream-based selective sampling scenario [Coh90] under TNC for some fairly simple data distributions. Quite surprisingly, we show that the sample complexity cannot be improved even if the underlying data distribution is as simple as the uniform distribution on the unit ball. Our proof involves the construction of a well-separated hypothesis set on the d-dimensional unit ball, together with carefully designed label distributions satisfying the Tsybakov noise condition. Our analysis may provide insights for other forms of lower bounds as well.
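For concreteness, one standard formulation of the Tsybakov low noise condition is sketched below. Conventions for the noise exponent vary across the literature, so this parameterization is illustrative and may differ from the one used in the paper.

```latex
% One standard formulation of the Tsybakov low noise condition:
% the regression function \eta(x) = \Pr(Y = +1 \mid X = x) places
% little probability mass near the decision boundary \eta(x) = 1/2.
% For some constant c > 0 and noise exponent \alpha > 0:
\[
  \Pr_{X}\bigl( \lvert \eta(X) - \tfrac{1}{2} \rvert \le t \bigr)
  \;\le\; c \, t^{\alpha}
  \qquad \text{for all } t \in \bigl(0, \tfrac{1}{2}\bigr].
\]
% Larger \alpha means less label noise near the boundary; in the limit
% this recovers bounded (Massart-type) noise behavior.
```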
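To make the margin-based template concrete, here is a minimal, hypothetical sketch of the generic margin-based active learning loop for a homogeneous linear separator: maintain a unit-norm weight vector, query labels only for points inside a shrinking band around the current boundary, and refit. Every detail here (the `query_label` oracle, the batch size, the halving schedule, the one-step averaging update) is an illustrative placeholder, not the paper's algorithm, which in addition adapts its parameters to the unknown noise level.

```python
import numpy as np

def margin_based_active_learning(pool, query_label, n_rounds=8, batch=200, seed=0):
    """Sketch of margin-based active learning for a homogeneous
    (through-the-origin) linear separator.

    pool        : (n, d) array of unlabeled points on/near the unit ball
    query_label : oracle mapping a point x to a (possibly noisy) label in {-1, +1}
    """
    rng = np.random.default_rng(seed)
    d = pool.shape[1]
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)          # initial hypothesis: random unit vector
    margin = 1.0                    # initial width of the query band

    for _ in range(n_rounds):
        # Only points close to the current decision boundary are informative,
        # so restrict label queries to the margin band |<w, x>| <= margin.
        dist = np.abs(pool @ w)
        band = pool[dist <= margin]
        if len(band) == 0:
            break
        idx = rng.choice(len(band), size=min(batch, len(band)), replace=False)
        X = band[idx]
        y = np.array([query_label(x) for x in X])

        # Refit on the queried sample. This simple label-weighted average
        # stands in for the constrained ERM / hinge-loss minimization a
        # real algorithm would perform.
        w_new = (y[:, None] * X).mean(axis=0)
        if np.linalg.norm(w_new) > 0:
            w = w_new / np.linalg.norm(w_new)

        margin /= 2.0               # hypothetical geometric shrinkage schedule
    return w
```

The geometric halving of the band is the signature of this algorithm family: each round concentrates the label budget where the current hypothesis is least certain, which is what drives the improvement over passive sampling.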