Carnegie Mellon University

Optimal rates for stochastic convex optimization under Tsybakov noise condition

Journal contribution, posted on 2013-06-01. Authored by Aaditya Ramdas and Aarti Singh.

We focus on the problem of minimizing a convex function f over a convex set S given T queries to a stochastic first-order oracle. We argue that the complexity of convex minimization is determined only by the rate of growth of the function around its minimum x∗_{f,S}, as quantified by a Tsybakov-like noise condition. Specifically, we prove that if f grows at least as fast as ∥x − x∗_{f,S}∥^κ around its minimum, for some κ > 1, then the optimal rate of learning f(x∗_{f,S}) is Θ(T^{−κ/(2κ−2)}). The classic rate Θ(1/√T) for convex functions and Θ(1/T) for strongly convex functions are special cases of our result for κ → ∞ and κ = 2, and even faster rates are attained for 1 < κ < 2. We also derive tight bounds for the complexity of learning x∗_{f,S}, where the optimal rate is Θ(T^{−1/(2κ−2)}). Interestingly, these precise rates also characterize the complexity of active learning, and our results further strengthen the connections between the fields of active learning and convex optimization, both of which rely on feedback-driven queries.
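
As a quick worked check (not part of the original abstract, and assuming only the rate Θ(T^{−κ/(2κ−2)}) stated above), the special cases follow by plugging in κ:

\[ \kappa = 2:\quad \frac{\kappa}{2\kappa - 2} = \frac{2}{2} = 1 \;\Rightarrow\; \Theta(T^{-1}) = \Theta(1/T) \quad \text{(strongly convex case)} \]
\[ \kappa \to \infty:\quad \frac{\kappa}{2\kappa - 2} \to \frac{1}{2} \;\Rightarrow\; \Theta(T^{-1/2}) = \Theta(1/\sqrt{T}) \quad \text{(general convex case)} \]
\[ 1 < \kappa < 2:\quad \frac{\kappa}{2\kappa - 2} > 1 \;\Rightarrow\; \text{rates faster than } 1/T. \]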

History

Publisher Statement

Copyright 2013 by the author(s).

Date

2013-06-01
