Carnegie Mellon University

Nonparametric Methods with Total Variation Type Regularization

Thesis, posted on 2019-07-01, 20:05, authored by Veeranjaneyulu Sadhanala
We consider two special cases of the classical nonparametric regression problem of estimating a function f : R^d → R given n noisy observations at inputs x_1, ..., x_n ∈ R^d. In the first case, the input points correspond to the nodes of a d-dimensional regular lattice (with equal side lengths n^{1/d}). We define two new higher-order total variation (TV) classes, whose signals are allowed to exhibit different amounts of smoothness in different regions of the lattice, and derive lower bounds on their minimax estimation errors. We analyze two naturally associated estimators and show that they are rate optimal over the appropriate classes. Linear smoothers, in contrast, are shown to be minimax suboptimal over these TV classes.
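
To make the first-case setting concrete, here is a minimal sketch of first-order (k = 0) TV denoising on a 2D grid, i.e. the fused lasso over the lattice graph, one example of the kind of TV-regularized estimator considered here. It is written with cvxpy for clarity; the helper name tv_denoise_2d, the default lam, and the use of a generic convex solver are illustrative assumptions, not the thesis implementation.

```python
import numpy as np
import cvxpy as cp

def tv_denoise_2d(Y, lam=1.0):
    """First-order TV denoising (fused lasso) on a 2D lattice (illustrative sketch).

    Solves: minimize_Theta  0.5 * ||Y - Theta||_F^2
                            + lam * sum over lattice edges |Theta_i - Theta_j|.
    """
    Theta = cp.Variable(Y.shape)
    tv = (cp.sum(cp.abs(Theta[1:, :] - Theta[:-1, :]))      # vertical edges
          + cp.sum(cp.abs(Theta[:, 1:] - Theta[:, :-1])))   # horizontal edges
    objective = 0.5 * cp.sum_squares(Y - Theta) + lam * tv
    cp.Problem(cp.Minimize(objective)).solve()
    return Theta.value

# Usage: larger lam gives a coarser piecewise-constant fit.
# Y = np.random.randn(20, 20); Theta_hat = tv_denoise_2d(Y, lam=2.0)
```
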
In the second case, we consider additive models built from kth order trend filters, whose components are kth degree piecewise polynomials. We derive fast error rates for additive trend filtering and prove that these rates are minimax optimal when the underlying function is itself additive and has component functions whose derivatives have bounded kth order TV. We also show that such rates are unattainable by additive models built from linear smoothers.
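
The univariate building block here is kth order trend filtering, which penalizes the l1 norm of (k+1)-st order discrete differences and thereby produces kth degree piecewise polynomial fits. Below is a minimal sketch for equally spaced inputs, again via cvxpy; the helper name trend_filter and the default lam are hypothetical, and the additive estimator would apply a component fit of this kind to each input coordinate (e.g. via backfitting), which is not shown.

```python
import numpy as np
import cvxpy as cp

def trend_filter(y, k=1, lam=10.0):
    """kth order trend filtering on equally spaced inputs (illustrative sketch).

    Solves: minimize_theta  0.5 * ||y - theta||_2^2 + lam * ||D^(k+1) theta||_1,
    where D^(k+1) is the (k+1)-st order discrete difference operator.
    k = 0 recovers 1D TV denoising; k = 1 gives piecewise linear fits.
    """
    n = len(y)
    D = np.eye(n)
    for _ in range(k + 1):          # build D^(k+1) by repeated differencing
        D = np.diff(D, axis=0)
    theta = cp.Variable(n)
    objective = 0.5 * cp.sum_squares(y - theta) + lam * cp.norm1(D @ theta)
    cp.Problem(cp.Minimize(objective)).solve()
    return theta.value
```
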
We also study an extension of the Kolmogorov-Smirnov (KS) two-sample test that can be more sensitive to differences in the tails. Our test statistic is an integral probability metric (IPM) defined over a higher-order total variation ball, and it recovers the original KS test as its simplest case.
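
In that simplest case the IPM reduces to the classical two-sample KS statistic, the supremum over t of |F_m(t) - G_n(t)| for the two empirical CDFs, which can be computed directly from the pooled sample. The short sketch below shows only this base case (the helper name ks_statistic is illustrative); the higher-order variants studied in the thesis are not reproduced here.

```python
import numpy as np

def ks_statistic(x, y):
    """Classical two-sample KS statistic: sup_t |F_m(t) - G_n(t)| (illustrative sketch)."""
    grid = np.sort(np.concatenate([x, y]))                          # pooled sample points
    F = np.searchsorted(np.sort(x), grid, side="right") / len(x)    # ECDF of x on the grid
    G = np.searchsorted(np.sort(y), grid, side="right") / len(y)    # ECDF of y on the grid
    return np.abs(F - G).max()
```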

History

Date

2019-06-12

Degree Type

  • Dissertation

Department

  • Machine Learning

Degree Name

  • Doctor of Philosophy (PhD)

Advisor(s)

Ryan Tibshirani