This thesis studies spectral methods on neighborhood graphs. These methods operate on point-cloud data.
They form a neighborhood graph over this data, use the eigenvectors and eigenvalues of a graph Laplacian
matrix to map each point to a set of data-dependent features, and then run a simple algorithm that uses these features to perform a downstream task. Such methods are general: they can be used to solve many different learning problems. They are also flexible, in that the feature mapping adapts to the structure of the data. However, this generality and flexibility also make it challenging to understand the theoretical properties of spectral methods.
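To make the pipeline concrete, the following is a minimal sketch with illustrative choices: a symmetrized k-nearest-neighbor graph with unit edge weights and the unnormalized Laplacian L = D - W. The thesis considers several graph and Laplacian variants; the function name and interface here are ours.

```python
import numpy as np

def laplacian_features(X, k=10, d=2):
    """Map each point in X to d spectral features: build a symmetrized
    k-nearest-neighbor graph, form the unnormalized Laplacian L = D - W,
    and return the eigenvectors of L after the trivial constant one."""
    n = X.shape[0]
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D2[i])[1:k + 1]  # k nearest neighbors, skipping the point itself
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)                 # symmetrize the adjacency matrix
    L = np.diag(W.sum(axis=1)) - W         # unnormalized graph Laplacian
    _, vecs = np.linalg.eigh(L)            # eigenvalues in ascending order
    return vecs[:, 1:d + 1]                # drop the constant eigenvector
```

A downstream algorithm then operates on these features; for instance, running k-means on the rows of the returned matrix yields classical spectral clustering.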
To understand these theoretical properties, we adopt the classical perspective of nonparametric statistics.
We provide precise guarantees which establish that spectral methods on neighborhood graphs can effectively solve various statistical problems. Chapter 2 analyzes a local spectral clustering method, PPR clustering, and shows that it approximately recovers well-conditioned density clusters. Chapters 3 and 4 consider two methods for regression, Laplacian smoothing and Laplacian eigenmaps, and show that each is minimax optimal under various data models. In Chapter 5, we discuss several other properties of these methods that differentiate them from more classical nonparametric approaches.
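As an illustration of the first of these regression methods, Laplacian smoothing estimates a regression function by minimizing ||f - y||^2 + lam * f^T L f over vectors f on the graph, which has the closed form f = (I + lam * L)^{-1} y. A minimal sketch, assuming a dense weighted adjacency matrix W (the function name and interface are ours, not the thesis code):

```python
import numpy as np

def laplacian_smoothing(W, y, lam=1.0):
    """Laplacian smoothing on a graph with weighted adjacency matrix W:
    the minimizer of ||f - y||^2 + lam * f^T L f, where L = D - W,
    given in closed form by f = (I + lam * L)^{-1} y."""
    L = np.diag(W.sum(axis=1)) - W          # unnormalized graph Laplacian
    return np.linalg.solve(np.eye(len(y)) + lam * L, y)
```

Setting lam = 0 returns the observations y unchanged, while larger lam shrinks the estimate toward a signal that is constant on each connected component of the graph.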