10.1184/R1/6604295.v1
Shuheng Zhou, John Lafferty, Larry Wasserman
Compressed Regression
Carnegie Mellon University
1998
computer sciences
1998-01-01 00:00:00
Journal contribution
https://kilthub.cmu.edu/articles/journal_contribution/Compressed_Regression/6604295

Recent research has studied the role of sparsity in high dimensional regression and signal reconstruction, establishing theoretical limits for recovering sparse models from sparse data. In this paper we study a variant of this problem where the original n input variables are compressed by a random linear transformation to m ≪ n examples in p dimensions, and establish conditions under which a sparse linear model can be successfully recovered from the compressed data. A primary motivation for this compression procedure is to anonymize the data and preserve privacy by revealing little information about the original data. We characterize the number of random projections that are required for ℓ1-regularized compressed regression to identify the nonzero coefficients in the true model with probability approaching one, a property called “sparsistence.” In addition, we show that ℓ1-regularized compressed regression asymptotically predicts as well as an oracle linear model, a property called “persistence.” Finally, we characterize the privacy properties of the compression procedure in information-theoretic terms, establishing upper bounds on the rate of information communicated between the compressed and uncompressed data that decay to zero.
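
As an illustration of the procedure described in the abstract, the sketch below compresses a regression problem with a Gaussian random projection and then fits an ℓ1-regularized (lasso) model on the compressed data. It is a minimal demonstration, not the paper's construction or experiments: the dimensions n, p, m, the noise level, the choice of a Gaussian projection matrix, and the regularization strength alpha are assumptions chosen only for illustration, using numpy and scikit-learn.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Sparse linear model: n observations, p variables, only a few nonzero coefficients.
# (Sizes are illustrative assumptions, with m < n compressed observations.)
n, p, m = 200, 500, 80
beta = np.zeros(p)
beta[:5] = [2.0, -1.5, 1.0, 3.0, -2.0]

X = rng.standard_normal((n, p))
y = X @ beta + 0.5 * rng.standard_normal(n)

# Compression: a random linear map Phi (m x n) is applied to both X and y,
# so only the compressed pair (Phi X, Phi y) is released, not the raw data.
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
X_c, y_c = Phi @ X, Phi @ y

# ℓ1-regularized (lasso) regression fit on the compressed data.
lasso = Lasso(alpha=0.1)
lasso.fit(X_c, y_c)

recovered = np.flatnonzero(lasso.coef_)
print("nonzero coefficients recovered at indices:", recovered)

In this toy setting, checking whether the recovered index set matches the true support {0, 1, 2, 3, 4} mirrors the "sparsistence" question studied in the paper, while prediction error on fresh data corresponds to the "persistence" property.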