%0 Journal Article
%A Zhou, Shuheng
%A Lafferty, John
%A Wasserman, Larry
%D 2007
%T Compressed Regression
%U https://kilthub.cmu.edu/articles/journal_contribution/Compressed_Regression/6604295
%R 10.1184/R1/6604295.v1
%2 https://kilthub.cmu.edu/ndownloader/files/12094697
%K computer sciences
%X Recent research has studied the role of sparsity in high dimensional regression and signal reconstruction, establishing theoretical limits for recovering sparse models from sparse data. In this paper we study a variant of this problem where the original n input variables are compressed by a random linear transformation to m ≪ n examples in p dimensions, and establish conditions under which a sparse linear model can be successfully recovered from the compressed data. A primary motivation for this compression procedure is to anonymize the data and preserve privacy by revealing little information about the original data. We characterize the number of random projections required for ℓ1-regularized compressed regression to identify the nonzero coefficients in the true model with probability approaching one, a property called “sparsistence.” In addition, we show that ℓ1-regularized compressed regression asymptotically predicts as well as an oracle linear model, a property called “persistence.” Finally, we characterize the privacy properties of the compression procedure in information-theoretic terms, establishing upper bounds on the rate of information communicated between the compressed and uncompressed data that decay to zero.
%I Carnegie Mellon University
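
As a rough illustration of the procedure summarized in the abstract, the sketch below compresses a design matrix with a random Gaussian projection and then fits an ℓ1-regularized (Lasso) model on the compressed data. It is a minimal sketch only: the dimensions, noise level, regularization strength, and use of NumPy/scikit-learn are illustrative assumptions, not the paper's actual experimental setup.

    # Minimal sketch of compressed regression (illustrative, not the paper's code):
    # compress the n x p design matrix X and response y with a random Gaussian
    # projection Phi (m x n, m << n), then fit an l1-regularized (Lasso) model
    # on the compressed data. All constants below are assumed for illustration.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p, m, s = 500, 200, 100, 5      # samples, features, compressed rows, sparsity

    # Sparse ground-truth linear model
    beta = np.zeros(p)
    beta[:s] = rng.normal(size=s)
    X = rng.normal(size=(n, p))
    y = X @ beta + 0.1 * rng.normal(size=n)

    # Random linear compression: each compressed row mixes many original examples
    Phi = rng.normal(scale=1.0 / np.sqrt(m), size=(m, n))
    X_c, y_c = Phi @ X, Phi @ y

    # l1-regularized regression on the compressed data
    lasso = Lasso(alpha=0.05).fit(X_c, y_c)
    print("true support:     ", np.flatnonzero(beta))
    print("estimated support:", np.flatnonzero(lasso.coef_))

In this toy setup the compressed data never expose the original rows of X directly; sparsistence corresponds to the estimated support matching the true support as n, p, and m grow under suitable conditions.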