
DOI: 10.5555/3294771.3294972

Consistent robust regression

Published: 04 December 2017

Abstract

We present the first efficient and provably consistent estimator for the robust regression problem. The area of robust learning and optimization has generated a significant amount of interest in the learning and statistics communities in recent years, owing to its applicability in scenarios with corrupted data as well as in handling model mis-specification. In particular, special attention has been devoted to the fundamental problem of robust linear regression, where estimators that can tolerate corruption in up to a constant fraction of the response variables are widely studied. Surprisingly, however, to date we are not aware of a polynomial-time estimator that offers a consistent estimate in the presence of dense, unbounded corruptions. In this work we present such an estimator, called CRR. This solves an open problem put forward in the work of [3]. Our consistency analysis requires a novel two-stage proof technique involving a careful analysis of the stability of ordered lists, which may be of independent interest. We show that CRR not only offers consistent estimates, but is empirically far superior to several other recently proposed algorithms for the robust regression problem, including extended Lasso and the TORRENT algorithm. In particular, CRR offers comparable or better model recovery with runtimes that are faster by an order of magnitude.
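To make the hard-thresholding approach concrete, here is a minimal sketch of a CRR-style estimator under the standard corruption model y = Xw* + b* + ε, where b* is a k-sparse vector of unbounded corruptions. This is an illustrative reconstruction in an alternating-minimization form, not the authors' exact pseudocode; the function names, the corruption-budget parameter k, and the stopping rule are assumptions made here for the sketch.

```python
import numpy as np

def hard_threshold(r, k):
    """Keep the k largest-magnitude entries of r, zeroing the rest."""
    b = np.zeros_like(r)
    if k > 0:
        top = np.argsort(np.abs(r))[-k:]
        b[top] = r[top]
    return b

def crr_sketch(X, y, k, n_iters=100, tol=1e-8):
    """Illustrative alternating scheme for y = X w* + b* + noise,
    where b* is a k-sparse vector of unbounded corruptions.

    Alternates (1) a least-squares fit on corruption-corrected
    responses with (2) hard thresholding of the residuals to
    re-estimate the corruption vector.
    """
    y = np.asarray(y, dtype=float)
    b = np.zeros_like(y)          # current corruption estimate
    w = np.zeros(X.shape[1])      # current model estimate
    for _ in range(n_iters):
        # Fit the linear model after removing the estimated corruptions.
        w_new, *_ = np.linalg.lstsq(X, y - b, rcond=None)
        # Attribute the k largest residuals to corruption.
        b = hard_threshold(y - X @ w_new, k)
        if np.linalg.norm(w_new - w) < tol:
            w = w_new
            break
        w = w_new
    return w, b
```

For example, with roughly 10% of responses corrupted one might call crr_sketch(X, y, k=len(y) // 10); the support of the returned b flags the responses the sketch attributes to corruption, and w can be compared against an ordinary least-squares fit.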

References

[1]
J. Ámos Visek. The least trimmed squares. Part I: Consistency. Kybernetika, 42:1-36, 2006.
[2]
J. Ámos Visek. The least trimmed squares. Part II: √n-consistency. Kybernetika, 42:181-202, 2006.
[3]
K. Bhatia, P. Jain, and P. Kar. Robust Regression via Hard Thresholding. In Proceedings of the 29th Annual Conference on Neural Information Processing Systems (NIPS), 2015.
[4]
E. J. Candès, X. Li, and J. Wright. Robust Principal Component Analysis? Journal of the ACM, 58(1):1-37, 2009.
[5]
M. Charikar, J. Steinhardt, and G. Valiant. Learning from Untrusted Data. arXiv:1611.02315 [cs.LG], 2016.
[6]
Y. Chen, C. Caramanis, and S. Mannor. Robust Sparse Regression under Adversarial Corruption. In Proceedings of the 30th International Conference on Machine Learning (ICML), 2013.
[7]
Y. Chen and A. S. Dalalyan. Fused sparsity and robust estimation for linear models with unknown variance. In Proceedings of the 26th Annual Conference on Neural Information Processing Systems (NIPS), 2012.
[8]
Y. Cherapanamjeri, K. Gupta, and P. Jain. Nearly-optimal Robust Matrix Completion. arXiv:1606.07315 [cs.LG], 2016.
[9]
F. Cucker and S. Smale. On the Mathematical Foundations of Learning. Bulletin of the American Mathematical Society, 39(1):1-49, 2001.
[10]
M. A. Davenport, J. N. Laska, P. T. Boufounos, and R. G. Baraniuk. A Simple Proof that Random Matrices are Democratic. Technical Report TREE0906, Rice University, Department of Electrical and Computer Engineering, 2009.
[11]
J. Feng, H. Xu, S. Mannor, and S. Yan. Robust Logistic Regression and Classification. In Proceedings of the 28th Annual Conference on Neural Information Processing Systems (NIPS), 2014.
[12]
R. Garg and R. Khandekar. Gradient Descent with Sparsification: An Iterative Algorithm for Sparse Recovery with Restricted Isometry Property. In Proceedings of the 26th International Conference on Machine Learning (ICML), 2009.
[13]
P. Jain, A. Tewari, and P. Kar. On Iterative Hard Thresholding Methods for High-dimensional M-estimation. In Proceedings of the 28th Annual Conference on Neural Information Processing Systems (NIPS), 2014.
[14]
B. McWilliams, G. Krummenacher, M. Lucic, and J. M. Buhmann. Fast and Robust Least Squares Estimation in Corrupted Linear Models. In Proceedings of the 28th Annual Conference on Neural Information Processing Systems (NIPS), 2014.
[15]
N. M. Nasrabadi, T. D. Tran, and N. Nguyen. Robust Lasso with Missing and Grossly Corrupted Observations. In Advances in Neural Information Processing Systems, pages 1881-1889, 2011.
[16]
N. H. Nguyen and T. D. Tran. Exact recoverability from dense corrupted observations via ℓ1-minimization. IEEE Transactions on Information Theory, 59(4):2017-2035, 2013.
[17]
N. H. Nguyen and T. D. Tran. Robust Lasso With Missing and Grossly Corrupted Observations. IEEE Transactions on Information Theory, 59(4):2036-2058, 2013.
[18]
P. J. Rousseeuw. Least Median of Squares Regression. Journal of the American Statistical Association, 79(388):871-880, 1984.
[19]
P. J. Rousseeuw and A. M. Leroy. Robust Regression and Outlier Detection. John Wiley and Sons, 1987.
[20]
C. Studer, P. Kuppinger, G. Pope, and H. Bölcskei. Recovery of Sparsely Corrupted Signals. IEEE Transactions on Information Theory, 58(5):3115-3130, 2012.
[21]
J. Wright and Y. Ma. Dense Error Correction via ℓ1 Minimization. IEEE Transactions on Information Theory, 56(7):3540-3560, 2010.
[22]
J. Wright, A. Y. Yang, A. Ganesh, S. S. Sastry, and Y. Ma. Robust Face Recognition via Sparse Representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(2):210-227, 2009.
[23]
A. Y. Yang, Z. Zhou, A. G. Balasubramanian, S. S. Sastry, and Y. Ma. Fast ℓ1-minimization algorithms for robust face recognition. IEEE Transactions on Image Processing, 22(8):3234-3246, 2013.

Cited By

  • (2021) Robust and differentially private mean estimation. Proceedings of the 35th International Conference on Neural Information Processing Systems, 10.5555/3540261.3540558, pages 3887-3901. Online publication date: 6-Dec-2021.
  • (2019) Efficient algorithms and lower bounds for robust linear regression. Proceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms, 10.5555/3310435.3310605, pages 2745-2754. Online publication date: 6-Jan-2019.
  • (2019) Robust Gaussian Process Regression for Real-Time High Precision GPS Signal Enhancement. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 10.1145/3292500.3330695, pages 2838-2847. Online publication date: 25-Jul-2019.
  • (2018) Byzantine stochastic gradient descent. Proceedings of the 32nd International Conference on Neural Information Processing Systems, 10.5555/3327345.3327372, pages 4618-4628. Online publication date: 3-Dec-2018.


    Published In

    NIPS'17: Proceedings of the 31st International Conference on Neural Information Processing Systems
    December 2017
    7104 pages

    Publisher

    Curran Associates Inc., Red Hook, NY, United States



