DOI: 10.5555/2999134.2999275

Fused sparsity and robust estimation for linear models with unknown variance

Published: 03 December 2012

Abstract

In this paper, we develop a novel approach to the problem of learning sparse representations in the context of fused sparsity and an unknown noise level. We propose an algorithm, termed Scaled Fused Dantzig Selector (SFDS), that accomplishes this learning task by means of a second-order cone program. Special emphasis is placed on the particular instance of fused sparsity corresponding to learning in the presence of outliers. We establish finite sample risk bounds and carry out an experimental evaluation on both synthetic and real data.
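The abstract describes casting the estimator as a second-order cone program that handles fused sparsity and outliers without assuming the noise level is known. As a rough illustration of those ingredients only, not the paper's actual SFDS estimator, the sketch below solves a convex robust regression with a fused penalty on successive coefficient differences and an l1-penalized outlier vector, using an l2 (square-root-type) loss so the problem remains a conic program. All variable names, penalty levels, and the simulated data are hypothetical choices made for this example.

```python
# Illustrative sketch only: a fused-sparsity, outlier-robust regression
# written as a convex program and solved with a conic solver via cvxpy.
# This is NOT the authors' SFDS formulation; penalties and constants
# below are assumptions for demonstration.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[5:10] = 2.0                    # piecewise-constant signal (fused sparsity)
y = X @ beta_true + 0.5 * rng.standard_normal(n)
y[:5] += 10.0                            # a few gross outliers

beta = cp.Variable(p)                    # regression coefficients
u = cp.Variable(n)                       # per-observation outlier terms
lam_fuse, lam_out = 0.5, 1.0             # penalty levels (hypothetical)

# l2 loss keeps the problem a second-order cone program and, in the spirit
# of square-root/scaled methods, avoids plugging in a known noise level;
# the l1 penalty on successive differences encodes fused sparsity, and the
# l1 penalty on u lets a few coordinates absorb gross outliers.
objective = cp.Minimize(
    cp.norm(y - X @ beta - u, 2)
    + lam_fuse * cp.norm(cp.diff(beta), 1)
    + lam_out * cp.norm(u, 1)
)
prob = cp.Problem(objective)
prob.solve()                             # cvxpy dispatches to a conic solver

print("estimated beta:", np.round(beta.value, 2))
print("flagged outliers:", np.where(np.abs(u.value) > 1.0)[0])
```

On simulated data of this kind, the recovered coefficients are roughly piecewise constant and the observations with large fitted u are the injected outliers, which is the qualitative behavior the abstract attributes to fused-sparsity-based robust estimation.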


Cited By

  • Robust Gaussian Process Regression for Real-Time High Precision GPS Signal Enhancement. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pages 2838-2847, 2019. DOI: 10.1145/3292500.3330695
  • Non-convex Optimization for Machine Learning. Foundations and Trends in Machine Learning, 10(3-4):142-336, 2018. DOI: 10.1561/2200000058
  • Consistent Robust Regression. Proceedings of the 31st International Conference on Neural Information Processing Systems, pages 2107-2116, 2017. DOI: 10.5555/3294771.3294972

Published In

NIPS'12: Proceedings of the 25th International Conference on Neural Information Processing Systems - Volume 1
December 2012, 3328 pages

Publisher

Curran Associates Inc., Red Hook, NY, United States
