
Consistent Model Combination of Lasso via Regularization Path

  • Conference paper
  • First Online:
Pattern Recognition (CCPR 2016)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 662)


Abstract

It is well known that model combination can improve the prediction performance of regression models. In this paper, we investigate model combination of Lasso via the regularization path. We first define the prediction risk of the Lasso estimator and prove that the Lasso regularization path contains at least one prediction-consistent estimator. We then establish prediction consistency for convex combinations of Lasso estimators, which provides the mathematical justification for combining Lasso models along the regularization path. Exploiting the inherent piecewise linearity of the Lasso regularization path, we construct an initial candidate model set, select models for combination with the Occam's Window method, and finally combine the selected models using Bayesian model averaging. Theoretical analysis and experimental results support the feasibility of the proposed method.
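The pipeline sketched in the abstract (candidate models from the regularization path, Occam's Window selection, Bayesian model averaging) can be illustrated roughly as follows. This is a minimal sketch, not the paper's exact method: it uses scikit-learn's `lasso_path` for the path, a BIC-based surrogate for the model evidence, and an illustrative Occam's Window factor of 20; all of these specific choices are assumptions.

```python
# Illustrative sketch of path-based Lasso model combination.
# Assumptions (not from the paper): BIC as an evidence surrogate,
# Occam's Window factor C = 20, candidates taken at the path's grid points.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import lasso_path

X, y = make_regression(n_samples=200, n_features=30, noise=5.0, random_state=0)

# 1. Candidate model set: coefficient vectors along the regularization path.
alphas, coefs, _ = lasso_path(X, y)          # coefs has shape (n_features, n_alphas)

# 2. Score each candidate; the Lasso's degrees of freedom are estimated
#    by the size of the active set.
n = len(y)
rss = ((y[:, None] - X @ coefs) ** 2).sum(axis=0)
df = (np.abs(coefs) > 1e-10).sum(axis=0)
bic = n * np.log(rss / n) + df * np.log(n)

# 3. Occam's Window: keep models whose (approximate) evidence is within
#    a factor C of the best model's.
w = np.exp(-0.5 * (bic - bic.min()))         # proportional posterior weights
keep = w >= w.max() / 20.0

# 4. Bayesian model averaging: convex combination of the kept estimators.
w_keep = w[keep] / w[keep].sum()
beta_bma = coefs[:, keep] @ w_keep
pred = X @ beta_bma
```

The convex combination in step 4 is what the paper's consistency result licenses: if the path contains a prediction-consistent estimator and the weights are nonnegative and sum to one, the combined estimator inherits the consistency guarantee.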



Notes

  1. Datasets available: http://archive.ics.uci.edu/ml/datasets.html.

  2. http://CRAN.R-project.org/package=ipred.



Acknowledgement

This work was supported by the Natural Science Foundation of Heilongjiang Province (Nos. F2015020 and E2016008), the Beijing Postdoctoral Research Foundation (No. 2015ZZ-120), the Chaoyang District of Beijing Postdoctoral Foundation (No. 2014ZZ-14), the Natural Science Foundation of China (Nos. 51104030 and 61170019), and the Northeast Petroleum University Cultivation Foundation (No. XN2014102).

Author information

Corresponding author: Erlong Yang.


Copyright information

© 2016 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Wang, M., Sun, Y., Yang, E., Song, K. (2016). Consistent Model Combination of Lasso via Regularization Path. In: Tan, T., Li, X., Chen, X., Zhou, J., Yang, J., Cheng, H. (eds) Pattern Recognition. CCPR 2016. Communications in Computer and Information Science, vol 662. Springer, Singapore. https://doi.org/10.1007/978-981-10-3002-4_45


  • DOI: https://doi.org/10.1007/978-981-10-3002-4_45

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-3001-7

  • Online ISBN: 978-981-10-3002-4

  • eBook Packages: Computer Science (R0)
