Abstract
It is well known that model combination can improve the prediction performance of regression models. In this paper, we investigate model combination of Lasso along the regularization path. We first define the prediction risk of the Lasso estimator and prove that the Lasso regularization path contains at least one prediction-consistent estimator. We then establish prediction consistency for convex combinations of Lasso estimators, which provides the mathematical justification for combining Lasso models on the regularization path. Exploiting the inherent piecewise linearity of the Lasso regularization path, we construct an initial candidate model set, then select the models for combination with the Occam's Window method. Finally, we combine the selected models using Bayesian model averaging. Theoretical analysis and experimental results demonstrate the feasibility of the proposed method.
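The three-step pipeline in the abstract (candidate models from the Lasso path, Occam's Window screening, Bayesian model averaging) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a lambda grid in place of the exact breakpoints of the piecewise-linear LARS path, a plain coordinate-descent Lasso solver, BIC as the Bayesian model-evidence approximation, and an assumed Occam's Window width of 6.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 8
X = rng.normal(size=(n, p))
true_beta = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
y = X @ true_beta + 0.5 * rng.normal(size=n)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for (1/2n)||y - X b||^2 + lam ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]      # partial residual
            z = X[:, j] @ r
            beta[j] = np.sign(z) * max(abs(z) - lam * n, 0.0) / col_sq[j]
    return beta

# Step 1: initial candidate set = Lasso estimators along the path
# (a lambda grid here; the paper uses the piecewise-linear path's breakpoints).
lams = np.geomspace(1.0, 1e-3, 20)
betas = np.array([lasso_cd(X, y, lam) for lam in lams])    # (20, p)

# Step 2: Occam's Window -- keep models whose BIC lies within a cutoff of the best.
def bic(beta):
    resid = y - X @ beta
    df = np.count_nonzero(beta)          # Lasso degrees of freedom (active set size)
    return n * np.log(resid @ resid / n) + df * np.log(n)

bics = np.array([bic(b) for b in betas])
keep = bics <= bics.min() + 6.0          # window width 6 is an assumption

# Step 3: Bayesian model averaging with BIC-approximated posterior weights,
# i.e. a convex combination of the surviving Lasso estimators.
w = np.exp(-0.5 * (bics[keep] - bics[keep].min()))
w /= w.sum()
beta_bma = betas[keep].T @ w
print(np.round(beta_bma, 2))
```

Because the weights are nonnegative and sum to one, `beta_bma` is exactly the kind of convex combination of path estimators whose prediction consistency the paper establishes.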
Notes
- 1. Datasets available: http://archive.ics.uci.edu/ml/datasets.html.
Acknowledgement
Project supported by the Natural Science Foundation of Heilongjiang Province (No. F2015020, E2016008), the Beijing Postdoctoral Research Foundation (No. 2015ZZ-120), Chaoyang District of Beijing Postdoctoral Foundation (No. 2014ZZ-14), the Natural Science Foundation of China (No. 51104030, 61170019), the Northeast Petroleum University Cultivation Foundation (No. XN2014102).
Copyright information
© 2016 Springer Nature Singapore Pte Ltd.
Cite this paper
Wang, M., Sun, Y., Yang, E., Song, K. (2016). Consistent Model Combination of Lasso via Regularization Path. In: Tan, T., Li, X., Chen, X., Zhou, J., Yang, J., Cheng, H. (eds) Pattern Recognition. CCPR 2016. Communications in Computer and Information Science, vol 662. Springer, Singapore. https://doi.org/10.1007/978-981-10-3002-4_45
DOI: https://doi.org/10.1007/978-981-10-3002-4_45
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-3001-7
Online ISBN: 978-981-10-3002-4