Research article · DOI: 10.1145/1553374.1553510
More generality in efficient multiple kernel learning

Published: 14 June 2009

Abstract

Recent advances in Multiple Kernel Learning (MKL) have positioned it as an attractive tool for tackling many supervised learning tasks. The development of efficient gradient-descent-based optimization schemes has made it possible to tackle large-scale problems, and MKL-based algorithms have achieved very good results on challenging real-world applications. Yet, despite these successes, MKL approaches are limited in that they focus on learning a linear combination of given base kernels.
In this paper, we observe that existing MKL formulations can be extended to learn general kernel combinations subject to general regularization, while retaining all the efficiency of existing large-scale optimization algorithms. To highlight the advantages of generalized kernel learning, we tackle feature selection problems on benchmark vision and UCI databases. We demonstrate that the proposed formulation leads to better results not only than traditional MKL but also than state-of-the-art wrapper and filter methods for feature selection.
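The generalized formulation described in the abstract alternates between solving a standard SVM for a fixed kernel and taking a gradient step in the kernel parameters, with the gradient of the outer objective available via Danskin's theorem. As a rough, hypothetical sketch (not the paper's actual algorithm or code; the toy dataset, step size, and regularization weight are all illustrative choices), projected gradient descent over a product of per-feature Gaussian kernels with an l1 penalty, using scikit-learn's precomputed-kernel SVM, might look like:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

# Toy data: a few informative features among noise features.
X, y = make_classification(n_samples=120, n_features=6, n_informative=2,
                           n_redundant=0, random_state=0)

def per_feature_sq_dists(X):
    # D[m][i, j] = (X[i, m] - X[j, m])**2: one distance matrix per feature.
    return [np.subtract.outer(X[:, m], X[:, m]) ** 2 for m in range(X.shape[1])]

D = per_feature_sq_dists(X)

def product_kernel(d, D):
    # K = exp(-sum_m d_m D_m): a product of per-feature Gaussian kernels,
    # a non-linear combination outside the scope of traditional MKL.
    return np.exp(-sum(dm * Dm for dm, Dm in zip(d, D)))

d = np.ones(X.shape[1])      # one kernel parameter per feature
step, lam = 0.05, 0.5        # gradient step size and l1 penalty (illustrative)

for _ in range(20):
    K = product_kernel(d, D)
    svm = SVC(C=10.0, kernel="precomputed").fit(K, y)
    sv = svm.support_
    coef = svm.dual_coef_[0]  # y_i * alpha_i for the support vectors
    # Danskin: dT/dd_m = -1/2 a^T (dK/dd_m) a + lam, with dK/dd_m = -D_m * K,
    # so the data term becomes +1/2 coef^T (D_m * K) coef on the support set.
    grad = np.array([0.5 * coef @ (Dm[np.ix_(sv, sv)] * K[np.ix_(sv, sv)]) @ coef
                     + lam for Dm in D])
    d = np.maximum(d - step * grad, 0.0)  # projected step keeps d >= 0
```

With an l1 penalty, parameters of uninformative features tend to be driven to zero, which is what makes this kind of generalized combination usable for feature selection.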





Published In

ICML '09: Proceedings of the 26th Annual International Conference on Machine Learning
June 2009
1331 pages
ISBN:9781605585161
DOI:10.1145/1553374

Sponsors

  • NSF
  • Microsoft Research
  • MITACS

Publisher

Association for Computing Machinery

New York, NY, United States


Qualifiers

  • Research-article

Conference

ICML '09

Acceptance Rates

Overall acceptance rate: 140 of 548 submissions (26%)


Article Metrics

  • Downloads (last 12 months): 13
  • Downloads (last 6 weeks): 0
Reflects downloads up to 19 Nov 2024

Cited By
  • (2024) A maximal accuracy and minimal difference criterion for multiple kernel learning. Expert Systems with Applications, 254:124378. DOI: 10.1016/j.eswa.2024.124378. Nov 2024.
  • (2024) Multi-omics fusion with soft labeling for enhanced prediction of distant metastasis in nasopharyngeal carcinoma patients after radiotherapy. Computers in Biology and Medicine, 168:107684. DOI: 10.1016/j.compbiomed.2023.107684. Jan 2024.
  • (2024) Neural Generalization of Multiple Kernel Learning. Neural Processing Letters, 56(1). DOI: 10.1007/s11063-024-11516-0. 6 Feb 2024.
  • (2023) Comparative survey of multigraph integration methods for holistic brain connectivity mapping. Medical Image Analysis, 85:102741. DOI: 10.1016/j.media.2023.102741. Apr 2023.
  • (2023) Vehicle detection systems for intelligent driving using deep convolutional neural networks. Discover Artificial Intelligence, 3(1). DOI: 10.1007/s44163-023-00062-8. 2 May 2023.
  • (2022) Kernel Matrix-Based Heuristic Multiple Kernel Learning. Mathematics, 10(12):2026. DOI: 10.3390/math10122026. 11 Jun 2022.
  • (2022) Alternating Minimization-Based Sparse Least-Squares Classifier for Accuracy and Interpretability Improvement of Credit Risk Assessment. International Journal of Information Technology & Decision Making, 22(1):537-567. DOI: 10.1142/S0219622022500444. 25 Aug 2022.
  • (2022) Embedded Feature Selection Based on Relevance Vector Machines With an Approximated Marginal Likelihood and its Industrial Application. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 52(4):2601-2614. DOI: 10.1109/TSMC.2021.3049597. Apr 2022.
  • (2022) Multiple Kernel Subspace Learning for Clustering and Classification. IEEE Transactions on Knowledge and Data Engineering, 1-14. DOI: 10.1109/TKDE.2022.3200723. 2022.
  • (2022) Feature selection for kernel methods in systems biology. NAR Genomics and Bioinformatics, 4(1). DOI: 10.1093/nargab/lqac014. 7 Mar 2022.
