Extreme Algorithm Selection with Dyadic Feature Representation

  • Conference paper
  • Discovery Science (DS 2020)

Abstract

Algorithm selection (AS) deals with the automatic selection of the algorithm from a fixed set of candidates that is most suitable for a specific instance of an algorithmic problem class, e.g., choosing a solver for a SAT problem. Benchmark suites for AS usually comprise candidate sets of at most tens of algorithms, whereas in algorithm configuration (AC) and combined algorithm selection and hyperparameter optimization (CASH) problems the number of candidates becomes intractable, which impedes learning effective meta-models and thus requires costly online performance evaluations. In this paper, we propose the setting of extreme algorithm selection (XAS), which, despite assuming limited time resources and hence excluding online evaluations at prediction time, allows for considering thousands of candidate algorithms and thereby facilitates meta-learning. We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic representation, in which both problem instances and algorithms are described in terms of feature vectors. We find this approach to significantly improve over the current state of the art with respect to various metrics.
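To make the dyadic representation more concrete, the sketch below illustrates one way such an approach can be set up: a single surrogate model is trained on concatenated instance and algorithm feature vectors and, at prediction time, scores every candidate algorithm for a new instance without any online evaluations. This is a minimal illustration using synthetic data and a scikit-learn random-forest regressor; the data, feature dimensions, and model choice are assumptions made for exposition, not the authors' implementation (see the repository linked in the notes for that).

```python
# Minimal sketch of algorithm selection with a dyadic feature representation:
# both problem instances and candidate algorithms are described by feature
# vectors, and one surrogate model is trained on their joint representation.
# Synthetic data and illustrative dimensions; not the authors' code.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

n_instances, n_algorithms = 50, 1000   # XAS: thousands of candidates
d_inst, d_algo = 10, 8                 # assumed feature dimensions

# Feature vectors for instances (e.g., dataset meta-features) and algorithms
# (e.g., encodings of learner type and hyperparameter settings).
X_inst = rng.normal(size=(n_instances, d_inst))
X_algo = rng.normal(size=(n_algorithms, d_algo))

# Offline meta-data: observed performance for a sparse set of
# (instance, algorithm) pairs.
pairs = [(i, a) for i in range(n_instances)
         for a in rng.choice(n_algorithms, size=20, replace=False)]
X_train = np.array([np.concatenate([X_inst[i], X_algo[a]]) for i, a in pairs])
y_train = rng.uniform(size=len(pairs))  # stand-in for measured performance

# Surrogate model over the dyadic (joint) feature space.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Prediction time (no online evaluations): score every candidate algorithm
# for a new instance and select the one with the highest predicted performance.
x_new = rng.normal(size=d_inst)
dyads = np.hstack([np.tile(x_new, (n_algorithms, 1)), X_algo])
scores = model.predict(dyads)
best_algorithm = int(np.argmax(scores))
print("selected algorithm index:", best_algorithm)
```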

Notes

  1. https://github.com/alexandertornede/extreme_algorithm_selection.

  2. https://docs.openml.org/benchmark/#openml-cc18 (excluding datasets 554, 40923, 40927, 40996 due to technical issues).

  3. The standard deviation of the performance values per dataset is 0.101 on average, with a minimum of 0.0064 and a maximum of 0.33.

Acknowledgements

This work was supported by the German Research Foundation (DFG) within the Collaborative Research Center “On-The-Fly Computing” (SFB 901/3, project no. 160364472) and by the Paderborn Center for Parallel Computing (PC²) through computing time.

Author information

Corresponding author

Correspondence to Alexander Tornede.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Tornede, A., Wever, M., Hüllermeier, E. (2020). Extreme Algorithm Selection with Dyadic Feature Representation. In: Appice, A., Tsoumakas, G., Manolopoulos, Y., Matwin, S. (eds) Discovery Science. DS 2020. Lecture Notes in Computer Science, vol 12323. Springer, Cham. https://doi.org/10.1007/978-3-030-61527-7_21

  • DOI: https://doi.org/10.1007/978-3-030-61527-7_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-61526-0

  • Online ISBN: 978-3-030-61527-7

  • eBook Packages: Computer Science, Computer Science (R0)
