Discovering Hierarchical Neural Archetype Sets

Chapter in: Progresses in Artificial Intelligence and Neural Systems

Abstract

In the field of machine learning, coresets are defined as subsets of the training set that can be used to obtain a good approximation of the behavior a given algorithm would exhibit on the whole training set. Advantages of using coresets instead of the full training set include faster training and a better human understanding of the dataset. Not surprisingly, coreset discovery is an active research line, with several notable contributions in the literature. Nevertheless, restricting the search for representative samples to the available data points might impair the final result. In this work, neural networks are used to create sets of virtual data points, named archetypes, with the objective of representing the information contained in a training set in the same way a coreset does. Starting from a given training set, a hierarchical clustering neural network is trained, and the weight vectors of its leaf nodes are used as archetypes on which classifiers are trained. Experimental results on several benchmarks show that the proposed approach is competitive with traditional coreset discovery techniques, delivering higher accuracy and a greater ability to generalize to unseen test data.
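The pipeline described in the abstract can be summarized in three steps: cluster the training data, take the leaf weight vectors as virtual archetypes, and fit the downstream classifier on the archetypes alone. The snippet below is a minimal sketch of that idea in Python; since the GH-EXIN hierarchical clustering network used in the chapter is not a standard library component, k-means centroids stand in for the leaf weight vectors, and archetype labels are assigned by majority vote within each cluster (both are assumptions made purely for illustration, not the authors' method).

    # Minimal sketch of the archetype idea (NOT the authors' GH-EXIN network):
    # k-means centroids stand in for the leaf weight vectors of the
    # hierarchical clustering network, purely for illustration.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # 1. Cluster the training set; the cluster centres play the role of
    #    virtual "archetypes" (they need not coincide with any real sample).
    n_archetypes = 15
    km = KMeans(n_clusters=n_archetypes, n_init=10, random_state=0).fit(X_train)
    archetypes = km.cluster_centers_

    # 2. Label each archetype with the majority class of its assigned points.
    archetype_labels = np.array(
        [np.bincount(y_train[km.labels_ == k]).argmax() for k in range(n_archetypes)]
    )

    # 3. Train the classifier on the archetypes only; evaluate on unseen data.
    clf = RandomForestClassifier(random_state=0).fit(archetypes, archetype_labels)
    print("test accuracy:", clf.score(X_test, y_test))

Any standard classifier could be plugged into step 3; the point is only that it is trained on the archetypes rather than on the original samples, and then evaluated on held-out test data.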

Notes

  1. scikit-learn: Machine Learning in Python, http://scikit-learn.org/stable/.


Author information

Correspondence to Pietro Barbiero.


Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Ciravegna, G., Barbiero, P., Cirrincione, G., Squillero, G., Tonda, A. (2021). Discovering Hierarchical Neural Archetype Sets. In: Esposito, A., Faundez-Zanuy, M., Morabito, F., Pasero, E. (eds) Progresses in Artificial Intelligence and Neural Systems. Smart Innovation, Systems and Technologies, vol 184. Springer, Singapore. https://doi.org/10.1007/978-981-15-5093-5_24
