Abstract
Kernel machines are widely known as powerful tools in various fields of information science. In general, they are designed according to a generalization criterion related to model complexity, combined with an intuitive but ad hoc principle such as the maximal-margin principle used in SVMs. The projection learning scheme, by contrast, was proposed in the field of neural networks; in projection learning, generalization ability is evaluated by the distance between the unknown target function and the estimated one. In this paper, we construct kernel machines based on projection learning and propose a method for designing a kernel function with the representability required by the task. The method reduces to the selection of an appropriate reproducing kernel Hilbert space from a series of monotone increasing subspaces. We also verify the efficacy of the proposed method with numerical examples.
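The core idea of the abstract, choosing a reproducing kernel Hilbert space from a monotone increasing series of subspaces, can be illustrated with a hedged sketch. This is not the authors' algorithm; it only shows the nested-subspace selection principle using polynomial kernels, whose RKHSs H_1 ⊂ H_2 ⊂ … grow with the degree, and a plain regularized least-squares estimate with holdout error as the selection criterion (the paper instead uses a projection-learning criterion):

```python
# Hedged sketch (assumed setup, not the paper's method): selecting an RKHS
# from a nested family.  Polynomial kernels k_d(x, x') = (1 + x x')^d induce
# a monotone increasing series of RKHSs, so choosing the degree d amounts to
# choosing a subspace with enough representability for the target function.
import numpy as np

rng = np.random.default_rng(0)

def poly_kernel(X, Y, d):
    """Polynomial kernel of degree d; its RKSH H_d is nested: H_d ⊂ H_{d+1}."""
    return (1.0 + np.outer(X, Y)) ** d

def fit(X, y, d, lam=1e-3):
    """Regularized least-squares estimate in H_d (stand-in for the
    projection-learning estimator used in the paper)."""
    K = poly_kernel(X, X, d)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, d):
    return poly_kernel(X_new, X_train, d) @ alpha

# Toy target: a cubic, so H_3 (and every larger subspace) can represent it.
X = rng.uniform(-1, 1, 40)
y = X**3 - 0.5 * X + 0.05 * rng.standard_normal(40)
X_val = rng.uniform(-1, 1, 200)
y_val = X_val**3 - 0.5 * X_val

# Walk up the series of subspaces and score each candidate RKHS.
errors = []
for d in range(1, 7):
    alpha = fit(X, y, d)
    errors.append(float(np.mean((predict(X, alpha, X_val, d) - y_val) ** 2)))

best_d = 1 + int(np.argmin(errors))  # degree with enough representability
print(best_d, errors)
```

In this toy run the error drops sharply once the subspace is large enough to contain the cubic target, which is the kind of behavior the subspace-selection method is designed to detect.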
© 2004 Springer-Verlag Berlin Heidelberg
Tanaka, A., Takigawa, I., Imai, H., Kudo, M., Miyakoshi, M. (2004). Projection Learning Based Kernel Machine Design Using Series of Monotone Increasing Reproducing Kernel Hilbert Spaces. In: Negoita, M.G., Howlett, R.J., Jain, L.C. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2004. Lecture Notes in Computer Science, vol 3213. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30132-5_143
Print ISBN: 978-3-540-23318-3
Online ISBN: 978-3-540-30132-5