Abstract
We consider the classification problem on a finite set of objects. Some of them are labeled, and the task is to predict the labels of the remaining unlabeled ones. Such an estimation problem is generally referred to as transductive inference. It is well known that many meaningful inductive or supervised methods can be derived from a regularization framework that minimizes a loss function plus a regularization term. In the same spirit, we propose a general discrete regularization framework defined on finite object sets, which can be thought of as a discrete analogue of classical regularization theory. A family of transductive inference schemes is then systematically derived from this framework, including our earlier algorithm for transductive inference, with which we obtained encouraging results on many practical classification problems. The framework is built on the discrete analysis and geometry that we develop, in which discrete differential operators of various orders are constructed as analogues of their counterparts in the continuous case.
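To make the setting concrete, the sketch below illustrates one representative transductive scheme of the kind the abstract describes: labels are propagated over a similarity graph by minimizing a quadratic smoothness term plus a fit-to-labels term, in the spirit of the authors' earlier local-and-global-consistency algorithm. The RBF affinity, the parameter names sigma and alpha, and the convention that unlabeled points carry the label -1 are illustrative assumptions rather than details taken from the paper.

    import numpy as np

    def transductive_labels(X, y, sigma=1.0, alpha=0.9):
        # Pairwise squared distances and an RBF affinity matrix W (zero diagonal),
        # playing the role of edge weights on the object graph.
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
        W = np.exp(-d2 / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)

        # Symmetric normalization S = D^{-1/2} W D^{-1/2}, where D is the degree matrix.
        d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
        S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

        # One-hot label matrix Y; rows of unlabeled points (y == -1) stay zero.
        classes = np.unique(y[y >= 0])
        Y = np.zeros((X.shape[0], classes.size))
        for j, c in enumerate(classes):
            Y[y == c, j] = 1.0

        # Closed-form minimizer of the quadratic cost (smoothness plus fit to Y):
        # F = (I - alpha * S)^{-1} Y; each object takes the class with the largest score.
        F = np.linalg.solve(np.eye(X.shape[0]) - alpha * S, Y)
        return classes[F.argmax(axis=1)]

As a usage note, transductive_labels(X, y) expects y to hold nonnegative class indices for the labeled objects and -1 elsewhere, and returns a predicted class for every object in X.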
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Zhou, D., Schölkopf, B. (2005). Regularization on Discrete Spaces. In: Kropatsch, W.G., Sablatnig, R., Hanbury, A. (eds) Pattern Recognition. DAGM 2005. Lecture Notes in Computer Science, vol 3663. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11550518_45
DOI: https://doi.org/10.1007/11550518_45
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28703-2
Online ISBN: 978-3-540-31942-9
eBook Packages: Computer Science (R0)