Abstract
We have previously (MCS 2001) presented a mathematical metaphor that establishes an equivalence between multiple expert fusion and the process of tomographic reconstruction familiar from medical imaging. The discussion, however, was confined to a restricted case: namely, classifiers defined on discrete feature sets. This sequel paper therefore seeks to extend the methodology to the fully general case.
The investigation is therefore conducted initially within the context of classical feature selection (that is, selection algorithms that place no restriction on the overlap of feature sets). The findings demonstrate that the role of feature selection must be re-evaluated when it is carried out within an explicitly combinatorial framework. Fully developed, this line of investigation leads naturally to a completely generalised, morphologically optimal strategy for classifier combination.
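To make the analogy concrete, the following minimal sketch (illustrative only, and not the paper's algorithm) reflects the correspondence developed in the MCS 2001 paper: each classifier's class-conditional density over its own feature set acts as a projection (marginal) of the unknown joint density, and sum-rule fusion of classifiers defined on distinct features behaves like unfiltered back-projection of those marginals. The two-feature Gaussian density, grid size, and additive back-projection below are assumptions made purely for illustration.

import numpy as np

# Hypothetical joint class-conditional density over two features
# (the ground truth, which in practice is never observed directly).
x = np.linspace(-3, 3, 64)
xx, yy = np.meshgrid(x, x, indexing="ij")
joint = np.exp(-0.5 * (xx**2 + (yy - 1.0)**2))
joint /= joint.sum()

# Each "classifier" sees only the marginal over its own feature set.
p_x = joint.sum(axis=1)   # projection onto feature 1
p_y = joint.sum(axis=0)   # projection onto feature 2

# Sum-rule fusion of the two experts corresponds to unfiltered
# back-projection: smear each marginal along the unobserved axis and add.
back_projection = p_x[:, None] + p_y[None, :]
back_projection /= back_projection.sum()

# The residual blur is what a filtered, "morphologically optimal"
# combination strategy would aim to remove.
print("L1 error of unfiltered back-projection:",
      np.abs(back_projection - joint).sum())

The point of the sketch is only the structural parallel: fusing marginal densities by summation reconstructs a blurred version of the joint density, just as unfiltered back-projection blurs a tomographic image.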
References
D. Windridge, J. Kittler, "Classifier Combination as a Tomographic Process", Multiple Classifier Systems (MCS 2001), Lecture Notes in Computer Science, vol. 2096, 2001.
R. A. Jacobs, "Methods for combining experts' probability assessments", Neural Computation, vol. 3, pp. 79–87, 1991.
J. Kittler, M. Hatef, R. P. W. Duin, J. Matas, "On combining classifiers", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 3, pp. 226–239, 1998.
L. Lam, C. Y. Suen, "Optimal combinations of pattern classifiers", Pattern Recognition Letters, vol. 16, no. 9, pp. 945–954, 1995.
A. F. R. Rahman, M. C. Fairhurst, "An evaluation of multi-expert configurations for the recognition of handwritten numerals", Pattern Recognition Letters, vol. 31, pp. 1255–1273, 1998.
A. F. R. Rahman, M. C. Fairhurst, "A new hybrid approach in combining multiple experts to recognise handwritten numerals", Pattern Recognition Letters, vol. 18, pp. 781–790, 1997.
K. Woods, W. P. Kegelmeyer, K. Bowyer, "Combination of multiple classifiers using local accuracy estimates", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, pp. 405–410, 1997.
R. Neal, "Probabilistic inference using Markov chain Monte Carlo methods", Technical Report CRG-TR-93-1, Department of Computer Science, University of Toronto, 1993.
L. Breiman, "Bagging predictors", Machine Learning, vol. 24, no. 2, pp. 123–140, 1996.
H. Drucker, C. Cortes, L. D. Jackel, Y. LeCun, V. Vapnik, "Boosting and other ensemble methods", Neural Computation, vol. 6, no. 6, pp. 1289–1301, 1994.
D. Windridge, J. Kittler, "Combined Classifier Optimisation via Feature Selection", Advances in Pattern Recognition, Joint IAPR International Workshops SSPR 2000 and SPR 2000, Alicante, Spain, August 30 – September 1, 2000, Lecture Notes in Computer Science, vol. 1876.
T. G. Dietterich, G. Bakiri, "Solving Multiclass Learning Problems via Error-Correcting Output Codes", Journal of Artificial Intelligence Research, vol. 2, pp. 263–286, 1995.
D. Windridge, J. Kittler, "Combined Classifier Optimisation via Feature Selection", Advances in Pattern Recognition, Joint IAPR International Workshops SSPR 2000 and SPR 2000, Alicante, Spain, August 30 – September 1, 2000, Lecture Notes in Computer Science, vol. 1876.
D. Windridge, J. Kittler, "A Generalised Solution to the Problem of Multiple Expert Fusion", University of Surrey Technical Report VSSP-TR-5/2000.
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Windridge, D., Kittler, J. (2002). On the General Application of the Tomographic Classifier Fusion Methodology. In: Roli, F., Kittler, J. (eds) Multiple Classifier Systems. MCS 2002. Lecture Notes in Computer Science, vol 2364. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45428-4_15
DOI: https://doi.org/10.1007/3-540-45428-4_15
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43818-2
Online ISBN: 978-3-540-45428-1