
Bayesian interpretation to generalize adaptive mean shift algorithm

Published: 01 January 2016

Abstract

The Adaptive Mean Shift (AMS) algorithm is a popular and simple non-parametric clustering approach based on kernel density estimation. In this paper, the AMS is reformulated in a Bayesian framework, which permits a natural generalization in several directions and is shown to improve performance. The Bayesian framework interprets the AMS as a method for obtaining a posterior mode. This allows the algorithm to be generalized with three components absent from the conventional approach: node weights, a prior for a particular location, and a posterior distribution for the bandwidth. Practical methods of constructing each of the three components are considered.
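The paper itself provides no code, but the baseline it generalizes can be sketched briefly. Below is a minimal adaptive mean shift in Python, using a per-point bandwidth set to the distance to the k-th nearest neighbour (a common sample-point estimator) and a Gaussian kernel. The function name, the k-NN bandwidth rule, and all parameter defaults are illustrative assumptions, not the authors' specific formulation.

```python
import numpy as np

def adaptive_mean_shift(X, k=5, tol=1e-5, max_iter=300):
    """Adaptive mean shift with sample-point bandwidths.

    Each data point x_j carries its own bandwidth h_j, here taken as
    the distance to its k-th nearest neighbour (an illustrative choice,
    not necessarily the paper's). Every point is then shifted to the
    weighted mean of the data until convergence, yielding one density
    mode per point; points sharing a mode form a cluster.
    """
    n, dim = X.shape
    # Pairwise distances; row i, column j is ||x_i - x_j||.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Sorted row: index 0 is the point itself (distance 0),
    # so index k is the k-th nearest neighbour.
    h = np.sort(d, axis=1)[:, k]
    modes = X.copy()
    for i in range(n):
        y = X[i].copy()
        for _ in range(max_iter):
            # Sample-point KDE weights: (1/h_j^{d+2}) * Gaussian kernel.
            w = np.exp(-0.5 * (np.linalg.norm(X - y, axis=1) / h) ** 2) \
                / h ** (dim + 2)
            y_new = (w[:, None] * X).sum(axis=0) / w.sum()
            if np.linalg.norm(y_new - y) < tol:
                break
            y = y_new
        modes[i] = y
    return modes
```

The Bayesian generalization described in the abstract would, roughly, multiply each term of `w` by a node weight and modulate the update with a location prior and a bandwidth posterior instead of the fixed k-NN `h`.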



Published In

Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology, Volume 30, Issue 6 (2016), 670 pages

Publisher

IOS Press

Netherlands


Author Tags

  1. Adaptive mean shift algorithm
  2. kernel density estimation

Qualifiers

  • Research-article
