Abstract
In computer vision, many real-world applications rely on detecting and tracking moving objects. One of the main challenges in these applications is tracking occluded objects: when two or more objects occlude each other, the tracker loses information and its performance degrades. This paper introduces a new method that addresses this problem using only one target image and without any classification or learning phase. A tracking system is built by combining chromatic co-occurrence matrices with a particle filter in order to estimate the position of the occluded target. Qualitative and quantitative studies show that the results obtained by the proposed approach are highly competitive with several state-of-the-art methods.
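The abstract only outlines the combination of chromatic co-occurrence matrices with a particle filter, so the following is a minimal illustrative sketch rather than the authors' implementation. The quantization to 8 levels, the R-G/R-B/G-B channel pairing, the Gaussian random-walk motion model, and the Bhattacharyya-style similarity are all assumptions made for this example; the paper's actual occlusion-handling logic is not reproduced here.

    # Minimal sketch (not the authors' exact method): a particle filter whose
    # likelihood compares chromatic co-occurrence descriptors of the target
    # and of candidate patches. All parameter choices below are illustrative.
    import numpy as np

    def chromatic_cooccurrence(patch, levels=8):
        """Co-occurrence counts between quantized color-channel pairs
        (R-G, R-B, G-B) over co-located pixels, returned as a normalized vector."""
        q = (patch.astype(np.float64) / 256.0 * levels).astype(int)  # quantize to [0, levels)
        mats = []
        for a, b in [(0, 1), (0, 2), (1, 2)]:
            m = np.zeros((levels, levels))
            np.add.at(m, (q[..., a].ravel(), q[..., b].ravel()), 1.0)
            mats.append(m)
        v = np.concatenate([m.ravel() for m in mats])
        return v / max(v.sum(), 1e-12)

    def similarity(v_ref, v_cand):
        """Bhattacharyya coefficient between two normalized co-occurrence vectors."""
        return np.sum(np.sqrt(v_ref * v_cand))

    def track_step(frame, particles, v_ref, patch_size, motion_std=8.0):
        """One predict / weight / resample cycle of the particle filter."""
        h, w = patch_size
        H, W = frame.shape[:2]
        # Predict: Gaussian random walk on the (x, y) patch centers.
        particles = particles + np.random.normal(0.0, motion_std, particles.shape)
        particles[:, 0] = np.clip(particles[:, 0], 0, W - w)
        particles[:, 1] = np.clip(particles[:, 1], 0, H - h)
        # Weight: co-occurrence similarity between each candidate patch and the reference.
        weights = np.empty(len(particles))
        for i, (x, y) in enumerate(particles.astype(int)):
            cand = frame[y:y + h, x:x + w]
            weights[i] = similarity(v_ref, chromatic_cooccurrence(cand))
        weights = np.maximum(weights, 1e-12)
        weights /= weights.sum()
        # Estimate: weighted mean of the particle positions.
        estimate = weights @ particles
        # Resample (multinomial) to avoid weight degeneracy.
        idx = np.random.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], estimate

In a tracker built on this sketch, v_ref would be computed once from the single target image given at initialization, and track_step would be called on each new frame; keeping the reference descriptor fixed (rather than updating it) is one simple way to avoid corrupting the model while the target is occluded, in the spirit of the learning-free setting described in the abstract.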
About this article
Cite this article
Elafi, I., Jedra, M. & Zahid, N. Tracking occluded objects using chromatic co-occurrence matrices and particle filter. SIViP 12, 1227–1235 (2018). https://doi.org/10.1007/s11760-018-1273-1