
A novel kernelized correlation filter by fusing multiple feature response maps, enhanced target re-detection, and improved model updating for visual tracking

  • Original article
  • Published in The Visual Computer

Abstract

Visual target tracking remains a challenging research problem due to many unpredictable conditions (e.g., complete occlusion, out-of-view motion, and fast motion). This paper proposes a novel kernelized correlation filter architecture based on the fusion of multiple feature response maps (FRMs), enhanced target re-detection, and improved online model updating. First, the multi-feature fusion strategy shows that complementary feature information effectively improves the target recognition ability. Meanwhile, a novel target re-detection module excels at relocating the tracked target and mitigating the model drift problem. Moreover, most existing trackers indiscriminately update the model even when predictions are inaccurate, eventually inducing undesired model drift and decaying tracking performance. Inspired by the average peak-to-correlation energy (APCE), we apply an improved model updating scheme that ascertains the relative fluctuation level of the FRMs. This updating method rejects deteriorated samples and improves the discriminative power of the model. As evaluated on four popular tracking benchmark datasets (OTB-2013, OTB-2015, UAV123, and LaSOT), and benefiting from the proposed methods, our tracker achieves better effectiveness and efficiency than 28 state-of-the-art competitors (17 correlation filter-based and 11 deep learning-based). Meanwhile, the proposed tracker meets the requirements of limited storage space and real-time performance.
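The paper's exact fusion and updating rules are given in the full text; what follows is only a minimal sketch, in Python with NumPy, of the two confidence-related ideas the abstract names: a weighted fusion of per-feature response maps and an average peak-to-correlation energy (APCE) gate on the model update. All function names, fusion weights, and thresholds below are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np


def apce(response):
    """Average peak-to-correlation energy (APCE) of one response map.

    APCE = |F_max - F_min|^2 / mean((F - F_min)^2).
    A sharp, unimodal peak gives a large APCE (confident detection); a flat
    or multi-peaked map (occlusion, drift) gives a small one.
    """
    f_min, f_max = response.min(), response.max()
    return (f_max - f_min) ** 2 / (np.mean((response - f_min) ** 2) + 1e-12)


def fuse_response_maps(maps, weights=None):
    """Weighted sum of per-feature response maps (e.g., HOG and colour names).

    A simple linear fusion; the paper's actual fusion strategy may differ.
    """
    stacked = np.stack(maps, axis=0)                 # (k, H, W)
    if weights is None:
        weights = np.full(len(maps), 1.0 / len(maps))
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                # normalise to sum to 1
    return np.tensordot(weights, stacked, axes=1)    # (H, W)


def should_update_model(response, apce_hist, peak_hist, beta1=0.6, beta2=0.6):
    """Confidence-gated model update (illustrative thresholds).

    Update only when the current APCE and peak value both reach a fraction
    (beta1, beta2) of their historical means, so deteriorated samples are
    rejected instead of being learned into the filter.
    """
    cur_apce, cur_peak = apce(response), response.max()
    ok = (not apce_hist
          or (cur_apce >= beta1 * np.mean(apce_hist)
              and cur_peak >= beta2 * np.mean(peak_hist)))
    apce_hist.append(cur_apce)
    peak_hist.append(cur_peak)
    return ok


if __name__ == "__main__":
    # Synthetic demo: a sharp Gaussian peak passes the gate, a flat map does not.
    rng = np.random.default_rng(0)
    yy, xx = np.mgrid[0:64, 0:64]
    sharp = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 20.0)
    flat = rng.uniform(0.4, 0.6, size=(64, 64))
    apce_hist, peak_hist = [], []
    fused = fuse_response_maps([sharp, sharp], weights=[0.7, 0.3])
    print(should_update_model(fused, apce_hist, peak_hist))  # True (first frame)
    print(should_update_model(flat, apce_hist, peak_hist))   # False (low APCE)
```

Gating on both the APCE and the peak value is the heuristic commonly associated with the APCE measure: a flat or multi-peaked response map, typical under occlusion or out-of-view motion, fails the test, so the filter is not contaminated by that frame.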



Acknowledgements

This research was supported by the National Natural Science Foundation of China (No. 62001149) and the Key R&D Program of Zhejiang Province (No. 2020C03098).

Author information


Corresponding author

Correspondence to Zhiwei He.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Du, C., Ji, Z., Dong, Z. et al. A novel kernelized correlation filter by fusing multiple feature response maps, enhanced target re-detection, and improved model updating for visual tracking. Vis Comput 38, 1883–1900 (2022). https://doi.org/10.1007/s00371-021-02247-7

