Abstract
Hardware design for object detection is an emerging field within VLSI design. With the growing popularity of artificial intelligence and machine learning, object detection and visual tracking have become increasingly important. These techniques are computationally complex and frequently demand a highly parallelized approach to the underlying algorithms, so a general-purpose CPU core is not well suited to them. GPUs and ASICs designed for such parallel workloads are therefore usually required, and with this growing popularity and complexity, developing dedicated hardware capable of processing such tasks has become feasible. A visual tracking system is complex and comprises many parts. This chapter proposes a system in which that complexity is fully defined and each part is discussed in detail. The FPGA and the architecture required by the system are also covered, including the different algorithms available for hardware-based tracking. Each technique is thoroughly discussed, and a hardware/software co-design for implementing these algorithms is proposed. The chapter outlines the benefits and shortcomings of each algorithm, enabling the reader to create complex systems that are both efficient and fast, and also discusses new implementations for systems using multiple cameras and views.
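To illustrate why such workloads favor parallel hardware over a sequential CPU core, the minimal sketch below (not taken from the chapter; image and template sizes are illustrative assumptions) scores every candidate position of a template inside a frame with a sum-of-absolute-differences (SAD) match. Each candidate score is independent of the others, which is exactly the structure an FPGA or ASIC can exploit by evaluating many positions per clock cycle.

```c
/*
 * Minimal sketch (assumed example, not the chapter's design):
 * SAD-based template matching, a common building block of hardware
 * visual trackers. Every candidate (x, y) is scored independently,
 * so the search maps naturally onto parallel FPGA/ASIC logic.
 */
#include <stdint.h>
#include <stdio.h>
#include <limits.h>

#define IMG_W 64   /* frame width  (assumed for the example) */
#define IMG_H 64   /* frame height (assumed for the example) */
#define TPL_W 8    /* template width                          */
#define TPL_H 8    /* template height                         */

/* Score one candidate position: lower SAD means a better match. */
static uint32_t sad_at(const uint8_t frame[IMG_H][IMG_W],
                       const uint8_t tpl[TPL_H][TPL_W],
                       int x, int y)
{
    uint32_t sad = 0;
    for (int ty = 0; ty < TPL_H; ty++)
        for (int tx = 0; tx < TPL_W; tx++) {
            int d = (int)frame[y + ty][x + tx] - (int)tpl[ty][tx];
            sad += (uint32_t)(d < 0 ? -d : d);
        }
    return sad;
}

/* Exhaustive search over the frame; in hardware, many candidate
 * positions can be evaluated concurrently instead of sequentially. */
static void track(const uint8_t frame[IMG_H][IMG_W],
                  const uint8_t tpl[TPL_H][TPL_W],
                  int *best_x, int *best_y)
{
    uint32_t best = UINT32_MAX;
    for (int y = 0; y <= IMG_H - TPL_H; y++)
        for (int x = 0; x <= IMG_W - TPL_W; x++) {
            uint32_t s = sad_at(frame, tpl, x, y);
            if (s < best) { best = s; *best_x = x; *best_y = y; }
        }
}

int main(void)
{
    static uint8_t frame[IMG_H][IMG_W];
    static uint8_t tpl[TPL_H][TPL_W];

    /* Synthetic data: a bright 8x8 patch at (20, 30) and a matching template. */
    for (int y = 0; y < TPL_H; y++)
        for (int x = 0; x < TPL_W; x++) {
            tpl[y][x] = 200;
            frame[30 + y][20 + x] = 200;
        }

    int bx = 0, by = 0;
    track(frame, tpl, &bx, &by);
    printf("best match at (%d, %d)\n", bx, by);  /* expected: (20, 30) */
    return 0;
}
```

In a software implementation the two search loops run one candidate at a time; a hardware tracker would instead unroll them across dedicated SAD units, which is the kind of design trade-off the chapter examines.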
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this chapter
Cite this chapter
Sharma, M., Bhatnagar, E. (2023). Hardware Design Aspects of Visual Tracking System. In: Kumar, A., Jain, R., Vairamani, A.D., Nayyar, A. (eds) Object Tracking Technology. Contributions to Environmental Sciences & Innovative Business Technology. Springer, Singapore. https://doi.org/10.1007/978-981-99-3288-7_6
DOI: https://doi.org/10.1007/978-981-99-3288-7_6
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-3287-0
Online ISBN: 978-981-99-3288-7