Abstract
Heart rate is one of the most important physiological parameters of the human body, and its measurement directly reflects a person's physiological health. Existing non-contact heart rate detection methods mostly rely on a single type of sensor and are therefore easily affected by changes in the external environment. We propose a non-contact heart rate detection method based on an image fusion algorithm. The heart rate detection model is built on the principle of photoplethysmography (PPG), and the heart rate value is calculated from information extracted from a video containing human faces. Image fusion is performed by a neural network whose backbone is an auto-encoder with a double-branch structure. Infrared and visible images are fed into the network, which extracts their detail features and semantic features and merges them in a fusion layer; the decoder then reconstructs the fused image from the fused features. An infrared camera and a visible-light camera record a video simultaneously, the two videos are fused frame by frame, and the fused video is passed to the heart rate detection model to obtain the final heart rate value. The experimental results show that the proposed method agrees more closely with contact heart rate measurements; at a distance of 3.5 m, the measurement error is 1.2%. Taking the readings of a contact device as the reference, our algorithm achieves higher accuracy and has the potential to replace common contact heart rate detection equipment.
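To make the pipeline concrete, the listing below is a minimal sketch (a reader's illustration, not the authors' code) of the two stages described in the abstract: a dual-branch auto-encoder that fuses an infrared frame with the synchronized visible frame, and a simple PPG-based heart-rate estimate computed from the fused face-region frames. The layer widths, the element-wise-mean fusion rule, the 0.7–4 Hz cardiac band, and all identifiers are illustrative assumptions; the sketch uses PyTorch and NumPy.

# Minimal sketch (not the authors' implementation): dual-branch IR/visible
# fusion network plus a simple rPPG heart-rate estimate from fused frames.
# Layer sizes, the mean-based fusion rule and the band limits are assumptions.
import numpy as np
import torch
import torch.nn as nn

class Branch(nn.Module):
    """One encoder branch; the 'detail' and 'semantic' branches share this form."""
    def __init__(self, in_ch=1, ch=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
        )
    def forward(self, x):
        return self.net(x)

class DualBranchFusionNet(nn.Module):
    """Encoder with a detail branch and a semantic branch, a fusion layer that
    merges the two modalities' features, and a decoder that reconstructs the
    fused image from the fused features."""
    def __init__(self, ch=16):
        super().__init__()
        self.detail = Branch(1, ch)     # shallow branch: fine textures
        self.semantic = Branch(1, ch)   # second branch: global structure
        self.decoder = nn.Sequential(
            nn.Conv2d(2 * ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 1, 3, padding=1), nn.Sigmoid(),
        )
    def encode(self, x):
        return torch.cat([self.detail(x), self.semantic(x)], dim=1)
    def forward(self, ir, vis):
        # Fusion layer: element-wise mean of the two modalities' features
        # (an assumed rule; the paper's fusion strategy may differ).
        fused_feat = 0.5 * (self.encode(ir) + self.encode(vis))
        return self.decoder(fused_feat)

def estimate_heart_rate(frames, fps=30.0, low=0.7, high=4.0):
    """Estimate heart rate (bpm) from fused face-ROI frames via the PPG
    principle: average the ROI intensity per frame, restrict the spectrum to
    the cardiac band (0.7-4 Hz, i.e. 42-240 bpm), and take the dominant peak."""
    signal = frames.reshape(len(frames), -1).mean(axis=1)
    signal = signal - signal.mean()
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(signal))
    band = (freqs >= low) & (freqs <= high)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

if __name__ == "__main__":
    net = DualBranchFusionNet()
    ir = torch.rand(1, 1, 128, 128)    # one infrared frame (grayscale)
    vis = torch.rand(1, 1, 128, 128)   # the synchronized visible frame
    fused = net(ir, vis)               # -> (1, 1, 128, 128) fused frame
    # Stand-in for a real fused video: 10 s of frames at 30 fps. In practice
    # each frame pair of the two recordings is fused and stacked in order.
    frame = fused.detach().numpy()[0, 0]
    fused_video = np.stack([frame] * 300)
    print("fused frame:", tuple(fused.shape),
          "estimated HR (bpm): %.1f" % estimate_heart_rate(fused_video))

In use, the face region would be located in each fused frame, the per-frame ROI signal collected over a window of several seconds, and the FFT peak within the cardiac band converted to beats per minute, mirroring the frame-by-frame fusion followed by PPG-based estimation described above.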
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Wei, J., Zou, J., Li, J., Li, Z., Yang, X. (2022). Non-contact Heart Rate Detection Based on Fusion Method of Visible Images and Infrared Images. In: Sun, X., Zhang, X., Xia, Z., Bertino, E. (eds) Artificial Intelligence and Security. ICAIS 2022. Lecture Notes in Computer Science, vol 13339. Springer, Cham. https://doi.org/10.1007/978-3-031-06788-4_6
DOI: https://doi.org/10.1007/978-3-031-06788-4_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-06787-7
Online ISBN: 978-3-031-06788-4
eBook Packages: Computer Science, Computer Science (R0)