
Illumination consistency based on single low dynamic range images

Published in: Multimedia Tools and Applications

Abstract

In this paper, an illumination-consistency solution is proposed that generates a high dynamic range (HDR) image from a single low dynamic range (LDR) image and uses the result to relight virtual objects. The solution reduces image acquisition and processing time and mitigates the limitations of image acquisition equipment. It consists of three stages: image preprocessing, HDR image generation, and virtual object relighting. First, in the preprocessing stage, a wavelet denoising method based on a Gaussian mixture model removes image noise while avoiding distortion of image detail. The inverse camera response function is then used to linearize the image, an inverse tone-mapping function expands the pixel brightness range, and threshold segmentation combined with flood-fill Gaussian smoothing computes a highlight-spread map that compensates for scene highlights lost during capture. Finally, the expanded dynamic range image is linearly interpolated with the highlight-spread map to obtain the HDR image. Analysis and experimental simulation show that, compared with other methods, using a single LDR image greatly reduces acquisition and processing time and the demands on capture equipment while maintaining good light fusion, and that the efficiency of illumination-consistency processing is improved.
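As a rough illustration of the HDR-generation stage described above (linearization via an inverse camera response function, brightness-range expansion, and linear interpolation with a highlight-spread map), the following sketch implements a much-simplified version of that pipeline. It is not the paper's method: the denoising stage is omitted, the inverse CRF is approximated by a fixed gamma curve, and the `boost`, `threshold`, and `sigma` parameters are illustrative assumptions rather than the paper's calibrated values.

```python
import numpy as np


def _gaussian_kernel(sigma):
    """1-D normalized Gaussian kernel with a 3-sigma radius."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()


def gaussian_blur(img, sigma):
    """Separable Gaussian blur of a 2-D array with edge padding."""
    k = _gaussian_kernel(sigma)
    pad = len(k) // 2
    out = np.pad(img, pad, mode="edge")
    # Convolve each column, then each row; 'valid' mode undoes the padding.
    out = np.apply_along_axis(lambda r: np.convolve(r, k, "valid"), 0, out)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, "valid"), 1, out)
    return out


def expand_dynamic_range(ldr, gamma=2.2, max_lum=1000.0):
    """Linearize via an assumed gamma-model inverse CRF, then scale the
    brightness range (a crude stand-in for inverse tone mapping)."""
    linear = np.clip(ldr, 0.0, 1.0) ** gamma
    return linear * max_lum


def highlight_spread_map(ldr, threshold=0.95, sigma=5.0):
    """Threshold near-saturated pixels and smooth the binary mask,
    approximating the paper's highlight-spread map."""
    lum = ldr.mean(axis=-1)
    mask = (lum > threshold).astype(np.float64)
    return gaussian_blur(mask, sigma)


def ldr_to_hdr(ldr, gamma=2.2, max_lum=1000.0, threshold=0.95,
               sigma=5.0, boost=4.0):
    """Blend the expanded image with a highlight-boosted copy, using
    the highlight-spread map as linear interpolation weights."""
    expanded = expand_dynamic_range(ldr, gamma, max_lum)
    w = highlight_spread_map(ldr, threshold, sigma)[..., None]
    return (1.0 - w) * expanded + w * (boost * expanded)
```

Using the smoothed highlight map as interpolation weights means the extra brightness falls off gradually around saturated regions instead of producing a hard edge, which mirrors the role the highlight-spread map plays in the paper's linear interpolation step.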





Acknowledgements

This work is supported by the National Natural Science Foundation of China (No. 61502185) and the Fundamental Research Funds for the Central Universities (No. 2017KFYXJJ071).

Author information

Corresponding author

Correspondence to Dan Li.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Yuan, L., Hu, Y., Li, D. et al. Illumination consistency based on single low dynamic range images. Multimed Tools Appl 79, 3189–3215 (2020). https://doi.org/10.1007/s11042-018-6799-2

