Underwater Optical-Sonar Image Fusion Systems
Figure 1. Two image coordinate systems: (a) Cartesian and (b) fan-shaped image coordinate systems of the optical and multi-beam sonar sensors.
Figure 2. Optical-sonar fusion system: (a) schematic design and (b) real hardware comprising two underwater cameras and one multi-beam sonar.
Figure 3. Schematic of the bracket equipped with a servo motor for tilting the imaging sensors.
Figure 4. Software for controlling the display and acquisition of optical and sonar images, operating the light, and enhancing and fusing both images.
Figure 5. Calibration phantom: (a) schematic and (b) real RGB phantom for geometric calibration of the underwater optical-sonar fusion system.
Figure 6. Experimental setup of the RGB phantom and optical-sonar fusion system for acquiring calibration data.
Figure 7. World coordinates $(W_x, W_y, W_z)$ and optical and sonar image coordinates $(X_n, Y_n;\; n = 1, 2, 3)$.
Figure 8. Corner detection numbering: (a) thirty-six optical and (b) eight sonar corner image points used to estimate the coordinate transformation matrices.
Figure 9. Single optical image enhancement: (a) original underwater optical image and (b) its enhanced version obtained by image fusion.
Figure 10. Single sonar image enhancement: (a) original sonar image and (b) its enhanced version obtained by a 5 × 5 median filter and gamma correction ($\gamma_2 = 0.2$); a code sketch of this step follows this figure list.
Figure 11. Enhanced and fused optical and sonar images taken at distances of 5 and 6 m from the fusion system: (a) enhanced optical images (top and bottom rows are images from cameras 1 and 2 at each distance, respectively), (b) enhanced sonar images, (c) optical color image overlaid on the sonar image.
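As a rough illustration of the sonar enhancement in Figure 10, below is a minimal Python/OpenCV sketch of a 5 × 5 median filter followed by gamma correction with $\gamma_2 = 0.2$. The 8-bit grayscale input and the file name are assumptions for illustration, not the authors' exact pipeline.

```python
import cv2
import numpy as np

def enhance_sonar(img_gray: np.ndarray, gamma: float = 0.2) -> np.ndarray:
    """Denoise a sonar image with a 5x5 median filter, then gamma-correct it."""
    # 5x5 median filter suppresses impulsive, speckle-like noise
    denoised = cv2.medianBlur(img_gray, 5)
    # Gamma correction via a 256-entry lookup table (assumes uint8 input);
    # gamma < 1 brightens weak acoustic returns
    lut = np.array([(i / 255.0) ** gamma * 255 for i in range(256)],
                   dtype=np.uint8)
    return cv2.LUT(denoised, lut)

# Hypothetical usage on a saved multi-beam sonar frame
sonar = cv2.imread("sonar_frame.png", cv2.IMREAD_GRAYSCALE)
enhanced = enhance_sonar(sonar, gamma=0.2)
```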
Abstract
1. Introduction
| Image Type | Enhancement Method [Reference] | Description |
|---|---|---|
| Optical image | Empirical mode decomposition [3] | Decompose the color spectrum components of underwater images and improve the images by applying different weights to those components |
| Optical image | CLAHE-mix [4] | Apply CLAHE to the image in the RGB and HSV color models and combine the two contrast-enhanced images by the Euclidean norm (see the sketch after this table) |
| Optical image | Image fusion [5] | Apply three successive steps: white balancing, contrast and edge enhancement, and fusion |
| Optical image | CLAHE-HF [6] | Enhance the contrast of underwater images by CLAHE and reduce noise by homomorphic filtering (HF) |
| Optical image | Red channel restoration model [7] | Apply a red channel model, a variation of DCP, to restore the most strongly attenuated red channel of the underwater image |
| Optical image | Underwater IFM-based algorithm [8] | Recover the original image with transmission maps determined for directly transmitted, forward-scattered, and backward-scattered light |
| Optical image | DCP and depth transmission map [9] | Fuse DCP and a depth map, which are the difference between the bright and dark channels and the difference in wavelength-dependent light absorption, respectively |
| Optical image | UGAN [10] | Train an underwater GAN (UGAN) on paired clean and underwater images to learn the difference between the pairs, then generate enhanced underwater images with the trained UGAN |
| Optical image | CNNs for estimating transmission and global ambient light [11] | Train two parallel CNN branches to estimate the blue-channel transmission map and the global ambient light signal |
| Optical image | FUnIE-GAN [12] | Train the fast underwater image enhancement GAN (FUnIE-GAN) to learn the global content, color, texture, and style of underwater images |
| Sonar image | Median filter [13] | Reduce noise in sonar images with a median filter |
| Sonar image | Gabor filter [14] | Improve edge signals in sonar images with a Gabor filter |
| Sonar image | NACA [15] | Apply an adaptive initialization algorithm to obtain a better initial clustering center, and a quantum-inspired shuffled frog leaping algorithm to update cultural individuals |
| Sonar image | CNN-based autoencoder [16] | Train an autoencoder on 13,650 multi-beam sonar images for resolution enhancement and denoising |
| Sonar image | GAN-based algorithm [17] | Train a GAN on pairs of high- and low-resolution sonar images for resolution enhancement |
| Sonar image | YOLO [18] | Train a you-only-look-once (YOLO) network on a dataset of sonar images containing crosstalk noise, then remove the detected crosstalk noise |
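Among the optical methods above, CLAHE-mix [4] is compact enough to outline in code. The following is a hedged Python/OpenCV sketch of the table's description (CLAHE in the RGB and HSV models, fused by the Euclidean norm); the clip limit, tile grid size, and the division by √2 that rescales the fused result back into 8 bits are assumptions, not parameters taken from [4].

```python
import cv2
import numpy as np

def clahe_mix(bgr: np.ndarray, clip: float = 2.0, tiles: int = 8) -> np.ndarray:
    """CLAHE applied in RGB and HSV, fused per pixel by the Euclidean norm."""
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=(tiles, tiles))

    # CLAHE on each RGB channel independently
    rgb_eq = cv2.merge([clahe.apply(c) for c in cv2.split(bgr)])

    # CLAHE on the V (value) channel of HSV, then convert back
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    hsv[..., 2] = clahe.apply(hsv[..., 2])
    hsv_eq = cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

    # Per-pixel Euclidean norm of the two enhanced images; dividing by
    # sqrt(2) keeps the result inside the 8-bit range
    fused = np.sqrt(rgb_eq.astype(np.float32) ** 2 +
                    hsv_eq.astype(np.float32) ** 2) / np.sqrt(2.0)
    return np.clip(fused, 0, 255).astype(np.uint8)
```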
2. Materials and Methods
2.1. Underwater Optical-Sonar Fusion System
2.2. Enhancement of Underwater Optical and Sonar Images
2.3. Calibration and Fusion of Underwater Optical-Sonar Fusion System
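Figures 7 and 8 indicate that the geometric calibration estimates coordinate transformation matrices from corner correspondences between the phantom's world coordinates and the optical and sonar image coordinates. As one hedged sketch of how such a planar transformation can be fit (an illustration, not necessarily the authors' formulation), OpenCV's homography estimation works from a handful of correspondences; the corner values below are made up for the example.

```python
import cv2
import numpy as np

# Known phantom corner positions on the calibration plane (world x, y in
# metres) and their detected pixel locations in one camera image.
# All values here are illustrative, not measured calibration data.
world_xy = np.array([[0.0, 0.0], [0.5, 0.0], [0.5, 0.5], [0.0, 0.5]],
                    dtype=np.float32)
image_xy = np.array([[412.0, 310.0], [905.0, 318.0],
                     [898.0, 800.0], [405.0, 792.0]], dtype=np.float32)

# Fit the 3x3 planar transformation (homography) mapping world coordinates
# to image coordinates; RANSAC rejects mismatched corner detections.
H, inlier_mask = cv2.findHomography(world_xy, image_xy, cv2.RANSAC, 5.0)

# Reproject the world corners through H to check the residual error
reprojected = cv2.perspectiveTransform(world_xy.reshape(-1, 1, 2), H)
residual = np.linalg.norm(reprojected.reshape(-1, 2) - image_xy, axis=1)
```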
3. Results
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Mobley, C.D. Light and Water: Radiative Transfer in Natural Waters; Academic Press: Cambridge, MA, USA, 1994.
- Blondel, P. The Handbook of Sidescan Sonar; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2010.
- Çelebi, A.T.; Ertürk, S. Visual enhancement of underwater images using empirical mode decomposition. Expert Syst. Appl. 2012, 39, 800–805.
- Hitam, M.S.; Awalludin, E.A.; Yussof, W.N.J.H.W.; Bachok, Z. Mixture contrast limited adaptive histogram equalization for underwater image enhancement. In Proceedings of the 2013 International Conference on Computer Applications Technology (ICCAT), Sousse, Tunisia, 20–22 January 2013; pp. 1–5.
- Ancuti, C.O.; Ancuti, C.; De Vleeschouwer, C.; Bekaert, P. Color balance and fusion for underwater image enhancement. IEEE Trans. Image Process. 2017, 27, 379–393.
- Luo, M.; Fang, Y.; Ge, Y. An effective underwater image enhancement method based on CLAHE-HF. J. Phys. Conf. Ser. 2019, 1237, 032009.
- Galdran, A.; Pardo, D.; Picón, A.; Alvarez-Gila, A. Automatic red-channel underwater image restoration. J. Vis. Commun. Image Represent. 2015, 26, 132–145.
- Park, E.; Sim, J.Y. Underwater image restoration using geodesic color distance and complete image formation model. IEEE Access 2020, 8, 157918–157930.
- Yu, H.; Li, X.; Lou, Q.; Lei, C.; Liu, Z. Underwater image enhancement based on DCP and depth transmission map. Multimed. Tools Appl. 2020, 79, 20373–20390.
- Fabbri, C.; Islam, M.J.; Sattar, J. Enhancing underwater imagery using generative adversarial networks. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 7159–7165.
- Wang, K.; Hu, Y.; Chen, J.; Wu, X.; Zhao, X.; Li, Y. Underwater image restoration based on a parallel convolutional neural network. Remote Sens. 2019, 11, 1591.
- Islam, M.J.; Xia, Y.; Sattar, J. Fast underwater image enhancement for improved visual perception. IEEE Robot. Autom. Lett. 2020, 5, 3227–3234.
- Johannsson, H.; Kaess, M.; Englot, B.; Hover, F.; Leonard, J. Imaging sonar-aided navigation for autonomous underwater harbor surveillance. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 4396–4403.
- Chen, J.; Gong, Z.; Li, H.; Xie, S. A detection method based on sonar image for underwater pipeline tracker. In Proceedings of the 2011 Second International Conference on Mechanic Automation and Control Engineering, Inner Mongolia, China, 15–17 July 2011; pp. 3766–3769.
- Wang, X.; Li, Q.; Yin, J.; Han, X.; Hao, W. An adaptive denoising and detection approach for underwater sonar image. Remote Sens. 2019, 11, 396.
- Kim, J.; Song, S.; Yu, S.C. Denoising auto-encoder based image enhancement for high resolution sonar image. In Proceedings of the 2017 IEEE Underwater Technology (UT), Busan, Korea, 21–24 February 2017; pp. 1–5.
- Sung, M.; Kim, J.; Yu, S.C. Image-based super resolution of underwater sonar images using generative adversarial network. In Proceedings of the TENCON 2018–2018 IEEE Region 10 Conference, Jeju, Korea, 28–31 October 2018; pp. 457–461.
- Sung, M.; Cho, H.; Kim, T.; Joe, H.; Yu, S.C. Crosstalk removal in forward scan sonar image using deep learning for object detection. IEEE Sens. J. 2019, 19, 9929–9944.
- Lagudi, A.; Bianco, G.; Muzzupappa, M.; Bruno, F. An alignment method for the integration of underwater 3D data captured by a stereovision system and an acoustic camera. Sensors 2016, 16, 536.
- Babaee, M.; Negahdaripour, S. 3-D object modeling from 2-D occluding contour correspondences by opti-acoustic stereo imaging. Comput. Vis. Image Underst. 2015, 132, 56–74.
- Kim, S.M. Single image-based enhancement techniques for underwater optical imaging. J. Ocean Eng. Technol. 2020, 34, 442–453.
- Kim, H.G.; Seo, J.M.; Kim, S.M. Comparison of GAN deep learning methods for underwater optical image enhancement. J. Ocean Eng. Technol. 2022, 36, 32–40.
- Shin, Y.S.; Cho, Y.; Lee, Y.; Choi, H.T.; Kim, A. Comparative study of sonar image processing for underwater navigation. J. Ocean Eng. Technol. 2016, 30, 214–220.
- Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision; Cambridge University Press: Cambridge, UK, 2013.
- Panetta, K.; Gao, C.; Agaian, S. Human-visual-system-inspired underwater image quality measures. IEEE J. Ocean. Eng. 2015, 41, 541–551.
- Yang, M.; Sowmya, A. An underwater color image quality evaluation metric. IEEE Trans. Image Process. 2015, 24, 6062–6071.
- Bechara, B.; McMahan, C.A.; Moore, W.S.; Noujeim, M.; Geha, H.; Teixeira, F.B. Contrast-to-noise ratio difference in small field of view cone beam computed tomography machines. J. Oral Sci. 2012, 54, 227–232.
- Zhang, X.; Yang, P. An improved imaging algorithm for multi receiver SAS system with wide-bandwidth signal. Remote Sens. 2021, 13, 5008.
- Bülow, H.; Birk, A. Synthetic aperture sonar (SAS) without navigation: Scan registration as basis for near field synthetic imaging in 2D. Sensors 2020, 20, 4440.
| Device | Specification | Value |
|---|---|---|
| Eagle IPZ/4000 (camera) | Field of view | 3.3–45° |
| | Spatial resolution | 1920 × 1080 |
| Blueview M900-2250 (multi-beam sonar) | Dual frequencies | 900 kHz / 2250 kHz |
| | Maximum range | 100 m (900 kHz) / 10 m (2250 kHz) |
| | Field of view | 130° (H) × 20° (V) |
| LED SeaLite (light) | Output | 10,000 lumens |
| | Efficacy | 63 lm/W |
| Depth (m) | Distance between System and Phantom (m) | Rotation (°) |
|---|---|---|
| 2 | 4.5 | 0, −15, −30, −45, 15, 30, 45 |
| 2.2 | 5, 5.5, 6 | |
| Distance | Camera 1 | Camera 2 |
|---|---|---|
| 5 m | 67.7% | 70.4% |
| 6 m | 77.1% | 84.1% |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Citation: Kim, H.-G.; Seo, J.; Kim, S.M. Underwater Optical-Sonar Image Fusion Systems. Sensors 2022, 22, 8445. https://doi.org/10.3390/s22218445