Exploiting High Geopositioning Accuracy of SAR Data to Obtain Accurate Geometric Orientation of Optical Satellite Images
Figure 1. A typical optical satellite image (a) and its corresponding SAR image (b) of the same area.
Figure 2. The calculation process of gradient magnitude and gradient direction.
Figure 3. The generation of a feature orientation index table (FOI) for an image with nine pixels. (a) Nine feature orientations with index numbers 0–8, which divide the range [0°, 180°) into eight equal subranges. (b) The determination of the lower bound (red arrow) and upper bound (black arrow) of a pixel's gradient direction (green arrow). (c) The resulting FOI.
Figure 4. The statistical process of a feature vector with nine elements. WGML and WGMU refer to the two weighted gradient magnitude images.
Figure 5. The processing pipeline of the AWOG descriptor.
Figure 6. Comparative matching experiments of PC and NCC on a pair of optical satellite and SAR images. (a) Optical satellite image with a template window of 61 × 61 pixels. (b) SAR image with a search area of 121 × 121 pixels. (c,d) The AWOG descriptors generated from the optical satellite image and the SAR image, respectively. (e,g) 3D visualizations of the similarity values for PC and NCC with AWOG descriptors, respectively. (f,h) The corresponding 2D similarity maps of (e,g).
Figure 7. Image reshaping.
Figure 8. Flowchart of the proposed geometric orientation framework.
Figure 9. The eight experimental image pairs used in Section 5.1.1. (a–h) correspond to image pairs 1–8.
Figure 10. The correct matching ratio (CMR) results of all matching methods under different template sizes. (a–h) correspond to the CMR results of image pairs 1–8.
Figure 11. The RMSE results of all methods with a template window of 91 × 91 pixels.
Figure 12. The average running times of all methods under different template window sizes.
Figure 13. The matching results of the proposed method on all experimental image pairs. (a–h) correspond to the results of image pairs 1–8.
Figure 14. The study area (framed with the yellow rectangle) and the areas covered by SAR reference images (marked by the red polygons).
Figure 15. The overlap of optical satellite images with the SAR reference images used. (a) The distribution of SAR reference images. (b–d) The overlaps of GF-1, GF-2, and ZY-3 with the SAR reference images, respectively. The number at the end of an image name identifies a specific image in the corresponding collection.
Figure 16. The matches obtained from all SAR reference images for the optical satellite image GF-1-3. (a–g) The locations of the matches on each SAR reference image. (h) The locations of all obtained matches on GF-1-3, where the color of a point indicates the SAR image to which it is matched.
Figure 17. Registration checkerboard overlays of optical and SAR images with image tiles of 300 × 300 m before and after the orientation process. (a,c,e) The optical satellite images before, and (b,d,f) after, geometric orientation with the proposed framework.
Figure 18. The running time for extracting AWOG descriptors with and without the FOI on all experimental image pairs used in Section 5.1.1.
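The captions above outline AWOG's first stages: per-pixel gradient magnitude and direction, a feature orientation index table (FOI) that records which of nine orientations (eight equal subranges of [0°, 180°)) bound each pixel's gradient direction, and two weighted gradient magnitude images (WGML, WGMU) that share each magnitude between the bounding orientations. A minimal NumPy sketch of these steps, using central differences as a simple stand-in for the paper's gradient operator (function names are illustrative):

```python
import numpy as np

def gradients(img):
    # Central-difference gradients as a simple stand-in for the exact
    # operator used in the paper.
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    # Fold gradient direction into [0, 180) degrees so that opposite
    # directions (a common SAR/optical gradient reversal) coincide.
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0
    return mag, ang

def foi_tables(ang, n_sub=8):
    # Nine feature orientations (indices 0..8) split [0, 180) degrees
    # into eight equal 22.5-degree subranges. Each pixel's direction lies
    # between a lower-bound and an upper-bound orientation; the FOI
    # stores the lower-bound index.
    step = 180.0 / n_sub
    lower = np.floor(ang / step).astype(int)   # FOI entry (lower bound)
    upper = lower + 1                          # upper-bound orientation
    # Linearly share each pixel's magnitude between the two bounding
    # orientations; multiplying these weights by the magnitude yields
    # the two weighted gradient magnitude images (WGML, WGMU).
    w_upper = ang / step - lower
    w_lower = 1.0 - w_upper
    return lower, upper, w_lower, w_upper

# Example: a vertical intensity ramp has direction 90 degrees everywhere,
# which falls exactly on orientation index 4.
img = np.outer(np.arange(5.0), np.ones(5))
mag, ang = gradients(img)
lower, upper, w_lower, w_upper = foi_tables(ang)
wgml, wgmu = mag * w_lower, mag * w_upper
```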
Abstract
1. Introduction
- We propose AWOG, a robust feature descriptor for matching SAR and optical satellite images, and introduce a PC-based matching strategy that obtains stable and reliable matching point pairs with high efficiency.
- We put forward a framework for the accurate geometric orientation of optical images using VCPs provided by SAR images.
- Extensive experiments on SAR and optical satellite image datasets verify the superiority of AWOG over other state-of-the-art multimodal image matching descriptors. Compared with CFOG, the correct matching ratio is improved by about 17%, and the RMSE of location errors is reduced by about 0.1 pixels. Additionally, the efficiency of the proposed method is comparable to CFOG and about 80% higher than that of other state-of-the-art methods such as HOPC, DLSS, and MI.
- We further demonstrate the effectiveness of the proposed method for the geometric orientation of optical satellite images using multiple SAR and optical satellite image pairs. Notably, the geopositioning accuracy of the optical satellite images improves from more than 200 m to around 8 m.
2. Related Work
3. Accurate and Robust Matching of SAR Images and Optical Satellite Images
3.1. AWOG Descriptor
3.2. Phase Correlation Matching
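The phase-correlation similarity used by the matching strategy can be sketched as follows. This minimal version operates on a single pair of equal-size 2D patches, whereas the paper applies PC to the multi-channel AWOG descriptors; the function name and the epsilon guard are illustrative choices:

```python
import numpy as np

def phase_correlation(a, b):
    # Normalized cross-power spectrum between two equal-size patches;
    # the correlation peak gives the translation of b relative to a.
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = np.conj(fa) * fb
    cross /= np.abs(cross) + 1e-12          # keep phase information only
    corr = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak coordinates into signed shifts
    shift = tuple(p - n if p > n // 2 else p
                  for p, n in zip(peak, corr.shape))
    return shift, corr

# A patch circularly shifted by (3, 5) pixels is recovered exactly:
rng = np.random.default_rng(0)
a = rng.random((32, 32))
b = np.roll(a, (3, 5), axis=(0, 1))
```

Because the similarity is evaluated in the frequency domain, the whole search area is tested in one FFT pass instead of sliding a window, which is the main source of PC's speed advantage over spatial correlation.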
4. A General Orientation Framework of Optical Satellite Images Using SAR Images
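A typical step in such a framework is to refine the optical image's rational polynomial coefficients (RPCs) using the SAR-derived VCPs. Below is a hedged sketch of one common refinement choice, an affine bias compensation in image space fitted by least squares; the function names are illustrative and the paper's exact compensation model may differ:

```python
import numpy as np

def fit_affine_bias(rpc_xy, vcp_xy):
    # Least-squares affine bias compensation in image space: find the
    # transform mapping the RPC-projected image coordinates of the VCPs
    # onto their matched (SAR-derived) image locations.
    a = np.hstack([rpc_xy, np.ones((len(rpc_xy), 1))])   # rows [x, y, 1]
    coef, *_ = np.linalg.lstsq(a, vcp_xy, rcond=None)
    return coef                                          # shape (3, 2)

def apply_affine(coef, xy):
    # Apply the fitted compensation to any image coordinates.
    return np.hstack([xy, np.ones((len(xy), 1))]) @ coef
```

With enough well-distributed VCPs, the fitted transform absorbs the systematic geolocation bias of the optical image while leaving the RPC model itself untouched.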
5. Experiments and Analysis
5.1. Matching Performance Investigation
5.1.1. Experimental Datasets
5.1.2. Evaluation Criteria
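The two criteria reported throughout the experiments, the correct matching ratio (CMR) and the RMSE of location errors, can be sketched as below, assuming a match counts as correct when its residual against the ground-truth location falls below a pixel threshold; the 1.5-pixel threshold here is an illustrative choice, not necessarily the paper's:

```python
import numpy as np

def cmr_and_rmse(matched, truth, threshold=1.5):
    # matched, truth: (N, 2) arrays of matched and ground-truth
    # point locations in pixels.
    residuals = np.linalg.norm(np.asarray(matched, float)
                               - np.asarray(truth, float), axis=1)
    correct = residuals < threshold
    cmr = 100.0 * correct.mean()            # correct matching ratio, %
    # RMSE of the location errors of the correct matches
    rmse = (float(np.sqrt(np.mean(residuals[correct] ** 2)))
            if correct.any() else float("nan"))
    return cmr, rmse
```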
5.1.3. Comparison with the State-of-the-Art Multimodal Image Matching Methods
5.2. Geometric Orientation Accuracy Analysis
5.2.1. Study Area and Datasets
5.2.2. Geometric Orientation Performance
6. Discussion
6.1. The Advantage of Using FOI in Generating AWOG Descriptor
6.2. Parameter Study
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Category | Image Pair | Sensor | GSD | Date | Size (Pixels) |
|---|---|---|---|---|---|
| Optical-to-SAR | 1 | TM band 3 | 30 m | May 2007 | 600 × 600 |
| | | TerraSAR-X | 30 m | March 2008 | 600 × 600 |
| | 2 | Google Earth | 3 m | November 2007 | 528 × 524 |
| | | TerraSAR-X | 3 m | December 2007 | 534 × 524 |
| | 3 | Google Earth | 3 m | March 2009 | 628 × 618 |
| | | TerraSAR-X | 3 m | January 2008 | 628 × 618 |
| | 4 | Google Earth | 10 m | April 2019 | 777 × 737 |
| | | TH-2 | 10 m | August 2019 | 777 × 737 |
| | 5 | Google Earth | 10 m | June 2019 | 1001 × 1001 |
| | | TH-2 | 10 m | December 2019 | 1000 × 1000 |
| | 6 | Google Earth | 10 m | August 2017 | 1001 × 1001 |
| | | GF-3 | 10 m | February 2017 | 1000 × 1000 |
| LiDAR-to-Optical | 7 | LiDAR intensity | 2 m | October 2010 | 621 × 617 |
| | | WorldView-2 | 2 m | October 2011 | 621 × 621 |
| | 8 | LiDAR depth | 2.5 m | June 2012 | 524 × 524 |
| | | Airborne optical | 2.5 m | June 2012 | 524 × 524 |
| Image Pair | Criteria | AWOG-PC | CFOG-PC | AWOG-SSD | CFOG-SSD | HOPC-SSD | DLSS-SSD | MI |
|---|---|---|---|---|---|---|---|---|
| 1 | RMSE (pixels) | 0.488 | 0.515 | 0.539 | 0.566 | 0.647 | 0.669 | 0.816 |
| 2 | RMSE (pixels) | 0.610 | 0.676 | 0.690 | 0.734 | 0.741 | 0.846 | 0.856 |
| 3 | RMSE (pixels) | 0.736 | 0.864 | 0.832 | 0.902 | 0.906 | 0.919 | 0.948 |
| 4 | RMSE (pixels) | 0.663 | 0.745 | 0.765 | 0.795 | 0.841 | 0.867 | 1.091 |
| 5 | RMSE (pixels) | 0.483 | 0.565 | 0.583 | 0.672 | 0.713 | 0.690 | 0.991 |
| 6 | RMSE (pixels) | 0.634 | 0.692 | 0.671 | 0.744 | 0.807 | 0.963 | 0.844 |
| 7 | RMSE (pixels) | 0.614 | 0.655 | 0.688 | 0.666 | 0.724 | 0.746 | 0.996 |
| 8 | RMSE (pixels) | 0.689 | 0.738 | 0.774 | 0.825 | 0.698 | 0.817 | 1.142 |
| Template Size (Pixels) | Criteria | AWOG-PC | CFOG-PC | AWOG-SSD | CFOG-SSD | HOPC-SSD | DLSS-SSD | MI |
|---|---|---|---|---|---|---|---|---|
| 25 × 25 | t (s) | 1.279 | 0.510 | 3.019 | 1.428 | 2.446 | 2.601 | 10.011 |
| 31 × 31 | t (s) | 1.369 | 0.605 | 4.125 | 2.459 | 2.808 | 2.727 | 11.051 |
| 37 × 37 | t (s) | 1.397 | 0.610 | 4.830 | 3.061 | 2.966 | 2.854 | 12.124 |
| 43 × 43 | t (s) | 1.410 | 0.616 | 6.040 | 4.003 | 3.252 | 3.016 | 14.056 |
| 49 × 49 | t (s) | 1.425 | 0.622 | 7.065 | 4.615 | 4.105 | 3.751 | 15.582 |
| 55 × 55 | t (s) | 1.532 | 0.752 | 7.896 | 5.150 | 4.852 | 4.210 | 16.992 |
| 61 × 61 | t (s) | 1.548 | 0.766 | 9.042 | 5.908 | 6.348 | 4.601 | 17.726 |
| 67 × 67 | t (s) | 1.574 | 0.784 | 10.343 | 6.732 | 6.792 | 5.583 | 18.885 |
| 73 × 73 | t (s) | 1.591 | 0.802 | 11.612 | 7.453 | 7.193 | 5.985 | 20.421 |
| 79 × 79 | t (s) | 1.606 | 0.839 | 12.955 | 8.222 | 8.020 | 6.396 | 22.324 |
| 85 × 85 | t (s) | 1.640 | 0.860 | 14.238 | 8.988 | 9.684 | 6.969 | 24.637 |
| 91 × 91 | t (s) | 1.652 | 0.888 | 15.866 | 10.307 | 10.546 | 7.788 | 27.753 |
| Category | Sensor | GSD | Date | Amount | Average Size (Pixels) |
|---|---|---|---|---|---|
| SAR reference images | TerraSAR-X | 5 m | September 2017 | 12 | 7737 × 9235 |
| Optical satellite images | GF-1 | 2 m | September 2018 | 4 | 40,124 × 39,872 |
| | GF-2 | 1 m | November 2016 | 4 | 29,428 × 28,000 |
| | ZY-3 | 2 m | December 2017 | 3 | 30,422 × 30,016 |
| Image Name | Nr | No | RMSE (Pixels) | t (s) |
|---|---|---|---|---|
| GF-1-1 | 6 | 6883 | 1.05 | 87.89 |
| GF-1-2 | 5 | 4794 | 1.18 | 73.79 |
| GF-1-3 | 7 | 9046 | 0.86 | 98.18 |
| GF-1-4 | 7 | 5799 | 0.93 | 93.07 |
| GF-2-1 | 2 | 897 | 1.67 | 33.15 |
| GF-2-2 | 3 | 923 | 1.61 | 43.98 |
| GF-2-3 | 3 | 1581 | 1.45 | 42.88 |
| GF-2-4 | 3 | 1469 | 1.68 | 43.47 |
| ZY-3-1 | 3 | 2310 | 1.06 | 45.91 |
| ZY-3-2 | 5 | 5216 | 0.81 | 67.54 |
| ZY-3-3 | 6 | 5269 | 1.10 | 78.31 |
With the other parameter fixed (= 3; the parameter symbols were lost in extraction):

| Criteria | = 3 | = 5 | = 8 | = 9 | = 10 |
|---|---|---|---|---|---|
| CMR/% | 94.5 | 96.2 | 96.5 | 96.0 | 96.0 |
| RMSE/pixels | 0.691 | 0.619 | 0.606 | 0.604 | 0.601 |
| t/s | 0.941 | 0.978 | 1.091 | 1.144 | 1.189 |

With the other parameter fixed (= 8):

| Criteria | = 3 | = 5 | = 7 | = 9 | = 11 |
|---|---|---|---|---|---|
| CMR/% | 96.5 | 96.0 | 92.5 | 84.7 | 76.2 |
| RMSE/pixels | 0.606 | 0.635 | 0.697 | 0.718 | 0.726 |
| t/s | 1.091 | 1.189 | 1.379 | 1.626 | 1.955 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Fan, Z.; Zhang, L.; Liu, Y.; Wang, Q.; Zlatanova, S. Exploiting High Geopositioning Accuracy of SAR Data to Obtain Accurate Geometric Orientation of Optical Satellite Images. Remote Sens. 2021, 13, 3535. https://doi.org/10.3390/rs13173535