High-Resolution Remote Sensing Image Change Detection Method Based on Improved Siamese U-Net
Figure 1. Siam-FAUnet structure diagram.
Figure 2. Schematic diagram of the atrous spatial pyramid pooling (ASPP).
Figure 3. Schematic diagram of the flow alignment module (FAM): (a) overall structure of the FAM; (b) schematic diagram of the semantic flow alignment process.
Figure 4. Residual correction module structure diagram.
Figure 5. Example of CDD.
Figure 6. Example of the SZTAKI dataset.
Figure 7. The influence of λ on evaluation criteria.
Figures 8–10. Examples of change detection visualization results of different models: (a,b) original bi-temporal images; (c) ground truth; (d) CDNet; (e) FC-Siam-conc; (f) FC-Siam-diff; (g) IFN; (h) DASNet; (i) STANet; (j) SNUNet; (k) our Siam-FAUnet.
Abstract
1. Introduction
- Compared with images used in everyday settings, remote sensing images exhibit internal complexity and distinct characteristics in how features are expressed. They are large-scale observations of the Earth, with original sizes significantly larger than those of regular images. As image resolution increases, finer details of ground features can be resolved, increasing the information content of the images. Nevertheless, most research scenes are limited to specific scenarios with relatively low information content, which complicates change detection in high-resolution remote sensing images. Furthermore, the multi-source heterogeneous data born of the continuous development of remote sensing technology differ in imaging mode, so traditional change detection techniques cannot accurately identify change areas, and current methods that mainly target change detection between particular data sources are not universal across multimodal application scenarios.
- A low degree of automation is a major challenge in traditional change detection methods, which require considerable manual intervention; this introduces substantial uncertainty and makes the results strongly dependent on human factors. Automatically extracting features, simplifying the workflow, and improving the level of automation remain open difficulties. Moreover, because the datasets used in research are diverse, change detection methods with better universality and generalization need further investigation.
- Blurred boundaries and missed detection of small targets are a further challenge. The information in high-resolution remote sensing images is complex, and shallow information is lost when a change detection network performs feature extraction, resulting in blurred detection boundaries, missed small targets, and similar errors; further research on multi-scale information extraction is needed to improve detection performance. In addition, remote sensing imaging is often disturbed by external factors such as lighting, sensors, and human activity, which can introduce pseudo-changes. Since change detection compares two remote sensing images acquired at different times, such interference during imaging leads to inaccurate research results. Reducing the impact of pseudo-changes has therefore become one of the primary challenges in remote sensing image change detection, and it is critical to devise strategies that minimize their interference and exclude them from the results.
- This paper proposes a high-resolution image change detection model, called Siam-FAUnet, based on the Siamese structure. The Siamese network was first proposed by Bromley et al. [59] in 1994; here it is used to extract change features and alleviate the influence of sample imbalance. Moreover, feeding the two temporally separated remote sensing images through two input branches reduces the image distortion caused by channel overlay.
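The core idea of the Siamese structure is that both temporal images pass through encoders with shared weights, so a simple difference of the resulting features isolates genuine changes. A minimal NumPy sketch, where the toy linear `encoder` is a purely illustrative stand-in for the improved VGG16 branches:

```python
import numpy as np

def encoder(x, w):
    # Toy shared-weight "encoder": one linear projection plus ReLU.
    # In the real Siam-FAUnet each branch is an improved VGG16; the single
    # weight matrix w standing in for it here is a deliberate simplification.
    return np.maximum(x @ w, 0.0)

w = np.full((8, 4), 0.1)                           # ONE weight set, shared by both branches
t1 = np.linspace(-1.0, 1.0, 128).reshape(16, 8)    # "features" of the image at time 1
t2 = t1.copy()
t2[5] += 3.0                                       # simulate a change at location 5

f1, f2 = encoder(t1, w), encoder(t2, w)            # identical mapping on both inputs
diff = np.abs(f1 - f2).sum(axis=1)                 # per-location change magnitude

print(diff.argmax())  # 5: only the changed location produces a nonzero response
```

Because the two branches apply exactly the same mapping, unchanged locations cancel exactly in the difference, which is what makes the Siamese design robust to class imbalance between changed and unchanged pixels.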
- The proposed model adopts an encoder–decoder architecture, in which an enhanced VGG16 performs feature extraction in the encoder. To capture contextual features across multiple scales, the atrous spatial pyramid pooling (ASPP) module [60] is employed: atrous convolutions with varying dilation rates sample the input features, and their results are concatenated to enlarge the network's receptive field, enabling finer classification of remote sensing images. In the decoder, the flow alignment module (FAM) [61] fuses the features extracted by the encoder. FAM learns the semantic flow between features of different resolutions and efficiently transfers semantic information from coarse features to refined high-resolution features, mitigating the semantic misalignment caused by directly concatenating features during fusion.
2. Materials and Methods
2.1. Structure of Change Detection Network Based on Siam-FAUnet
2.2. Prediction Module
- Atrous spatial pyramid pooling (ASPP) is used to address the large variation in object scale in remote sensing images and the randomness of change locations. The traditional pooling operation samples at a fixed scale, which prevents the network from fully utilizing the global information of the image and causes significant variation in the segmentation of objects of different scales. The ASPP module effectively addresses these issues; its structure is shown in Figure 2. ASPP applies atrous convolutions with various dilation rates to sample the input features and then concatenates the results, after which a 1 × 1 convolutional layer reduces the number of channels of the feature map to the desired amount. Incorporating the ASPP module enlarges the receptive field of the network, allowing it to extract features from a wider area while preserving the details of the original image. As a result, the network can classify remote sensing images more thoroughly, improving change detection performance.
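The multi-rate sampling can be sketched in NumPy. This is an illustrative single-channel re-implementation (the dilation rates and the averaging kernel are assumptions for the demo; the paper's ASPP uses learned kernels and a 1 × 1 convolution, which the branch mean stands in for):

```python
import numpy as np

def dilated_conv3x3(x, k, rate):
    """3x3 atrous convolution with the given dilation rate (zero padding).

    x: (H, W) single-channel map; k: (3, 3) kernel. Output keeps the input
    size; the effective receptive field is (2*rate + 1) x (2*rate + 1).
    """
    H, W = x.shape
    xp = np.pad(x, rate)
    out = np.zeros_like(x, dtype=float)
    for i in range(3):
        for j in range(3):
            # tap offsets (-rate, 0, +rate) around each output pixel
            out += k[i, j] * xp[i * rate : i * rate + H, j * rate : j * rate + W]
    return out

def aspp(x, rates=(1, 6, 12, 18)):
    """Minimal ASPP: parallel atrous branches, concatenation, then a channel
    reduction (a plain mean here, standing in for the 1x1 convolution)."""
    k = np.full((3, 3), 1 / 9)                       # averaging kernel per branch
    branches = [dilated_conv3x3(x, k, r) for r in rates]
    stacked = np.stack(branches)                     # "concatenate" along channels
    return stacked.mean(axis=0)                      # 1x1-conv stand-in

x = np.zeros((32, 32)); x[16, 16] = 1.0              # single active pixel
y = aspp(x)
print(y.shape)  # (32, 32): multi-scale context fused at full resolution
```

Running this, the single active pixel spreads to offsets ±1, ±6, ±12, and ±18 in the output, which is exactly how the parallel dilation rates let one module see both fine detail and wide context without down-sampling.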
- The structure of the flow alignment module (FAM) is depicted in Figure 3, where Figure 3a illustrates the overall structure of FAM and Figure 3b shows the semantic flow alignment process. FAM takes the feature maps Fn and Fn−1 from two adjacent layers, where the resolution of Fn−1 is higher than that of Fn. FAM first raises the resolution of Fn to that of Fn−1 through bilinear interpolation and then concatenates the two feature maps. The concatenated feature maps are processed by a sub-network comprising two 3 × 3 convolutional layers, which generates a semantic flow field from Fn and Fn−1; this flow field guides the up-sampling of the low-resolution features. Each pixel point Pn−1 on the spatial grid is mapped to a point Pn on level n by simply adding its flow offset, and a differentiable bilinear sampling mechanism then interpolates the values of the four pixels neighboring Pn (upper left, upper right, lower left, and lower right) to estimate the final output of the FAM module, which is expressed in Formula (1).
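The flow-guided warping at the heart of FAM can be sketched in NumPy. This is an illustrative re-implementation of the sampling step only: the flow field, which the paper's module learns with convolutions, is supplied directly here, and a zero flow reduces the module to plain bilinear up-sampling:

```python
import numpy as np

def bilinear_sample(f, ys, xs):
    """Interpolate f at real-valued coordinates (ys, xs) from the four
    surrounding pixels (upper-left/right, lower-left/right)."""
    H, W = f.shape
    y0 = np.clip(np.floor(ys).astype(int), 0, H - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, W - 2)
    dy, dx = ys - y0, xs - x0
    return (f[y0,     x0]     * (1 - dy) * (1 - dx) +
            f[y0,     x0 + 1] * (1 - dy) * dx +
            f[y0 + 1, x0]     * dy       * (1 - dx) +
            f[y0 + 1, x0 + 1] * dy       * dx)

def fam_warp(f_low, flow):
    """Warp a low-resolution feature map up to the flow field's resolution:
    each output pixel reads from its flow-offset source position."""
    H, W = flow.shape[:2]
    gy, gx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    # scale the high-res grid down to f_low's coordinates, add flow offsets
    sy = gy * (f_low.shape[0] - 1) / (H - 1) + flow[..., 0]
    sx = gx * (f_low.shape[1] - 1) / (W - 1) + flow[..., 1]
    return bilinear_sample(f_low, np.clip(sy, 0, f_low.shape[0] - 1),
                                  np.clip(sx, 0, f_low.shape[1] - 1))

f_low = np.arange(16.0).reshape(4, 4)   # coarse feature map Fn
flow = np.zeros((8, 8, 2))              # zero flow = plain bilinear up-sampling
up = fam_warp(f_low, flow)
print(up.shape, up[0, 0], up[-1, -1])   # (8, 8) 0.0 15.0
```

With a nonzero learned flow, each high-resolution pixel can pull semantics from a shifted coarse location, which is how FAM corrects the misalignment that naive up-sampling plus concatenation would leave in place.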
2.3. Residual Correction Module
2.4. Loss Function
2.5. Experimental Setup
2.5.1. Datasets
2.5.2. Experimental Environment and Evaluation Criteria
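The tables in this section report OA, precision (P), recall (R), and F1-score. These follow the standard confusion-matrix definitions for binary change maps; a minimal sketch (the paper's exact formulas are assumed to match these standard ones):

```python
import numpy as np

def change_metrics(pred, gt):
    """OA, precision, recall, and F1 for a binary change map.

    pred, gt: boolean arrays where True marks a 'changed' pixel.
    """
    tp = np.sum(pred & gt)      # changed pixels correctly detected
    tn = np.sum(~pred & ~gt)    # unchanged pixels correctly rejected
    fp = np.sum(pred & ~gt)     # false alarms (pseudo-changes, noise)
    fn = np.sum(~pred & gt)     # missed changes (e.g., small targets)
    oa = (tp + tn) / (tp + tn + fp + fn)
    p = tp / (tp + fp)
    r = tp / (tp + fn)
    f1 = 2 * p * r / (p + r)
    return oa, p, r, f1

gt = np.zeros((4, 4), bool); gt[:2, :2] = True           # 4 changed pixels
pred = gt.copy(); pred[0, 0] = False; pred[3, 3] = True  # 1 miss, 1 false alarm
oa, p, r, f1 = change_metrics(pred, gt)
print(round(oa, 3), round(p, 2), round(r, 2))  # 0.875 0.75 0.75
```

Because changed pixels are rare, OA alone can look high even for a poor detector; this is why the tables also report P, R, and their harmonic mean F1.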
2.5.3. Comparison Methods
- CDNet [69] is a fully convolutional neural network based on the idea of stacking, shrinking, and expanding.
- FC-Siam-conc [70] is proposed based on the FCN model. A Siamese network performs dual-input feature extraction on the bi-temporal remote sensing images, and the extracted features are finally concatenated to obtain the change information.
- FC-Siam-diff, also proposed in [70], likewise uses the FCN as the benchmark network. The difference from the previous model is that its semantic restoration stage connects the feature maps by taking their difference, finally producing the desired change map.
- IFN [44] is an encoder–decoder structure, and unlike U-Net, this model employs dense skip connections and implements implicit deep supervision in the decoder structure.
- DASNet [71] is proposed based on a dual attention mechanism to locate the changed areas and obtain more discriminant feature representations.
- STANet [72] is proposed based on a Siamese FCN that extracts the bi-temporal image feature maps, with a change detection self-attention module that updates the feature maps by exploiting spatial–temporal dependencies among individual pixels at different positions and times.
- SNUNet [73] is proposed based on the combination of the Siamese network and NestedUNet, and uses dense skip connections between the encoder and decoder and among the decoder sub-networks.
3. Results
3.1. Loss Function
3.2. Ablation Experiments
3.3. Comparison Experiments
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Li, D.; Dong, Z.H.; Liu, X.A. The Present Status and its Enlightenment of Remote Sensing Satellite Application. Aerosp. China 2020, 501, 46–53. [Google Scholar]
- Xie, H.L.; Wen, J.M.; Chen, Q.R.; He, Y.F. Research Progress of the Application of Geo-information Science and Technology in Territorial Spatial Planning. J. Geo-Inf. Sci. 2022, 24, 202–219. [Google Scholar]
- Sui, H.G.; Zhao, B.F.; Xu, C.; Zhou, M.T.; Du, Z.T.; Liu, J.Y. Rapid Extraction of Flood Disaster Emergency Information with Multi-modal Sequence Remote Sensing Images. Geomat. Inf. Sci. Wuhan Univ. 2021, 46, 1441–1449. [Google Scholar]
- Peng, X.D. Research on Classification of GF-2 Remote Sensing Image Based on Improved Unet Network. Master’s Thesis, China University of Geosciences (Beijing), Beijing, China, 2020. [Google Scholar]
- Sui, H.G.; Feng, W.Q.; Li, W.Z. Review of Change Detection Methods for Multi-temporal Remote Sensing Imagery. Geomat. Inf. Sci. Wuhan Univ. 2018, 43, 1885–1898. [Google Scholar]
- Daudt, R.C.; Le Saux, B.; Boulch, A.; Gousseau, Y. Urban change detection for multispectral earth observation using convolutional neural networks. In Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 2115–2118. [Google Scholar]
- Ji, S.P.; Tian, S.Q.; Zhang, C. Urban Land Cover Classification and Change Detection Using Fully Atrous Convolutional Neural Network. Geomat. Inf. Sci. Wuhan Univ. 2020, 45, 233–241. [Google Scholar]
- Liu, S.C.; Zheng, Y.J.; Dalponte, M. A novel fire index-based burned area change detection approach using Landsat-8 OLI data. Eur. J. Remote Sens. 2020, 53, 104–112. [Google Scholar] [CrossRef] [Green Version]
- Chen, C.F.; Son, N.T.; Chang, N.B. Multi-decadal mangrove forest change detection and prediction in Honduras, Central America, with Landsat imagery and a Markov chain model. Remote Sens. 2013, 5, 6408–6426. [Google Scholar] [CrossRef] [Green Version]
- Wang, P.; Wang, L.; Leung, H.; Zhang, G. Super-Resolution Mapping Based on Spatial–Spectral Correlation for Spectral Imagery. IEEE Trans. Geosci. Remote Sens. 2021, 59, 2256–2268. [Google Scholar] [CrossRef]
- Li, Z.X.; Shi, W.Z.; Zhang, C.J.; Geng, J.; Huang, J.W.; Ye, Z.R. Subpixel Change Detection Based on Improved Abundance Values for Remote Sensing Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 10073–10086. [Google Scholar] [CrossRef]
- Hu, S.S.; Huang, Y.; Huang, C.X.; Li, D.C.; Wang, Q. Development Status and Future Prospect of Multi-source Remote Sensing Image Collaborative Application. Radio Eng. 2021, 51, 1425–1433. [Google Scholar]
- Xu, J.F.; Zhang, B.M.; Yu, D.X.; Lin, Y.H.; Guo, H.T. Aircraft target change detection for high-resolution remote sensing images using multi-feature fusion. Natl. Remote Sens. Bull. 2020, 24, 37–52. [Google Scholar] [CrossRef]
- Zhang, C.J.; An, R.; Ma, L. Building change detection in remote sensing image based on improved U-Net. Comput. Eng. Appl. 2021, 57, 239–246. [Google Scholar]
- Yuan, H.C. Dynamic Monitoring of Urban Planning Based on Remote Sensing Technology. Master’s Thesis, Chongqing Jiaotong University, Chongqing, China, 2014. [Google Scholar]
- Volpi, M.; Tuia, D.; Bovolo, F.; Kanevski, M.; Bruzzone, L. Supervised change detection in VHR images using contextual information and support vector machines. Int. J. Appl. Earth Obs. Geoinf. 2013, 20, 77–85. [Google Scholar] [CrossRef]
- Toure, S.; Stow, D.; Shih, H. An object-based temporal inversion approach to urban land use change analysis. Remote Sens. Lett. 2016, 7, 503–512. [Google Scholar] [CrossRef]
- Wang, Y.; Shu, N.; Gong, Y. A Study of Land Use Change Detection Based on High Resolution Remote Sensing Images. Remote Sens. Nat. Resour. 2012, 24, 43–47. [Google Scholar]
- Wang, C.Y.; Tian, X. Forest Cover Change Detection based on GF-1 PMS Data. Remote Sens. Technol. Appl. 2021, 36, 208–216. [Google Scholar]
- Huang, L.; Fang, Y.M.; Zuo, X.Q.; Yu, X.Q. Automatic change detection method of multitemporal remote sensing images based on 2D-otsu algorithm improved by firefly algorithm. J. Sens. 2015, 2015, 327123. [Google Scholar] [CrossRef] [Green Version]
- Jin, S.M.; Yang, L.M.; Danielson, P. A comprehensive change detection method for updating the National Land Cover Database to circa 2011. Remote Sens. Environ. 2013, 132, 159–175. [Google Scholar] [CrossRef] [Green Version]
- Mei, Y.P.; Zhang, D.C.; Fu, R. Research on SAR image change detection method based on morphology and multi-scale spatial clustering. J. Optoelectron. Laser 2021, 32, 1140–1146. [Google Scholar]
- Zhang, Q.Y.; Li, Z.; Peng, D.L. Land use change detection based on object-oriented change vector analysis (OCVA). J. China Agric. Univ. 2019, 24, 166–174. [Google Scholar]
- Zhang, X.L.; Zhang, X.W.; Li, F.; Yang, T. Change Detection Method for High Resolution Remote Sensing Images Using Deep Learning. Acta Geod. Cartogr. Sin. 2017, 46, 999–1008. [Google Scholar]
- Huang, W.; Huang, J.L.; Wang, L.H.; Hu, Y.X.; Han, P.P. Remote sensing image change detection based on change vector analysis of PCA component. Remote Sens. Land Resour. 2016, 28, 22–27. [Google Scholar]
- Zhao, Q.J. Remote sensing image change detection algorithm based on improved principal component analysis. Geomat. Spat. Inf. Technol. 2019, 42, 111–113+117. [Google Scholar]
- Byrne, G.F.; Crapper, P.F.; Mayo, K.K. Monitoring land-cover change by principal component analysis of multitemporal Landsat data. Remote Sens. Environ. 1980, 10, 175–184. [Google Scholar] [CrossRef]
- Huang, C.X.; Yin, J.J.; Yang, J. Polarimetric SAR change detection with l1-norm principal component analysis. Syst. Eng. Electron. 2019, 41, 2214–2220. [Google Scholar]
- Chu, Y. Remote Sensing Image Change Detection Based on Deep Neural Network. Master’s Thesis, Nanjing University of Science & Technology, Nanjing, China, 2017. [Google Scholar]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet Classification with Deep Convolutional Neural Networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105. [Google Scholar] [CrossRef] [Green Version]
- Zhang, L.P.; Zhang, L.F.; Du, B. Deep learning for remote sensing data: A technical tutorial on the state of the art. IEEE Geosci. Remote Sens. Mag. 2016, 4, 22–40. [Google Scholar] [CrossRef]
- Zhao, J.C. Unsupervised Change Detection Technology in High-Resolution Multispectral Remote Sensing Images Based on Superpixel and Siamese Convention Neural Network. Master’s Thesis, Zhejiang University, Hangzhou, China, 2017. [Google Scholar]
- Ren, Q.R.; Yang, W.Z.; Wang, C.J.; Wei, W.Y.; Qian, Y.Y. Review of remote sensing image change detection. J. Comput. Appl. 2021, 41, 2294–2305. [Google Scholar]
- Yuan, L. Object Change Detection Based on Deep Learning and Vectorization. Master’s Thesis, University of Electronic Science and Technology of China, Chengdu, China, 2021. [Google Scholar]
- Shi, W.Z.; Zhang, M.; Zhang, R.; Chen, S.X.; Zhan, Z. Change detection based on artificial intelligence: State-of-the-art and challenges. Remote Sens. 2020, 12, 1688. [Google Scholar] [CrossRef]
- Chang, Z.L.; Yang, X.G.; Lu, R.T. High-resolution remote sensing image change detection based on improved DeepLabv3+. Laser Optoelectron. Prog. 2022, 59, 493–504. [Google Scholar]
- Zhang, C.; Wei, S.Q.; Ji, S.P. Detecting large-scale urban land cover changes from very high resolution remote sensing images using CNN-based classification. ISPRS Int. J. Geo-Inf. 2019, 8, 189. [Google Scholar] [CrossRef] [Green Version]
- Tian, Q.L.; Qin, K.; Chen, J.; Li, Y.; Chen, X.J. Building change detection for aerial images based on attention pyramid network. Acta Opt. Sin. 2020, 40, 47–56. [Google Scholar]
- Peng, D.F.; Zhang, Y.J.; Guan, H.Y. End-to-end change detection for high resolution satellite images using improved UNet++. Remote Sens. 2019, 11, 1382. [Google Scholar] [CrossRef] [Green Version]
- Papadomanolaki, M.; Verma, S.; Vakalopoulou, M.; Gupta, S.; Karantzalos, K. Detecting urban changes with recurrent neural networks from multitemporal Sentinel-2 data. In Proceedings of the 2019 IEEE IGARSS International Geoscience and Remote Sensing Symposium, Yokohama, Japan, 28 July–2 August 2019; IEEE: Tokyo, Japan, 2019; pp. 214–217. [Google Scholar]
- Wang, Q.; Zhang, X.D.; Chen, G.Z.; Dai, F.; Gong, Y.F.; Zhu, K. Change detection based on Faster R-CNN for high-resolution remote sensing images. Remote Sens. Lett. 2018, 9, 923–932. [Google Scholar] [CrossRef]
- Chopra, S.; Hadsell, R.; LeCun, Y. Learning a similarity metric discriminatively, with application to face verification. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), San Diego, CA, USA, 20–25 June 2005; IEEE: San Diego, CA, USA, 2005; pp. 539–546. [Google Scholar]
- Zhan, Y.; Fu, K.; Yan, M.L.; Sun, X. Change detection based on deep siamese convolutional network for optical aerial images. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1845–1849. [Google Scholar] [CrossRef]
- Zhang, C.X.; Yue, P.; Tapete, D. A deeply supervised image fusion network for change detection in high resolution bi-temporal remote sensing images. ISPRS J. Photogramm. Remote Sens. 2020, 166, 183–200. [Google Scholar] [CrossRef]
- Raza, A.; Huo, H.; Fang, T. EUNet-CD: Efficient UNet++ for change detection of very high-resolution remote sensing images. IEEE Geosci. Remote Sens. Lett. 2022, 19, 3510805. [Google Scholar] [CrossRef]
- Wang, D.C.; Chen, X.N.; Jiang, M.Y. ADS-Net: An attention-based deeply supervised network for remote sensing image change detection. Int. J. Appl. Earth Obs. Geoinf. 2021, 101, 102348. [Google Scholar]
- Jiang, H.W.; Hu, X.Y.; Li, K. PGA-SiamNet: Pyramid feature-based attention-guided siamese network for remote sensing orthoimagery building change detection. Remote Sens. 2020, 12, 484. [Google Scholar] [CrossRef] [Green Version]
- Shi, Q.; Liu, M.X.; Li, S.C.; Liu, X.P.; Wang, F.; Zhang, L.P. A deeply supervised attention metric-based network and an open aerial image dataset for remote sensing change detection. IEEE Trans. Geosci. Remote Sens. 2021, 60, 5604816. [Google Scholar] [CrossRef]
- Zhang, M.; Shi, W.Z. A feature difference convolutional neural network-based change detection method. IEEE Trans. Geosci. Remote Sens. 2020, 58, 7232–7246. [Google Scholar] [CrossRef]
- Bandara, W.G.C.; Patel, V.M. A transformer-based siamese network for change detection. arXiv 2022, arXiv:2201.01293. [Google Scholar]
- Fan, W.; Zhou, M.; Huang, R. Multiscale deep features fusion for change detection. J. Image Graph. 2020, 25, 0669–0678. [Google Scholar]
- Chen, H.R.X.; Wu, C.; Du, B.; Zhang, L.P.; Wang, L. Change detection in multisource VHR images via deep siamese convolutional multiple-layers recurrent neural network. IEEE Trans. Geosci. Remote Sens. 2020, 58, 2848–2864. [Google Scholar] [CrossRef]
- Fang, B.; Pan, L.; Kou, R. Dual learning-based siamese framework for change detection using bi-temporal VHR optical remote sensing images. Remote Sens. 2019, 11, 1292. [Google Scholar] [CrossRef] [Green Version]
- Wang, M.Y.; Tan, K.; Jia, X.P. A deep siamese network with hybrid convolutional feature extraction module for change detection based on multi-sensor remote sensing images. Remote Sens. 2020, 12, 205. [Google Scholar] [CrossRef] [Green Version]
- Xu, Q.F.; Chen, K.M.; Sun, X. Pseudo-siamese capsule network for aerial remote sensing images change detection. IEEE Geosci. Remote Sens. Lett. 2020, 19, 6000405. [Google Scholar] [CrossRef]
- Liu, T.; Li, Y.; Cao, Y.; Shen, Q. Change detection in multitemporal synthetic aperture radar images using dual-channel convolutional neural network. J. Appl. Remote Sens. 2017, 11, 042615. [Google Scholar] [CrossRef] [Green Version]
- Wiratama, W.; Lee, J.; Park, S.E. Dual-dense convolution network for change detection of high-resolution panchromatic imagery. Appl. Sci. 2018, 8, 1785. [Google Scholar] [CrossRef] [Green Version]
- Touati, R.; Mignotte, M.; Dahmane, M. Partly uncoupled siamese model for change detection from heterogeneous remote sensing imagery. J. Remote Sens. GIS 2020, 9, 272. [Google Scholar]
- Bromley, J.; Guyon, I.; LeCun, Y. Signature verification using a “siamese” time delay neural network. Adv. Neural Inf. Process. Syst. 1994, 6, 737–744. [Google Scholar]
- Yang, M.K.; Yu, K.; Zhang, C. DenseASPP for semantic segmentation in street scenes. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 3684–3692. [Google Scholar]
- Li, X.T.; You, A.S.; Zhu, Z. Semantic flow for fast and accurate scene parsing. In Proceedings of the European Conference on Computer Vision, Glasgow, Scotland, 23–28 August 2020; Springer: Cham, Switzerland, 2020; pp. 775–793. [Google Scholar]
- Boer, P.; Kroese, D.P.; Mannor, S. A Tutorial on the Cross-Entropy Method. Ann. Oper. Res. 2005, 134, 19–67. [Google Scholar] [CrossRef]
- Lebedev, M.A.; Vizilter, Y.V.; Vygolov, O.V. Change detection in remote sensing images using conditional adversarial networks. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 565–571. [Google Scholar] [CrossRef] [Green Version]
- Benedek, C.; Sziranyi, T. Change detection in optical aerial images by a multilayer conditional mixed Markov model. IEEE Trans. Geosci. Remote Sens. 2009, 47, 3416–3430. [Google Scholar] [CrossRef] [Green Version]
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
- Robbins, H.; Monro, S. A Stochastic Approximation Method. Ann. Math. Stat. 1951, 22, 400–407. [Google Scholar] [CrossRef]
- Ji, S.H.; Vishwanathan, S.V.N.; Satish, N. BlackOut: Speeding up Recurrent Neural Network Language Models with Very Large Vocabularies. Comput. Sci. 2016, 115, 2159–2168. [Google Scholar]
- Duchi, J.; Hazan, E.; Singer, Y. Adaptive Subgradient Methods for Online Learning and Stochastic Optimization. J. Mach. Learn. Res. 2011, 12, 2121–2159. [Google Scholar]
- Alcantarilla, P.F.; Stent, S.; Ros, G. Street-view change detection with deconvolutional networks. Auton. Robot 2018, 42, 1301–1322. [Google Scholar] [CrossRef]
- Daudt, R.C.; Le Saux, B.; Boulch, A. Fully convolutional siamese networks for change detection. In Proceedings of the 2018 25th IEEE International Conference on Image Processing (ICIP), Athens, Greece, 7–10 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 4063–4067. [Google Scholar]
- Chen, J.; Yuan, Z.Y.; Peng, J. DASNet: Dual attentive fully convolutional siamese networks for change detection in high-resolution satellite images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 14, 1194–1206. [Google Scholar] [CrossRef]
- Chen, H.; Shi, Z.W. A spatial-temporal attention-based method and a new dataset for remote sensing image change detection. Remote Sens. 2020, 12, 1662. [Google Scholar] [CrossRef]
- Fang, S.; Li, K.Y.; Shao, J.Y. SNUNet-CD: A densely connected siamese network for change detection of VHR images. IEEE Geosci. Remote Sens. Lett. 2021, 19, 8007805. [Google Scholar] [CrossRef]
Dataset | Pairs | Size (Pixel) | Spatial Resolution (m/Pixel) | Content |
---|---|---|---|---|
CDD | 16,000 | 256 × 256 | 0.03 m/pixel–1 m/pixel | Real remote sensing images that change with the seasons. |
SZTAKI (SZADA and TISZADOB) | 12 | 952 × 640 | 1.5 m/pixel | The SZADA dataset contains 7 pairs of remote sensing images. The TISZADOB dataset contains 5 pairs of remote sensing images. The changing content includes new built-up regions, building operations, planting of large group of trees, fresh plough-land, and ground work before building over. |
FAM | ASPP | OA (%) | P (%) | R (%) | F1-Score (%) |
---|---|---|---|---|---|
× | × | 96.86 | 95.56 | 85.92 | 90.48 |
√ | × | 97.39 | 96.80 | 87.84 | 92.11 |
× | √ | 97.69 | 94.60 | 91.97 | 93.27 |
√ | √ | 98.14 | 95.62 | 93.56 | 94.58 |
FAM | ASPP | SZADA/1 OA (%) | P (%) | R (%) | F1-Score (%) | TISZADOB/3 OA (%) | P (%) | R (%) | F1-Score (%)
---|---|---|---|---|---|---|---|---|---
× | × | 93.11 | 44.67 | 66.47 | 53.43 | 94.87 | 75.67 | 88.59 | 81.62 |
√ | × | 93.45 | 46.17 | 67.64 | 54.88 | 95.48 | 76.24 | 89.97 | 82.54 |
× | √ | 93.56 | 45.30 | 70.89 | 55.28 | 95.62 | 75.45 | 91.03 | 82.51 |
√ | √ | 94.95 | 44.47 | 73.79 | 55.50 | 96.37 | 75.27 | 95.63 | 84.24 |
Model | OA (%) | P (%) | R (%) | F1-Score (%) |
---|---|---|---|---|
CDNet * | 95.48 | 90.26 | 82.90 | 86.43 |
FC-Siam-conc * | 95.72 | 84.41 | 82.50 | 82.50 |
FC-Siam-diff * | 95.75 | 85.78 | 83.64 | 83.73 |
IFN | 97.71 | 94.96 | 86.08 | 90.30 |
DASNet * | 97.84 | 91.40 | 92.50 | 91.90 |
STANet * | 96.67 | 90.68 | 82.06 | 85.97 |
SNUNet * | 97.74 | 89.55 | 83.22 | 86.10 |
Ours | 98.14 | 95.62 | 93.56 | 94.58 |
Models | SZADA/1 OA (%) | P (%) | R (%) | F1-Score (%) | TISZADOB/3 OA (%) | P (%) | R (%) | F1-Score (%)
---|---|---|---|---|---|---|---|---
CDNet * | 93.53 | 41.46 | 70.25 | 52.15 | 92.87 | 71.28 | 88.87 | 79.11 |
FC-Siam-conc * | 90.96 | 42.35 | 66.18 | 51.65 | 92.13 | 71.53 | 87.68 | 78.79 |
FC-Siam-diff * | 89.71 | 41.14 | 69.46 | 51.67 | 90.06 | 69.78 | 87.19 | 77.52 |
IFN * | 92.79 | 40.96 | 72.87 | 52.44 | 93.51 | 69.33 | 93.48 | 79.61 |
DASNet * | 92.13 | 41.35 | 72.42 | 52.64 | 92.81 | 70.35 | 91.07 | 79.38 |
STANet * | 90.08 | 40.97 | 70.56 | 51.84 | 91.06 | 70.36 | 89.85 | 78.92 |
SNUNet * | 91.30 | 41.48 | 70.59 | 52.25 | 92.74 | 70.74 | 90.66 | 79.47 |
Ours | 94.95 | 44.47 | 73.79 | 55.50 | 96.37 | 75.27 | 95.63 | 84.24 |
Images | Real | CDNet | FC-Siam-conc | FC-Siam-diff | IFN | DASNet | STANet | SNUNet | Ours |
---|---|---|---|---|---|---|---|---|---|
Pair 1 | 10 | 2 | 2 | 3 | 6 | 4 | 4 | 4 | 10 |
Pair 2 | 8 | 6 | 4 | 5 | 6 | 6 | 6 | 5 | 8 |
Pair 3 | 5 | 3 | 2 | 2 | 4 | 3 | 3 | 3 | 5 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, Q.; Li, M.; Li, G.; Zhang, J.; Yan, S.; Chen, Z.; Zhang, X.; Chen, G. High-Resolution Remote Sensing Image Change Detection Method Based on Improved Siamese U-Net. Remote Sens. 2023, 15, 3517. https://doi.org/10.3390/rs15143517