A Polarimetric Scattering Characteristics-Guided Adversarial Learning Approach for Unsupervised PolSAR Image Classification
"> Figure 1
<p>Illustration of the proposed PSCAN. The propagation of the data from different domains is marked by arrows of different colors.</p> "> Figure 2
<p>The architecture of the feature encoder and classifier. Layers within different blocks are represented by different colors.</p> "> Figure 3
<p>Cloude–Pottier decomposition results of the RadarSat-2 PolSAR image. (<b>a</b>) Entropy. (<b>b</b>) Alpha angle. (<b>c</b>) Anisotropy.</p> "> Figure 4
<p><math display="inline"><semantics> <mrow> <mi>H</mi> <mo>/</mo> <mi>α</mi> </mrow> </semantics></math> classification hyperplane.</p> "> Figure 5
<p>Classification hyperplanes and results of <math display="inline"><semantics> <mrow> <mi>H</mi> <mo>/</mo> <mi>A</mi> <mo>/</mo> <mi>α</mi> </mrow> </semantics></math> of the RadarSat-2 PolSAR image. (<b>a</b>) Result of <math display="inline"><semantics> <mrow> <mi>H</mi> <mo>/</mo> <mi>α</mi> </mrow> </semantics></math> hyperplane. (<b>b</b>) Result of <math display="inline"><semantics> <mrow> <mi>H</mi> <mo>/</mo> <mi>A</mi> </mrow> </semantics></math> hyperplane. (<b>c</b>) Result of <math display="inline"><semantics> <mrow> <mi>A</mi> <mo>/</mo> <mi>α</mi> </mrow> </semantics></math> hyperplane.</p> "> Figure 6
<p>Pauli RGB and ground truth maps of the San Francisco datasets and the color code. (<b>a</b>,<b>b</b>) ALOS-2 San Francisco dataset. (<b>c</b>,<b>d</b>) GF-3 San Francisco dataset. (<b>e</b>,<b>f</b>) RS-2 San Francisco dataset.</p> "> Figure 7
<p>Pauli RGB and ground truth maps of the Qingdao datasets and the color code. (<b>a</b>,<b>b</b>) GF-3 Qingdao dataset. (<b>c</b>,<b>d</b>) RS-2 Qingdao dataset.</p> "> Figure 8
<p>Comparison of the whole map classification results on the San Francisco area when the source domain is set to ALOS-2. The circles in pink and gray represent the correct and incorrect classification, respectively. (<b>a</b>) GF-3 as the target domain. (<b>b</b>) RS-2 as the target domain.</p> "> Figure 9
<p>Comparison of the whole map classification results on the San Francisco area when the source domain is set to GF-3. The circles in pink and gray represent the correct and incorrect classification, respectively. (<b>a</b>) ALOS-2 as the target domain. (<b>b</b>) RS-2 as the target domain.</p> "> Figure 10
<p>Comparison of the whole map classification results on the San Francisco area when the source domain is set to RS-2. (<b>a</b>) ALOS-2 as the target domain. (<b>b</b>) GF-3 as the target domain.</p> "> Figure 11
<p>Comparison of the whole map classification results on the Qingdao area. The circles in pink and gray represent the correct and incorrect classification, respectively. (<b>a</b>) RS-2 as the target domain. (<b>b</b>) GF-3 as the target domain.</p> "> Figure 12
<p>Effect of <math display="inline"><semantics> <mi>α</mi> </semantics></math> on classification performance of the proposed PSCAN. (<b>a</b>) Results when RS-2 is the target. (<b>b</b>) Results when GF-3 is the target.</p> "> Figure 13
<p>Comparison of feature distributions between the source and target data before and after adaptation. (<b>a</b>) Feature distribution of Water data before adaptation. (<b>b</b>) Feature distribution of water data after adaptation. (<b>c</b>) Feature distribution of Low-Density Urban data before adaptation. (<b>d</b>) Feature distribution of low-density urban data after adaptation.</p> "> Figure 14
<p>Comparison of the feature discriminability of target data before and after adaptation. (<b>a</b>) Target feature visualization before adaptation. (<b>b</b>) Target feature visualization after adaptation.</p> "> Figure 15
<p>Comparison results of the ablation study. (<b>a</b>) Results of OA. (<b>b</b>) Results of KC.</p> ">
Abstract
1. Introduction
- We propose the integration of deep ADA into the PolSAR community and extend CNN-based PolSAR classifiers to unsupervised classification. Compared with traditional unsupervised PolSAR classification methods, the deep features extracted by CNNs yield better decoupling performance. Furthermore, the transfer of knowledge enables ADA to make predictions without manual inference, making it better suited to practical application scenarios.
- We describe the design of an auxiliary task and incorporate it into the standard adversarial adaptation network to capture category semantic information from the polarimetric scattering characteristics-guided pseudo-labels. This enhances the learning of class-wise correlations between the source and target domains.
- We propose a novel unsupervised ADA method to address the insufficient discriminability of the domain-invariant features obtained by existing methods. Compared with related methods [31,32], our method uses the polarimetric scattering characteristics of PolSAR images, i.e., the Cloude–Pottier decomposition [18,33], to construct pseudo-labels. This avoids the complex and inefficient automatic labeling mechanisms of optical-image methods and is therefore simple and efficient.
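The Cloude–Pottier decomposition used to build the pseudo-labels reduces each 3 × 3 polarimetric coherency matrix to an entropy H, an anisotropy A, and a mean alpha angle via an eigen-decomposition. The following is a minimal NumPy sketch of that computation (the function name and the guard constant are illustrative, not from the paper):

```python
import numpy as np

def cloude_pottier(T):
    """Compute entropy H, anisotropy A, and mean alpha angle (degrees)
    from a 3x3 Hermitian positive semi-definite coherency matrix T."""
    eigvals, eigvecs = np.linalg.eigh(T)           # eigh returns ascending order
    eigvals = np.clip(eigvals[::-1], 1e-12, None)  # descending; guard zero eigenvalues
    eigvecs = eigvecs[:, ::-1]                     # reorder eigenvectors to match
    p = eigvals / eigvals.sum()                    # pseudo-probabilities of the mechanisms
    H = -np.sum(p * np.log(p) / np.log(3))         # entropy with base-3 logarithm, in [0, 1]
    A = (eigvals[1] - eigvals[2]) / (eigvals[1] + eigvals[2])  # anisotropy
    alphas = np.degrees(np.arccos(np.abs(eigvecs[0, :])))      # per-mechanism alpha angles
    alpha = np.sum(p * alphas)                     # probability-weighted mean alpha
    return H, A, alpha
```

For a pure single-mechanism matrix such as diag(1, 0, 0), this yields H ≈ 0 and alpha ≈ 0° (surface scattering), while equal eigenvalues give the maximum entropy H = 1.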
2. Related Work
2.1. Convolutional Neural Network
2.2. Unsupervised Domain Adaptation
3. Methods
3.1. Representation of PolSAR Data
3.2. Problem Definition
3.3. Feature Encoder and Classifier
3.4. Domain Discriminator
3.5. Pseudo-Label and Auxiliary Task
- The pseudo-labels need to be agnostic to the data source. Specifically, if a region in the source domain is labeled as a certain class, the same region in the target domain should be labeled as the same class, ensuring correct alignment of class-specific features.
- The pseudo-labeling approach must exhibit broad generalizability. To achieve this, the labeling process must be automated, without any manual intervention. Additionally, the method should produce the expected results for all types of data, not just specific data types. Together, these two aspects ensure the applicability of the algorithm.
- The pseudo-labels must strive for maximum accuracy. While pseudo-labeling can be a powerful tool, labeling errors can significantly reduce classification accuracy, so label noise must be minimized to achieve optimal performance.
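A pseudo-label satisfying these requirements can be read directly off the H/α plane of Figure 4: the plane is partitioned into scattering zones by fixed thresholds, so labeling is fully automatic and depends only on the physics of the scatterer, not on the sensor. The sketch below uses the commonly quoted Cloude–Pottier zone boundaries; the function name and the Z1–Z9 numbering follow the usual convention but are illustrative here:

```python
def h_alpha_zone(H, alpha):
    """Map an (entropy, alpha-angle-in-degrees) pair to an H/alpha plane zone.

    Boundaries are the commonly quoted Cloude-Pottier partition:
    entropy split at 0.5 and 0.9, alpha thresholds per entropy band.
    """
    if H <= 0.5:                       # low-entropy band
        if alpha <= 42.5:
            return 9                   # low-entropy surface scattering
        if alpha <= 47.5:
            return 8                   # low-entropy dipole scattering
        return 7                       # low-entropy multiple scattering
    if H <= 0.9:                       # medium-entropy band
        if alpha <= 40.0:
            return 6                   # medium-entropy surface scattering
        if alpha <= 50.0:
            return 5                   # medium-entropy vegetation/dipole scattering
        return 4                       # medium-entropy multiple scattering
    if alpha <= 40.0:                  # high-entropy band
        return 3                       # practically infeasible region
    if alpha <= 55.0:
        return 2                       # high-entropy vegetation scattering
    return 1                           # high-entropy multiple scattering
```

Because the same thresholds are applied to every pixel of every sensor, a water region and a dense-urban region receive the same zone label whether they come from the ALOS-2, GF-3, or RS-2 image, which is exactly the source-agnostic behavior required above.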
4. Experiments
4.1. Study Area and Data Sources
4.2. Experimental Setup
4.3. Comparison of Results
4.4. Effect of Hyperparameter
4.5. Feature Visualization Results
4.6. Ablation Experiments
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Mott, H. Remote Sensing with Polarimetric Radar; Wiley-IEEE Press: Hoboken, NJ, USA, 2007. [Google Scholar]
- Zhang, C.; Sargent, I.; Pan, X.; Li, H.; Gardiner, A.; Hare, J.; Atkinson, P.M. Joint Deep Learning for land cover and land use classification. Remote Sens. Environ. 2019, 221, 173–187. [Google Scholar] [CrossRef] [Green Version]
- Shi, H.; Zhao, L.; Yang, J.; Lopez-Sanchez, J.M.; Zhao, J.; Sun, W.; Shi, L.; Li, P. Soil moisture retrieval over agricultural fields from L-band multi-incidence and multitemporal PolSAR observations using polarimetric decomposition techniques. Remote Sens. Environ. 2021, 261, 112485. [Google Scholar] [CrossRef]
- Ma, X.; Xu, J.; Wu, P.; Kong, P. Oil Spill Detection Based on Deep Convolutional Neural Networks Using Polarimetric Scattering Information From Sentinel-1 SAR Images. IEEE Trans. Geosci. Remote Sens. 2022, 60, 4204713. [Google Scholar] [CrossRef]
- Zhang, T.; Quan, S.; Yang, Z.; Guo, W.; Zhang, Z.; Gan, H. A Two-Stage Method for Ship Detection Using PolSAR Image. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5236918. [Google Scholar] [CrossRef]
- Moghaddam, M.; Saatchi, S. Analysis of scattering mechanisms in SAR imagery over boreal forest: Results from BOREAS’93. IEEE Trans. Geosci. Remote Sens. 1995, 33, 1290–1296. [Google Scholar] [CrossRef] [Green Version]
- Cloude, S.R.; Pottier, E. A review of target decomposition theorems in radar polarimetry. IEEE Trans. Geosci. Remote Sens. 1996, 34, 498–518. [Google Scholar] [CrossRef]
- Cameron, W.; Leung, L. Feature Motivated Polarization Scattering Matrix Decomposition. In Proceedings of the IEEE International Conference on Radar, Arlington, VA, USA, 7–10 May 1990; pp. 549–557. [Google Scholar] [CrossRef]
- Freeman, A.; Durden, S.L. A three-component scattering model for polarimetric SAR data. IEEE Trans. Geosci. Remote Sens. 1998, 36, 963–973. [Google Scholar] [CrossRef] [Green Version]
- Yamaguchi, Y.; Sato, A.; Boerner, W.; Sato, R.; Yamada, H. Four-component scattering power decomposition with rotation of coherency matrix. IEEE Trans. Geosci. Remote Sens. 2011, 49, 2251–2258. [Google Scholar] [CrossRef]
- Hara, Y.; Atkins, R.G.; Yueh, S.H.; Shin, R.T.; Kong, J.A. Application of neural networks to radar image classification. IEEE Trans. Geosci. Remote Sens. 1994, 32, 100–109. [Google Scholar] [CrossRef]
- Lee, J.S.; Grunes, M.R.; Kwok, R. Classification of multi-look polarimetric SAR imagery based on complex Wishart distribution. Int. J. Remote Sens. 1994, 15, 2299–2311. [Google Scholar] [CrossRef]
- Lardeux, C.; Frison, P.; Tison, C.; Souyris, J.; Stoll, B.; Fruneau, B.; Rudant, J. Support vector machine for multifrequency SAR polarimetric data classification. IEEE Trans. Geosci. Remote Sens. 2009, 47, 4143–4152. [Google Scholar] [CrossRef]
- Du, L.J.; Lee, J.S. Fuzzy classification of earth terrain covers using complex polarimetric SAR data. Int. J. Remote Sens. 1996, 17, 809–826. [Google Scholar] [CrossRef]
- Cao, F.; Hong, W.; Wu, Y.; Pottier, E. An Unsupervised Segmentation With an Adaptive Number of Clusters Using the SPAN/H/α/A Space and the Complex Wishart Clustering for Fully Polarimetric SAR Data Analysis. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3454–3467. [Google Scholar] [CrossRef]
- Ersahin, K.; Cumming, I.G.; Ward, R.K. Segmentation and Classification of Polarimetric SAR Data Using Spectral Graph Partitioning. IEEE Trans. Geosci. Remote Sens. 2010, 48, 164–174. [Google Scholar] [CrossRef] [Green Version]
- van Zyl, J.J. Unsupervised classification of scattering behavior using radar polarimetry data. IEEE Trans. Geosci. Remote Sens. 1989, 27, 36–45. [Google Scholar] [CrossRef]
- Cloude, S.R.; Pottier, E. An entropy based classification scheme for land applications of polarimetric SAR. IEEE Trans. Geosci. Remote Sens. 1997, 35, 68–78. [Google Scholar] [CrossRef]
- Zhou, Y.; Wang, H.; Xu, F.; Jin, Y. Polarimetric SAR Image Classification Using Deep Convolutional Neural Networks. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1935–1939. [Google Scholar] [CrossRef]
- Zhang, L.; Zhang, S.; Zou, B.; Dong, H. Unsupervised Deep Representation Learning and Few-Shot Classification of PolSAR Images. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–16. [Google Scholar] [CrossRef]
- Ren, B.; Zhao, Y.; Hou, B.; Chanussot, J.; Jiao, L. A Mutual Information-Based Self-Supervised Learning Model for PolSAR Land Cover Classification. IEEE Trans. Geosci. Remote Sens. 2021, 59, 9224–9237. [Google Scholar] [CrossRef]
- Wang, M.; Deng, W. Deep visual domain adaptation: A survey. Neurocomputing 2018, 312, 135–153. [Google Scholar] [CrossRef] [Green Version]
- Pan, S.J.; Yang, Q. A Survey on Transfer Learning. IEEE Trans. Knowl. Data Eng. 2010, 22, 1345–1359. [Google Scholar] [CrossRef]
- Wilson, G.; Cook, D.J. A Survey of Unsupervised Deep Domain Adaptation. arXiv 2020, arXiv:1812.02849. [Google Scholar]
- Long, M.; Cao, Y.; Wang, J.; Jordan, M.I. Learning Transferable Features with Deep Adaptation Networks. In Proceedings of the International Conference on Machine Learning, Lille, France, 6–11 July 2015; pp. 97–105. [Google Scholar]
- Sun, B.; Feng, J.; Saenko, K. Return of Frustratingly Easy Domain Adaptation. In Proceedings of the AAAI Conference on Artificial Intelligence, Phoenix, AZ, USA, 12–17 February 2016; pp. 2058–2065. [Google Scholar]
- Long, M.; Zhu, H.; Wang, J.; Jordan, M.I. Deep Transfer Learning with Joint Adaptation Networks. In Proceedings of the 34th International Conference on Machine Learning (ICML), Sydney, Australia, 6–11 August 2017; pp. 2208–2217. [Google Scholar]
- Ganin, Y.; Ustinova, E.; Ajakan, H.; Germain, P.; Larochelle, H.; Laviolette, F.; Marchand, M.; Lempitsky, V. Domain-adversarial training of neural networks. J. Mach. Learn. Res. 2016, 17, 2030–2096. [Google Scholar]
- Tzeng, E.; Hoffman, J.; Saenko, K.; Darrell, T. Adversarial Discriminative Domain Adaptation. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 7167–7176. [Google Scholar]
- Chen, X.; Wang, S.; Long, M.; Wang, J. Transferability vs. Discriminability: Batch Spectral Penalization for Adversarial Domain Adaptation. In Proceedings of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; pp. 1081–1090. [Google Scholar]
- Kang, G.; Jiang, L.; Yang, Y.; Hauptmann, A.G. Contrastive Adaptation Network for Unsupervised Domain Adaptation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 4888–4897. [Google Scholar] [CrossRef] [Green Version]
- Pei, Z.; Cao, Z.; Long, M.; Wang, J. Multi-Adversarial Domain Adaptation. In Proceedings of the AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018; pp. 3934–3941. [Google Scholar]
- Pottier, E. The H/A/α Polarimetric Decomposition Approach Applied to PolSAR Data Processing. In Proceedings of the PIERS—Workshop on Advances in Radar Methods, Baveno, Italy, 20–22 July 1998; pp. 120–122. [Google Scholar]
- LeCun, Y.; Boser, B.; Denker, J.; Henderson, D.; Howard, R.; Hubbard, W.; Jackel, L. Backpropagation Applied to Handwritten Zip Code Recognition. Neural Comput. 1989, 1, 541–551. [Google Scholar] [CrossRef]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. In Proceedings of the Advances in Neural Information Processing Systems 25 (NIPS 2012), Lake Tahoe, NV, USA, 3–6 December 2012; pp. 1097–1105. [Google Scholar] [CrossRef] [Green Version]
- Lecun, Y.; Bengio, Y.; Hinton, G.E. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
- Chen, S.; Tao, C. PolSAR Image Classification Using Polarimetric-Feature-Driven Deep Convolutional Neural Network. IEEE Geosci. Remote Sens. Lett. 2018, 15, 627–631. [Google Scholar] [CrossRef]
- Liu, X.; Jiao, L.; Tang, X.; Sun, Q.; Zhang, D. Polarimetric Convolutional Network for PolSAR Image Classification. IEEE Trans. Geosci. Remote Sens. 2019, 57, 3040–3054. [Google Scholar] [CrossRef] [Green Version]
- Zhang, L.; Dong, H.; Zou, B. Efficiently utilizing complex-valued PolSAR image data via a multi-task deep learning framework. ISPRS J. Photogramm. Remote Sens. 2019, 157, 59–72. [Google Scholar] [CrossRef] [Green Version]
- Zhang, Z.; Wang, H.; Xu, F.; Jin, Y. Complex-valued Convolutional Neural Network and Its Application in Polarimetric SAR Image Classification. IEEE Trans. Geosci. Remote Sens. 2017, 55, 7177–7188. [Google Scholar] [CrossRef]
- Yang, C.; Hou, B.; Ren, B.; Hu, Y.; Jiao, L. CNN-Based Polarimetric Decomposition Feature Selection for PolSAR Image Classification. IEEE Trans. Geosci. Remote Sens. 2019, 57, 8796–8812. [Google Scholar] [CrossRef]
- Qin, R.; Fu, X.; Lang, P. PolSAR Image Classification Based on Low-Frequency and Contour Subbands-Driven Polarimetric SENet. IEEE J. Sel. Topics Appl. Earth Observ. Remote Sens. 2020, 13, 4760–4773. [Google Scholar] [CrossRef]
- Dong, H.; Zhang, L.; Lu, D.; Zou, B. Attention-Based Polarimetric Feature Selection Convolutional Network for PolSAR Image Classification. IEEE Geosci. Remote Sens. Lett. 2020, 19, 4001705. [Google Scholar] [CrossRef]
- Dong, H.; Zhang, L.; Zou, B. PolSAR Image Classification with Lightweight 3D Convolutional Networks. Remote Sens. 2020, 12, 396. [Google Scholar] [CrossRef] [Green Version]
- Tan, X.; Li, M.; Zhang, P.; Wu, Y.; Song, W. Complex-Valued 3-D Convolutional Neural Network for PolSAR Image Classification. IEEE Geosci. Remote Sens. Lett. 2020, 17, 1022–1026. [Google Scholar] [CrossRef]
- Dong, H.; Zou, B.; Zhang, L.; Zhang, S. Automatic Design of CNNs via Differentiable Neural Architecture Search for PolSAR Image Classification. IEEE Trans. Geosci. Remote Sens. 2020, 58, 6362–6375. [Google Scholar] [CrossRef] [Green Version]
- Liu, F.; Jiao, L.; Tang, X. Task-Oriented GAN for PolSAR Image Classification and Clustering. IEEE Trans. Neural Netw. Learn. Syst. 2019, 30, 2707–2719. [Google Scholar] [CrossRef] [PubMed]
- Wen, Z.; Wu, Q.; Liu, Z.; Pan, Q. Polar-Spatial Feature Fusion Learning With Variational Generative-Discriminative Network for PolSAR Classification. IEEE Trans. Geosci. Remote Sens. 2019, 57, 8914–8927. [Google Scholar] [CrossRef]
- Zhu, X.; Montazeri, S.; Ali, M.; Hua, Y.; Wang, Y.; Mou, L.; Shi, Y.; Xu, F.; Bamler, R. Deep Learning Meets SAR: Concepts, Models, Pitfalls, and Perspectives. IEEE Geosci. Remote Sens. Mag. 2021, 9, 143–172. [Google Scholar] [CrossRef]
- Pan, S.J.; Tsang, I.W.; Kwok, J.T.; Yang, Q. Domain Adaptation via Transfer Component Analysis. IEEE Trans. Neural Netw. 2011, 22, 199–210. [Google Scholar] [CrossRef] [Green Version]
- Baktashmotlagh, M.; Harandi, M.T.; Lovell, B.C.; Salzmann, M. Unsupervised Domain Adaptation by Domain Invariant Projection. In Proceedings of the 2013 IEEE International Conference on Computer Vision, Sydney, NSW, Australia, 1–8 December 2013; pp. 769–776. [Google Scholar] [CrossRef] [Green Version]
- Zhuang, F.; Cheng, X.; Luo, P.; Pan, S.J.; He, Q. Supervised Representation Learning: Transfer Learning with Deep Autoencoders. In Proceedings of the IJCAI’15: 24th International Conference on Artificial Intelligence, Buenos Aires, Argentina, 25–31 July 2015; pp. 4119–4125. [Google Scholar]
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative Adversarial Nets. In Proceedings of the Advances in Neural Information Processing Systems 27 (NIPS 2014), Montreal, QC, Canada, 8–13 December 2014; pp. 2672–2680. [Google Scholar]
- Ioffe, S.; Szegedy, C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. In Proceedings of the 32nd International Conference on Machine Learning (ICML), Lille, France, 6–11 July 2015; pp. 448–456. [Google Scholar]
- Qiang, W.; Li, J.; Zheng, C.; Su, B. Auxiliary task guided mean and covariance alignment network for adversarial domain adaptation. Knowl.-Based Syst. 2021, 223, 107066. [Google Scholar] [CrossRef]
- Hou, C.; Tsai, Y.H.; Yeh, Y.R.; Wang, Y.F. Unsupervised Domain Adaptation With Label and Structural Consistency. IEEE Trans. Image Process. 2016, 25, 5552–5562. [Google Scholar] [CrossRef]
- Liang, J.; He, R.; Sun, Z.; Tan, T. Exploring uncertainty in pseudo-label guided unsupervised domain adaptation. Pattern Recognit. 2019, 96, 106996. [Google Scholar] [CrossRef]
- Wang, Q.; Breckon, T. Unsupervised Domain Adaptation via Structured Prediction Based Selective Pseudo-Labeling. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; pp. 6243–6250. [Google Scholar]
- Yin, J.; Moon, W.M.; Yang, J. Novel Model-Based Method for Identification of Scattering Mechanisms in Polarimetric SAR Data. IEEE Trans. Geosci. Remote Sens. 2016, 54, 520–532. [Google Scholar] [CrossRef]
- Lu, D.; Zou, B. Improved alpha angle estimation of polarimetric SAR data. Electron. Lett. 2016, 52, 393–395. [Google Scholar] [CrossRef]
- Lee, J.S.; Grunes, M.R.; Ainsworth, T.; Du, L.J.; Schuler, D.L.; Cloude, S.R. Unsupervised classification using polarimetric decomposition and the complex Wishart classifier. IEEE Trans. Geosci. Remote Sens. 1999, 37, 2249–2258. [Google Scholar] [CrossRef]
- Ferro-Famil, L.; Pottier, E.; Lee, J.S. Unsupervised classification of multifrequency and fully polarimetric SAR images based on the H/A/Alpha-Wishart classifier. IEEE Trans. Geosci. Remote Sens. 2001, 39, 2332–2342. [Google Scholar] [CrossRef]
- Lee, J.S.; Grunes, M.R.; Pottier, E.; Ferro-Famil, L. Unsupervised terrain classification preserving polarimetric scattering characteristics. IEEE Trans. Geosci. Remote Sens. 2004, 42, 722–731. [Google Scholar] [CrossRef]
- Wen, Y.; Zhang, K.; Li, Z.; Qiao, Y. A Discriminative Feature Learning Approach for Deep Face Recognition. In Proceedings of the Computer Vision ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, 11–14 October 2016; pp. 499–515. [Google Scholar]
- Schroff, F.; Kalenichenko, D.; Philbin, J. FaceNet: A Unified Embedding for Face Recognition and Clustering. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 815–823. [Google Scholar] [CrossRef] [Green Version]
- Liu, W.; Wen, Y.; Yu, Z.; Yang, M. Large-Margin Softmax Loss for Convolutional Neural Networks. In Proceedings of the 33rd International Conference on Machine Learning (ICML), New York, NY, USA, 19–24 June 2016; pp. 507–516. [Google Scholar]
- Liu, X.; Jiao, L.; Liu, F. PolSF: PolSAR image dataset on San Francisco. arXiv 2019, arXiv:1912.07259. [Google Scholar]
- Long, M.; Wang, J.; Ding, G.; Sun, J.; Yu, P.S. Transfer Feature Learning with Joint Distribution Adaptation. In Proceedings of the 2013 IEEE International Conference on Computer Vision, Sydney, NSW, Australia, 1–8 December 2013; pp. 2200–2207. [Google Scholar] [CrossRef]
- Wang, J.; Chen, Y.; Hao, S.; Feng, W.; Shen, Z. Balanced Distribution Adaptation for Transfer Learning. In Proceedings of the 2017 IEEE International Conference on Data Mining (ICDM), New Orleans, LA, USA, 18–21 November 2017; pp. 1129–1134. [Google Scholar] [CrossRef] [Green Version]
- Long, M.; Wang, J.; Ding, G.; Sun, J.; Yu, P.S. Transfer Joint Matching for Unsupervised Domain Adaptation. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 1410–1417. [Google Scholar] [CrossRef]
- Sun, B.; Saenko, K. Deep coral: Correlation alignment for deep domain adaptation. In Proceedings of the Computer Vision—ECCV 2016 Workshops, Amsterdam, The Netherlands, 8–10, 15–16 October 2016; pp. 443–450. [Google Scholar]
- Long, M.; Cao, Z.; Wang, J.; Jordan, M.I. Conditional Adversarial Domain Adaptation. In Proceedings of the Advances in Neural Information Processing Systems 31: Annual Conference on Neural Information Processing Systems, NeurIPS, Montréal, QC, Canada, 3–8 December 2018; pp. 1640–1650. [Google Scholar]
- Heusel, M.; Ramsauer, H.; Unterthiner, T.; Nessler, B.; Hochreiter, S. GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium. In Proceedings of the NIPS’17: 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; pp. 6626–6638. [Google Scholar]
Overall accuracy (%) comparison. A = ALOS-2, G = GF-3, R = RS-2; the first seven result columns are San Francisco transfer tasks (SF Avg is their average), the next three are Qingdao tasks (QD Avg is their average).

Method | A→G | A→R | G→A | G→R | R→A | R→G | SF Avg | G→R | R→G | QD Avg | Total
---|---|---|---|---|---|---|---|---|---|---|---
Source-only | 58.40 | 79.91 | 35.22 | 64.85 | 73.20 | 40.77 | 58.73 | 61.02 | 57.37 | 59.20 | 58.96 |
TCA | 68.68 | 81.47 | 68.29 | 76.58 | 75.97 | 75.16 | 74.36 | 87.62 | 64.68 | 76.15 | 75.25 |
JDA | 71.35 | 82.12 | 71.36 | 74.24 | 79.58 | 78.06 | 76.12 | 88.56 | 84.43 | 86.50 | 81.31 |
BDA | 70.53 | 81.75 | 72.87 | 73.49 | 78.85 | 77.81 | 75.88 | 91.06 | 84.22 | 87.64 | 81.76 |
TJM | 73.83 | 80.90 | 74.28 | 77.55 | 77.52 | 79.22 | 77.22 | 93.64 | 74.87 | 84.25 | 80.74 |
DAN | 81.41 | 93.13 | 85.21 | 86.38 | 91.23 | 86.67 | 87.34 | 92.86 | 91.97 | 92.42 | 89.88 |
JAN | 77.73 | 93.06 | 85.96 | 86.56 | 91.40 | 86.69 | 86.90 | 92.91 | 91.52 | 92.22 | 89.56 |
DCORAL | 77.56 | 94.43 | 83.22 | 92.38 | 84.96 | 79.17 | 85.29 | 94.42 | 90.98 | 92.70 | 88.99 |
DANN | 80.45 | 93.57 | 91.77 | 91.67 | 92.31 | 85.15 | 89.15 | 96.23 | 90.25 | 93.24 | 91.20 |
CDAN | 78.72 | 94.33 | 90.82 | 92.28 | 92.24 | 82.36 | 88.46 | 95.36 | 92.38 | 93.87 | 91.16 |
PSCAN | 82.14 | 93.51 | 91.72 | 92.86 | 93.23 | 88.48 | 90.32 | 97.64 | 94.49 | 96.07 | 93.19 |
Per-class accuracy (%), average accuracy (AA), and kappa coefficient (KC) on San Francisco. Left block (C1–C5, AA, KC): ALOS-2 → GF-3; right block: ALOS-2 → RS-2.

Method | C1 | C2 | C3 | C4 | C5 | AA | KC | C1 | C2 | C3 | C4 | C5 | AA | KC
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Source-only | 8.96 | 96.72 | 92.78 | 48.63 | 45.37 | 58.49 | 47.81 | 85.88 | 79.25 | 16.23 | 99.36 | 95.38 | 75.22 | 74.34 |
TCA | 87.47 | 93.85 | 84.34 | 40.65 | 40.72 | 69.41 | 60.69 | 98.94 | 94.29 | 83.96 | 57.52 | 67.04 | 80.35 | 76.72 |
JDA | 99.12 | 97.49 | 78.07 | 44.23 | 42.85 | 72.35 | 64.10 | 99.94 | 95.30 | 77.49 | 71.50 | 59.63 | 80.77 | 77.51 |
BDA | 99.12 | 96.60 | 77.22 | 45.94 | 37.99 | 71.37 | 63.04 | 99.82 | 95.11 | 70.62 | 76.17 | 60.75 | 80.49 | 77.06 |
TJM | 99.39 | 93.20 | 74.22 | 49.47 | 58.82 | 75.02 | 67.19 | 97.71 | 93.29 | 68.38 | 70.71 | 70.06 | 80.03 | 76.02 |
DAN | 99.72 | 87.45 | 76.57 | 73.63 | 77.92 | 83.06 | 76.32 | 97.31 | 88.93 | 95.46 | 92.92 | 93.93 | 93.71 | 91.32 |
JAN | 62.35 | 93.52 | 95.67 | 74.51 | 61.70 | 77.55 | 71.67 | 96.36 | 89.86 | 93.95 | 93.44 | 93.53 | 93.43 | 91.21 |
DCORAL | 99.66 | 94.67 | 95.82 | 58.99 | 55.74 | 80.98 | 71.84 | 99.89 | 89.56 | 95.04 | 94.11 | 96.12 | 94.94 | 92.95 |
DANN | 95.30 | 96.88 | 91.32 | 81.35 | 32.58 | 79.49 | 75.00 | 99.55 | 87.55 | 97.95 | 92.08 | 95.28 | 94.48 | 91.89 |
CDAN | 82.40 | 92.59 | 96.74 | 65.11 | 68.55 | 81.08 | 73.21 | 99.84 | 90.01 | 96.22 | 92.76 | 95.67 | 94.90 | 92.83 |
PSCAN | 99.67 | 94.78 | 94.71 | 75.89 | 49.77 | 82.96 | 77.29 | 99.97 | 91.52 | 94.34 | 90.91 | 92.27 | 93.80 | 91.78 |
Per-class accuracy (%), average accuracy (AA), and kappa coefficient (KC) on San Francisco. Left block (C1–C5, AA, KC): GF-3 → ALOS-2; right block: GF-3 → RS-2.

Method | C1 | C2 | C3 | C4 | C5 | AA | KC | C1 | C2 | C3 | C4 | C5 | AA | KC
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Source-only | 3.42 | 96.47 | 88.01 | 46.84 | 5.51 | 48.05 | 25.70 | 83.61 | 93.34 | 98.26 | 45.62 | 3.45 | 64.86 | 56.05 |
TCA | 55.05 | 86.19 | 48.20 | 94.84 | 61.38 | 69.13 | 60.21 | 93.25 | 80.00 | 45.73 | 89.01 | 73.01 | 76.20 | 70.66 |
JDA | 56.81 | 72.81 | 63.85 | 94.65 | 67.54 | 71.13 | 63.95 | 84.10 | 83.07 | 51.38 | 88.15 | 61.39 | 73.62 | 67.70 |
BDA | 59.00 | 80.29 | 68.95 | 93.19 | 62.75 | 72.83 | 65.90 | 82.57 | 83.39 | 52.28 | 86.75 | 59.08 | 72.81 | 66.75 |
TJM | 73.63 | 83.60 | 61.77 | 90.71 | 63.43 | 74.63 | 67.74 | 92.37 | 81.69 | 48.75 | 88.75 | 74.52 | 77.22 | 71.88 |
DAN | 83.97 | 78.89 | 96.49 | 85.54 | 81.38 | 85.25 | 80.96 | 98.24 | 75.71 | 88.01 | 85.33 | 90.38 | 87.53 | 82.94 |
JAN | 85.50 | 77.18 | 96.96 | 86.19 | 82.38 | 85.64 | 81.90 | 97.80 | 72.46 | 91.43 | 86.34 | 93.25 | 88.25 | 83.20 |
DCORAL | 81.70 | 90.79 | 97.17 | 89.54 | 64.11 | 84.66 | 78.64 | 98.90 | 89.94 | 97.07 | 90.50 | 88.53 | 92.99 | 90.37 |
DANN | 97.74 | 73.39 | 97.68 | 94.95 | 82.76 | 89.30 | 89.28 | 99.74 | 89.08 | 98.04 | 89.37 | 85.85 | 92.42 | 89.47 |
CDAN | 98.61 | 80.92 | 96.53 | 90.27 | 77.86 | 88.84 | 88.05 | 99.16 | 83.65 | 97.23 | 92.34 | 94.90 | 93.46 | 90.28 |
PSCAN | 99.28 | 77.55 | 93.10 | 95.27 | 80.82 | 89.20 | 89.20 | 98.78 | 86.86 | 93.64 | 93.67 | 94.45 | 93.48 | 90.97 |
Per-class accuracy (%), average accuracy (AA), and kappa coefficient (KC) on San Francisco. Left block (C1–C5, AA, KC): RS-2 → ALOS-2; right block: RS-2 → GF-3.

Method | C1 | C2 | C3 | C4 | C5 | AA | KC | C1 | C2 | C3 | C4 | C5 | AA | KC
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Source-only | 59.69 | 94.40 | 88.30 | 88.26 | 58.70 | 77.87 | 66.88 | 0.17 | 94.85 | 97.46 | 14.33 | 12.23 | 43.81 | 27.76 |
TCA | 70.48 | 95.47 | 67.30 | 90.02 | 60.35 | 76.72 | 69.92 | 99.73 | 96.54 | 75.69 | 59.09 | 47.77 | 75.77 | 68.81 |
JDA | 77.68 | 91.01 | 76.80 | 86.08 | 68.28 | 79.97 | 74.40 | 100.00 | 95.94 | 72.02 | 64.69 | 61.34 | 78.80 | 72.48 |
BDA | 77.26 | 91.87 | 76.13 | 84.43 | 66.97 | 79.33 | 73.49 | 100.00 | 96.24 | 71.34 | 64.38 | 60.81 | 78.56 | 72.16 |
TJM | 78.38 | 95.18 | 66.97 | 87.49 | 63.66 | 78.34 | 71.86 | 100.00 | 95.53 | 61.73 | 73.00 | 69.46 | 79.94 | 73.91 |
DAN | 96.00 | 88.20 | 94.79 | 94.31 | 78.09 | 90.28 | 88.61 | 99.71 | 90.33 | 86.93 | 83.62 | 75.70 | 87.26 | 82.94 |
JAN | 96.80 | 88.93 | 94.29 | 94.32 | 77.51 | 90.37 | 88.83 | 99.72 | 90.41 | 86.88 | 83.68 | 75.66 | 87.27 | 82.96 |
DCORAL | 80.42 | 91.01 | 96.49 | 91.16 | 74.29 | 86.67 | 80.83 | 99.62 | 93.85 | 94.95 | 63.41 | 58.46 | 82.06 | 73.77 |
DANN | 98.26 | 88.85 | 94.35 | 96.04 | 77.86 | 91.07 | 90.01 | 97.31 | 96.53 | 88.93 | 82.55 | 60.99 | 85.26 | 81.01 |
CDAN | 98.58 | 87.71 | 97.14 | 91.37 | 80.06 | 90.97 | 89.93 | 96.15 | 96.93 | 91.39 | 80.40 | 45.89 | 82.15 | 77.46 |
PSCAN | 99.67 | 83.22 | 94.35 | 93.71 | 85.46 | 91.28 | 91.16 | 97.81 | 93.41 | 81.97 | 89.09 | 78.88 | 88.23 | 85.20 |
Per-class accuracy (%), average accuracy (AA), and kappa coefficient (KC) on Qingdao. Left block (C1–C3, AA, KC): GF-3 → RS-2; right block: RS-2 → GF-3.

Method | C1 | C2 | C3 | AA | KC | C1 | C2 | C3 | AA | KC
---|---|---|---|---|---|---|---|---|---|---
Source-only | 85.53 | 0.00 | 100.00 | 61.84 | 41.90 | 100.00 | 0.00 | 85.45 | 61.82 | 37.40 |
TCA | 92.73 | 69.55 | 100.00 | 87.43 | 81.38 | 87.52 | 63.64 | 37.39 | 62.85 | 45.41 |
JDA | 92.56 | 72.98 | 99.71 | 88.42 | 82.74 | 85.21 | 85.18 | 82.32 | 84.23 | 76.14 |
BDA | 92.15 | 81.45 | 99.52 | 91.04 | 86.53 | 84.76 | 86.95 | 79.39 | 83.70 | 75.75 |
TJM | 92.73 | 88.49 | 99.90 | 93.71 | 90.44 | 83.07 | 64.01 | 81.08 | 76.05 | 61.88 |
DAN | 90.03 | 88.89 | 99.81 | 92.91 | 89.29 | 97.46 | 89.66 | 88.76 | 91.96 | 87.87 |
JAN | 92.37 | 86.69 | 99.91 | 92.99 | 89.37 | 93.95 | 84.93 | 97.59 | 92.15 | 87.21 |
DCORAL | 90.51 | 99.36 | 93.19 | 94.35 | 91.62 | 89.69 | 85.55 | 99.77 | 91.67 | 86.39 |
DANN | 90.37 | 98.26 | 99.96 | 96.20 | 94.34 | 81.04 | 91.31 | 99.46 | 90.60 | 85.23 |
CDAN | 86.00 | 99.88 | 100.00 | 95.29 | 93.04 | 85.76 | 93.74 | 98.19 | 92.56 | 88.46 |
PSCAN | 93.71 | 99.29 | 99.86 | 97.62 | 96.46 | 89.62 | 95.11 | 99.27 | 94.66 | 91.67 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Dong, H.; Si, L.; Qiang, W.; Miao, W.; Zheng, C.; Wu, Y.; Zhang, L. A Polarimetric Scattering Characteristics-Guided Adversarial Learning Approach for Unsupervised PolSAR Image Classification. Remote Sens. 2023, 15, 1782. https://doi.org/10.3390/rs15071782