Continuous Particle Swarm Optimization-Based Deep Learning Architecture Search for Hyperspectral Image Classification
Figure 1. Framework of the stages of the particle swarm optimization network (PSO-Net). The network architectures constructed from the search space are encoded into arrays, and populations are initialized. Gradient descent is used to obtain the weight parameters of each architecture separately, and PSO is used to optimize the architectures and search for the optimal architecture until the maximum number of iterations is reached. The output is the optimal architecture.
Figure 2. Framework of CPSO-Net. A continuous SuperNet that contains all candidate operations and can be trained by gradient descent is maintained first. Then, all architectures are encoded into arrays that inherit their parameters from the SuperNet, and PSO searches for the optimal architecture. Finally, the optimal architecture is output.
Figure 3. The general architecture, stacked from the computation cells automatically designed by PSO-Net and CPSO-Net. The bottleneck convolution layer and the fully connected layer in Figure 3 are designed with human knowledge for data preprocessing and classification.
Figure 4. An example of an encoded architecture, including an encoded normal cell and an encoded reduction cell with four intermediate nodes; each cell with four intermediate nodes contains eight elements. (a) An example of an encoded normal cell. The 0th intermediate node, i.e., the 2nd node in the cell, is calculated as the sum of two elements: OP[2] computed on the 0th node in the cell and OP[4] computed on the 1st node in the cell, where the 0th and 1st nodes in the cell are the two input nodes. All remaining intermediate nodes are encoded into arrays in the same way. (b) An example of an encoded reduction cell.
Figure 5. The update process for an element in a particle. This example shows the update of an element of the 2nd intermediate node. Before an element of a particle can be updated, the difference (gBest_E − P_E) or (pBest_E − P_E) must be computed by the difference operator, and the velocity of the element must be computed using the velocity operator.
Figure 6. Classification results of models searched by CPSO-Net with different numbers of cells, intermediate nodes, and population sizes on four biased hyperspectral image (HSI) datasets.
Figure 7. Time consumed in the different stages of the automatically designed CNN methods on four biased HSI datasets.
Figure 8. Accuracy and number of parameters of gBest over the iterations of CPSO-Net on four biased datasets.
Figure 9. Positions of 20 particles and gBest over the iterations of CPSO-Net on four biased HSI datasets.
Figure 10. Optimal architecture found by PSO-Net for the biased Pavia dataset.
Figure 11. Optimal architecture found by CPSO-Net for the biased Salinas dataset.
Figure 12. Classification maps for the biased Pavia dataset.
Figure 13. Classification maps for the biased Salinas dataset.
Figure 14. Classification maps for the biased Indian Pines dataset.
Figure 15. Classification maps for the biased KSC dataset.
Abstract
1. Introduction
- (1) Two methods based on PSO are explored to automatically design CNN architectures for HSI classification.
- (2) A novel encoding strategy is devised to encode architectures into arrays carrying the information on the connections and the basic operation types between the nodes in computation cells.
- (3) To improve search efficiency, CPSO-Net maintains a continuous SuperNet that shares parameters among all particles and is optimized by collecting the gradients of all individuals in the population.
- (4) PSO-Net and CPSO-Net are tested on four biased and unbiased hyperspectral datasets with limited training samples, showing performance comparable to state-of-the-art CNN classification methods.
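The encoding strategy of contribution (2) can be illustrated with a small sketch. Following the Figure 4 example (a cell with four intermediate nodes encoded into eight elements), each intermediate node is described by two (input-node, operation-index) pairs; the operation list and function names below are illustrative assumptions, not the paper's exact search space.

```python
import random

# Hypothetical candidate operation set; the paper's actual operations may differ.
OPERATIONS = [
    "none", "skip_connect", "max_pool_3x3", "avg_pool_3x3",
    "sep_conv_3x3", "sep_conv_5x5", "dil_conv_3x3", "dil_conv_5x5",
]

def encode_cell(num_intermediate=4, rng=random):
    """Encode one cell as a flat array of (input_node, op_index) pairs.

    Nodes 0 and 1 are the cell's two input nodes; intermediate node i may
    connect to any earlier node, and each intermediate node is the sum of
    two operations, giving 2 * num_intermediate elements per cell.
    """
    cell = []
    for i in range(num_intermediate):
        visible = list(range(i + 2))  # input nodes plus earlier intermediates
        for _ in range(2):            # two incoming edges per intermediate node
            src = rng.choice(visible)
            op = rng.randrange(len(OPERATIONS))
            cell.append((src, op))
    return cell

cell = encode_cell()
assert len(cell) == 8  # four intermediate nodes -> eight elements, as in Figure 4
```

A full particle would concatenate one such array for the normal cell and one for the reduction cell.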
2. Related Work
2.1. Neural Architecture Search
2.2. Particle Swarm Optimization
3. Proposed Method
3.1. Construction of the Search Space
3.2. Initialization of the Swarm
3.3. Weight Optimization of the Neural Networks Constructed from the Architectures
3.3.1. Individual Parameter Optimization of PSO-Net
3.3.2. Weight-Sharing Parameter Optimization of CPSO-Net
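The weight sharing described for CPSO-Net (architectures inherit their parameters from one continuous SuperNet) can be sketched minimally. This is an assumed toy structure, not the paper's implementation: the SuperNet keeps one parameter vector per (edge, candidate operation), and a particle's subnet simply references the entries its encoding selects, so a gradient step on any subnet updates the shared store.

```python
supernet = {}  # (edge_id, op_index) -> shared parameter vector

def get_weight(edge_id, op_index):
    # Lazily create one shared parameter vector per candidate operation.
    return supernet.setdefault((edge_id, op_index), [0.0])

def subnet_weights(encoding):
    """Collect the shared parameters selected by one particle's encoding,
    where each element of the encoding is an (input_node, op_index) pair."""
    return [get_weight(edge, op) for edge, (src, op) in enumerate(encoding)]

# Two particles that select the same operation on the same edge share weights.
w_a = subnet_weights([(0, 3), (1, 5)])
w_b = subnet_weights([(0, 3), (0, 2)])
w_a[0][0] += 1.0         # a "gradient step" taken while training subnet A ...
assert w_b[0][0] == 1.0  # ... is immediately visible to subnet B's shared op
assert len(supernet) == 3
```

This is why CPSO-Net needs only one trained SuperNet rather than training every candidate architecture from scratch, as PSO-Net does.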
3.4. Fitness Evaluation
3.5. Particle Update
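As Figure 5 illustrates, an element's new value is driven by difference and velocity operators against pBest and gBest. The following is a minimal discrete-PSO sketch; the probability thresholds and the exact operator definitions are assumptions for illustration, not the paper's precise scheme.

```python
import random

def update_particle(particle, p_best, g_best, w=0.5, c1=0.25, c2=0.25, rng=random):
    """Discrete PSO update for an encoded architecture.

    For each element, the difference operator compares the particle with
    pBest/gBest, and the velocity operator decides whether the element keeps
    its current value (inertia w) or moves toward pBest (c1) or gBest (c2).
    """
    assert abs(w + c1 + c2 - 1.0) < 1e-9
    new = []
    for x, pb, gb in zip(particle, p_best, g_best):
        r = rng.random()
        if r < w:          # inertia: keep the current element
            new.append(x)
        elif r < w + c1:   # apply the full (pBest_E - P_E) difference
            new.append(pb)
        else:              # apply the full (gBest_E - P_E) difference
            new.append(gb)
    return new

# Usage: every updated element comes from the particle, its pBest, or gBest.
rng = random.Random(1)
new = update_particle([(0, 1)] * 8, [(0, 2)] * 8, [(1, 3)] * 8, rng=rng)
assert all(e in {(0, 1), (0, 2), (1, 3)} for e in new)
```

Setting w = 1 reduces the update to pure inertia, which is a convenient sanity check.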
4. Experimental Results and Analysis
4.1. Datasets
4.2. Comparative Experiment
4.3. Experimental Settings
4.4. Results Analysis
4.4.1. Classification Accuracy
4.4.2. Complexity Analysis
4.4.3. Convergence Analysis
4.4.4. Optimal Architecture
4.4.5. Classification Maps
4.5. Unbiased HSI Classification
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Camps-Valls, G.; Tuia, D.; Bruzzone, L.; Benediktsson, J.A. Advances in hyperspectral image classification: Earth monitoring with statistical learning methods. IEEE Signal Process. Mag. 2014, 31, 45–54. [Google Scholar] [CrossRef] [Green Version]
- Yi, C.; Zhao, Y.Q.; Chan, J.C.W. Hyperspectral image super-resolution based on spatial and spectral correlation fusion. IEEE Trans. Geosci. Remote Sens. 2018, 56, 4165–4177. [Google Scholar] [CrossRef]
- Kang, X.; Zhang, X.; Li, S.; Li, K.; Li, J.; Benediktsson, J.A. Hyperspectral anomaly detection with attribute and edge-preserving filters. IEEE Trans. Geosci. Remote Sens. 2017, 55, 5600–5611. [Google Scholar] [CrossRef]
- Haut, J.M.; Bernabé, S.; Paoletti, M.E.; Fernandez-Beltran, R.; Plaza, A.; Plaza, J. Low–high-power consumption architectures for deep-learning models applied to hyperspectral image classification. IEEE Geosci. Remote Sens. Lett. 2018, 16, 776–780. [Google Scholar] [CrossRef]
- Teke, M.; Deveci, H.S.; Haliloglu, O.; Gürbüz, S.Z.; Sakarya, U. A short survey of hyperspectral remote sensing applications in agriculture. In Proceedings of the 2013 6th International Conference on Recent Advances in Space Technologies (RAST), Istanbul, Turkey, 16 August 2013; pp. 171–176. [Google Scholar]
- Shang, X.; Chisholm, L.A. Classification of Australian native forest species using hyperspectral remote sensing and machine-learning classification algorithms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2481–2489. [Google Scholar] [CrossRef]
- Han, Y.; Li, J.; Zhang, Y.; Hong, Z.; Wang, J. Sea ice detection based on an improved similarity measurement method using hyperspectral data. Sensors 2017, 17, 1124. [Google Scholar]
- El-Sharkawy, Y.H.; Elbasuney, S. Hyperspectral imaging: A new prospective for remote recognition of explosive materials. Remote Sens. Appl. Soc. Environ. 2019, 13, 31–38. [Google Scholar] [CrossRef]
- Li, Y.; Zhang, H.; Shen, Q. Spectral-spatial classification of hyperspectral imagery with 3d convolutional neural network. Remote Sens. 2017, 9, 67. [Google Scholar] [CrossRef] [Green Version]
- Luo, Y.; Zou, J.; Yao, C.; Zhao, X.; Li, T.; Bai, G. HSI-CNN: A novel convolution neural network for hyperspectral image. In Proceedings of the International Conference on Audio, Language and Image Processing (ICALIP), Shanghai, China, 16–17 July 2018; pp. 464–469. [Google Scholar]
- Santara, A.; Mani, K.; Hatwar, P.; Singh, A.; Garg, A.; Padia, K.; Mitra, P. BASS Net: Band-adaptive spectral-spatial feature learning neural network for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2017, 55, 5293–5301. [Google Scholar] [CrossRef] [Green Version]
- Cai, Y.; Liu, X.; Cai, Z. BS-Nets: An end-to-end framework for band selection of hyperspectral image. IEEE Trans. Geosci. Remote Sens. 2020, 58, 1969–1984. [Google Scholar] [CrossRef] [Green Version]
- Cai, Y.; Zhang, Z.; Cai, Z.; Liu, X.; Jiang, X.; Yan, Q. Graph convolutional subspace clustering: A robust subspace clustering framework for hyperspectral image. IEEE Trans. Geosci. Remote Sens. 2020. [Google Scholar] [CrossRef]
- Feng, J.; Wang, L.; Yu, H.; Jiao, L.; Zhang, X. Divide-and-conquer dual-architecture convolutional neural network for classification of hyperspectral images. Remote Sens. 2019, 11, 484. [Google Scholar] [CrossRef] [Green Version]
- Charmisha, K.; Sowmya, V.; Soman, K. Dimensionally reduced features for hyperspectral image classification using deep learning. In Proceedings of the International Conference on Communications and Cyber Physical Engineering (ICCCE), Hyderabad, India, 24–25 January 2018; pp. 171–179. [Google Scholar]
- Du, J.; Li, Z. A hyperspectral target detection framework with subtraction pixel pair features. IEEE Access. 2018, 6, 45562–45577. [Google Scholar] [CrossRef]
- Jia, P.; Zhang, M.; Yu, W.; Shen, F.; Shen, Y. Convolutional neural network based classification for hyperspectral data. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Beijing, China, 10–15 July 2016; pp. 5075–5078. [Google Scholar]
- Haut, J.; Paoletti, M.; Plaza, J.; Plaza, A. Hyperspectral image classification using random occlusion data augmentation. IEEE Geosci. Remote Sens. Lett. 2019, 16, 1751–1755. [Google Scholar] [CrossRef]
- Mei, S.; Ji, J.; Geng, Y.; Zhang, Z.; Li, X.; Du, Q. Unsupervised spatial-spectral feature learning by 3d convolutional autoencoder for hyperspectral classification. IEEE Trans. Geosci. Remote Sens. 2019, 57, 6808–6820. [Google Scholar] [CrossRef]
- Sellami, A.; Farah, M.; Farah, I.R.; Solaiman, B. Hyperspectral imagery classification based on semi-supervised 3-d deep neural network and adaptive band selection. Expert Syst. Appl. 2019, 129, 246–259. [Google Scholar] [CrossRef]
- Paoletti, M.; Haut, J.; Plaza, J.; Plaza, A. Deep&Dense convolutional neural network for hyperspectral image classification. Remote Sens. 2018, 10, 1454. [Google Scholar]
- Zhong, Z.; Li, J.; Luo, Z.; Chapman, M. Spectral-spatial residual network for hyperspectral image classification: A 3-D deep learning framework. IEEE Trans. Geosci. Remote Sens. 2018, 56, 847–858. [Google Scholar] [CrossRef]
- Fang, B.; Li, Y.; Zhang, H.; Chan, J. Hyperspectral images classification based on dense convolutional networks with spectral-wise attention mechanism. Remote Sens. 2019, 11, 159. [Google Scholar] [CrossRef] [Green Version]
- Chen, Y.; Nasrabadi, N.M.; Tran, T.D. Hyperspectral image classification using dictionary-based sparse representation. IEEE Trans. Geosci. Remote Sens. 2011, 49, 3973–3985. [Google Scholar] [CrossRef]
- Safari, K.; Prasad, S.; Labate, D. A multiscale deep learning approach for high-resolution hyperspectral image classification. IEEE Geosci. Remote Sens Lett. 2021, 18, 167–171. [Google Scholar] [CrossRef]
- Shao, W.; Du, S. Spectral-spatial feature extraction for hyperspectral image classification: A dimension reduction and deep learning approach. IEEE Trans. Geosci. Remote Sens. 2016, 54, 4544–4554. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- He, X.; Zhao, K.; Chu, X. AutoML: A survey of the state-of-the-art. Knowl. Based Syst. 2021, 212, 1–27. [Google Scholar] [CrossRef]
- Elsken, T.; Metzen, J.H.; Hutter, F. Neural architecture search: A survey. J. Mach. Learn. Res. 2019, 20, 1–21. [Google Scholar]
- Zoph, B.; Le, Q.V. Neural architecture search with reinforcement learning. arXiv 2017, arXiv:1611.01578. [Google Scholar]
- Chen, Y.; Zhu, K.; Zhu, L.; He, X.; Ghamisi, P.; Benediktsson, J.A. Automatic design of convolutional neural network for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2019, 57, 7048–7066. [Google Scholar] [CrossRef]
- Junior, F.; Erivaldo, F.; Yen, G. Particle swarm optimization of deep neural networks architectures for image classification. Swarm Evol. Comput. 2019, 49, 62–74. [Google Scholar] [CrossRef]
- Real, E.; Aggarwal, A.; Huang, Y.; Le, Q.V. Regularized evolution for image classifier architecture search. In Proceedings of the Conference on Artificial Intelligence (AAAI), Honolulu, HI, USA, 27 January–1 February 2019; pp. 4780–4789. [Google Scholar]
- Stanley, K.O.; Miikkulainen, R. Evolving neural networks through augmenting topologies. Evol. Comput. 2002, 10, 99–127. [Google Scholar] [CrossRef]
- Gauci, J.; Stanley, K.O. Autonomous evolution of topographic regularities in artificial neural networks. Neural Comput. 2010, 22, 1860–1898. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Stanley, K.O.; D’Ambrosio, D.B.; Gauci, J. A hypercube-based indirect encoding for evolving large-scale neural networks. Artif. Life 2009, 15, 185–212. [Google Scholar] [CrossRef]
- Stanley, K.O. Compositional pattern producing networks: A novel abstraction of development. Genet. Program Evol. Mach. 2007, 8, 131–162. [Google Scholar] [CrossRef] [Green Version]
- Baker, B.; Gupta, O.; Naik, N.; Raskar, R. Designing Neural Network Architectures Using Reinforcement Learning. arXiv 2017, arXiv:1611.02167. [Google Scholar]
- Zoph, B.; Vasudevan, V.; Shlens, J.; Le, Q.V. Learning transferable architectures for scalable image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 18–22 June 2018; pp. 8697–8710. [Google Scholar]
- Liu, H.; Simonyan, K.; Yang, Y. DARTS: Differentiable architecture search. arXiv 2019, arXiv:1806.09055. [Google Scholar]
- Xie, S.; Zheng, H.; Liu, C.; Lin, C. SNAS: Stochastic neural architecture search. arXiv 2019, arXiv:1812.09926. [Google Scholar]
- Wu, B.; Dai, X.; Zhang, P.; Wang, Y.; Sun, F.; Wu, Y.; Tian, Y.; Vajda, P.; Jia, Y.; Keutzer, K. FBNet: Hardware-aware efficient convnet design via differentiable neural architecture search. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 10734–10742. [Google Scholar]
- Pham, H.; Guan, M.Y.; Zoph, B.; Le, Q.V.; Dean, J. Efficient neural architecture search via parameter sharing. In Proceedings of the International Conference on Machine Learning (ICML), Stockholm, Sweden, 10–15 July 2018; pp. 6522–6531. [Google Scholar]
- Xie, L.; Yuille, A. Genetic CNN. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 1388–1397. [Google Scholar]
- Yang, Z.; Wang, Y.; Chen, X.; Shi, B.; Xu, C.; Xu, C.; Tian, Q.; Xu, C. CARS: Continuous evolution for efficient neural architecture search. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 14–19 June 2020; pp. 1829–1838. [Google Scholar]
- Elsken, T.; Metzen, J.H.; Hutter, F. Efficient multi-objective neural architecture search via lamarckian evolution. arXiv 2019, arXiv:1804.09081. [Google Scholar]
- Brock, A.; Lim, T.; Ritchie, J.M.; Weston, N. SMASH: One-shot model architecture search through hypernetworks. arXiv 2017, arXiv:1708.05344. [Google Scholar]
- Saxena, S.; Verbeek, J. Convolutional neural fabrics. In Proceedings of the Neural Information Processing Systems, Barcelona, Spain, 5–10 December 2016; pp. 4060–4068. [Google Scholar]
- Eberhart, R.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the 6th International Symposium on Micro Machine and Human Science (MHS), Nagoya, Japan, 4–6 October 1995; pp. 39–43. [Google Scholar]
- Sahu, A.; Panigrahi, S.K.; Pattnaik, S. Fast convergence particle swarm optimization for functions optimization. Procedia Technol. 2012, 4, 319–324. [Google Scholar] [CrossRef] [Green Version]
- Chen, Y.; Jiang, H.; Li, C.; Jia, X.; Ghamisi, P. Deep feature extraction and classification of hyperspectral images based on convolutional neural networks. IEEE Trans. Geosci. Remote Sens. 2016, 54, 6232–6251. [Google Scholar] [CrossRef] [Green Version]
- Friedman, M. A comparison of alternative tests of significance for the problem of m rankings. Ann. Math. Stat. 1940, 11, 86–92. [Google Scholar] [CrossRef]
- Nalepa, J.; Myller, M.; Kawulok, M. Validating hyperspectral image segmentation. IEEE Geosci. Remote Sens. Lett. 2019, 16, 1264–1268. [Google Scholar] [CrossRef] [Green Version]
- Cao, X.; Ren, M.; Zhao, J.; Lu, H.; Jiao, L. Non-overlapping classification of hyperspectral imagery based on set-to-sets distance. Neurocomputing 2020, 378, 422–434. [Google Scholar] [CrossRef]
Salinas Valley (Salinas) | Indian Pines | ||||||
---|---|---|---|---|---|---|---|
No. | Class | Samples | Training | No. | Class | Samples | Training |
1 | Broccoli_green_weeds_1 | 2009 | 8 | 1 | Alfalfa | 46 | 1 |
2 | Broccoli_green_weeds_2 | 3726 | 14 | 2 | Corn–notill | 1428 | 28 |
3 | Fallow | 1976 | 8 | 3 | Corn–mintill | 830 | 16 |
4 | Fallow_rough_plow | 1394 | 6 | 4 | Corn | 237 | 4 |
5 | Fallow_smooth | 2678 | 10 | 5 | Grass–pasture | 483 | 10 |
6 | Stubble | 3959 | 14 | 6 | Grass–trees | 730 | 14 |
7 | Celery | 3579 | 14 | 7 | Grass–pasture–mowed | 28 | 1 |
8 | Grapes_untrained | 11,271 | 40 | 8 | Hay–windrowed | 478 | 10 |
9 | Soil_vineyard_develop | 6203 | 22 | 9 | Oats | 20 | 1 |
10 | Corn_senesced_green_weeds | 3278 | 12 | 10 | Soybean–notill | 972 | 18 |
11 | Lettuce_romaine_4wk | 1068 | 4 | 11 | Soybean–mintill | 2455 | 47 |
12 | Lettuce_romaine_5wk | 1927 | 8 | 12 | Soybean–clean | 593 | 12 |
13 | Lettuce_romaine_6wk | 916 | 4 | 13 | Wheat | 205 | 4 |
14 | Lettuce_romaine_7wk | 1070 | 4 | 14 | Woods | 1265 | 24 |
15 | Vineyard_untrained | 7268 | 26 | 15 | Buildings–grass–trees–drives | 386 | 8 |
16 | Vineyard_vertical_trellis | 1807 | 6 | 16 | Stone–steel–towers | 93 | 2 |
Total | 43,980 | 200 | Total | 10,249 | 200 | ||
University of Pavia (Pavia) | Kennedy Space Center (KSC) | ||||||
No. | Class | Samples | Training | No. | Class | Samples | Training |
1 | Asphalt | 6631 | 34 | 1 | Scrub | 761 | 24 |
2 | Meadows | 18,649 | 70 | 2 | Willow swamp | 243 | 10 |
3 | Gravel | 2099 | 12 | 3 | Cp hammock | 256 | 10 |
4 | Trees | 3064 | 14 | 4 | Slash pine | 252 | 10 |
5 | Painted metal sheets | 1345 | 6 | 5 | Oak/broadleaf | 261 | 10 |
6 | Bare soil | 5029 | 30 | 6 | Hardwood | 229 | 10 |
7 | Bitumen | 1330 | 14 | 7 | Swamp | 105 | 6 |
8 | Self-blocking bricks | 3682 | 16 | 8 | Graminoid marsh | 431 | 20 |
9 | Shadows | 947 | 4 | 9 | Spartina marsh | 520 | 20 |
10 | Cattail marsh | 404 | 18 | ||||
11 | Salt marsh | 419 | 18 | ||||
12 | Mud flats | 503 | 20 | ||||
13 | Water | 927 | 24 | ||||
Total | 42,776 | 200 | Total | 5211 | 200 |
Algorithm | Architecture | CNN Training | PSO | |||||
---|---|---|---|---|---|---|---|---|
Parameters | Cells | Nodes | Train_Samples | Valid_Samples | Epochs_Search | Epochs_Test | Pop_Size | Iterations |
Hand-CNNs | / | / | 200 | 600 | / | 100 | / | / |
Auto-CNNs | 3 | 4 | 200 | 600 | / | 100 | / | / |
PSO-Net | 3 | 4 | 200 | 600 | 100 | 100 | 100 | 40 |
CPSO-Net | 3 | 4 | 200 | 600 | 1 | 100 | 100 | 40 |
Method | CNN | SSRN | ResNet | DesNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
OA (%) | 91.79 ± 0.98 | 91.65 ± 2.64 | 88.67 ± 0.65 | 91.86 ± 0.23 | 92.58 ± 0.82 | 94.36 ± 0.55 | 93.96 ± 1.23 |
AA (%) | 84.59 ± 2.97 | 89.00 ± 3.27 | 80.86 ± 1.12 | 84.86 ± 1.49 | 87.75 ± 1.23 | 90.06 ± 1.02 | 88.39 ± 2.11 |
K × 100 | 89.42 ± 1.58 | 89.16 ± 3.37 | 85.60 ± 1.34 | 89.63 ± 1.87 | 90.47 ± 1.30 | 92.68 ± 1.03 | 92.32 ± 1.07 |
Asphalt | 92.11 ± 2.40 | 95.08 ± 2.46 | 91.90 ± 2.48 | 94.32 ± 1.16 | 95.16 ± 1.38 | 95.68 ± 1.35 | 95.31 ± 1.48 |
Meadows | 98.88 ± 0.72 | 96.20 ± 4.20 | 97.16 ± 0.23 | 98.38 ± 0.84 | 98.17 ± 0.79 | 99.05 ± 0.48 | 98.45 ± 0.59 |
Gravel | 59.99 ± 13.85 | 69.92 ± 18.36 | 79.43 ± 8.27 | 81.87 ± 9.48 | 82.65 ± 12.93 | 70.93 ± 14.39 | 84.23 ± 13.21 |
Trees | 93.35 ± 4.35 | 90.84 ± 6.93 | 52.57 ± 12.76 | 49.15 ± 8.82 | 63.49 ± 14.82 | 76.46 ± 6.32 | 71.24 ± 5.92 |
Painted metal sheets | 97.10 ± 4.30 | 99.25 ± 2.01 | 99.33 ± 0.35 | 99.48 ± 0.38 | 96.72 ± 2.49 | 96.45 ± 1.05 | 98.28 ± 3.16 |
Bare Soil | 93.97 ± 3.24 | 89.34 ± 10.88 | 93.42 ± 2.49 | 96.26 ± 2.83 | 98.76 ± 0.38 | 98.84 ± 0.47 | 99.54 ± 0.46 |
Bitumen | 54.23 ± 12.48 | 82.36 ± 14.01 | 87.16 ± 1.84 | 94.24 ± 2.85 | 95.82 ± 1.37 | 97.95 ± 1.25 | 97.43 ± 1.85 |
Self-blocking bricks | 88.77 ± 5.64 | 79.67 ± 18.68 | 95.42 ± 1.56 | 97.80 ± 0.37 | 96.42 ± 2.48 | 97.22 ± 1.37 | 98.42 ± 0.76 |
Shadows | 82.95 ± 21.60 | 98.34 ± 2.11 | 31.55 ± 10.26 | 52.23 ± 11.72 | 62.74 ± 25.64 | 77.02 ± 8.29 | 52.61 ± 16.47 |
Parameters | 141 K | 216.5 K | 25.6 M | 7.3 M | 92.4 K | 85.7 K | 37.7 K |
Method | CNN | SSRN | ResNet | DesNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
OA (%) | 90.68 ± 1.28 | 91.16 ± 1.64 | 91.57 ± 0.74 | 92.97 ± 0.82 | 95.42 ± 1.48 | 96.18 ± 1.08 | 96.64 ± 1.48 |
AA (%) | 88.18 ± 1.25 | 93.59 ± 2.35 | 92.80 ± 1.02 | 94.94 ± 1.14 | 95.37 ± 2.05 | 96.52 ± 1.24 | 97.32 ± 1.24 |
K × 100 | 89.56 ± 1.45 | 90.16 ± 1.82 | 90.51 ± 0.95 | 92.11 ± 0.89 | 94.09 ± 1.85 | 95.84 ± 1.57 | 96.23 ± 1.15 |
Brocoli_green_weeds_1 | 81.76 ± 6.68 | 99.59 ± 0.72 | 98.73 ± 0.36 | 95.44 ± 3.58 | 81.69 ± 8.29 | 84.80 ± 8.35 | 94.67 ± 5.33 |
Brocoli_green_weeds_2 | 94.88 ± 6.07 | 99.54 ± 1.35 | 74.36 ± 0.44 | 89.69 ± 1.45 | 97.52 ± 0.28 | 97.53 ± 0.48 | 96.95 ± 1.21 |
Fallow | 91.35 ± 4.62 | 82.03 ± 20.05 | 100.00 ± 0.00 | 95.11 ± 2.13 | 92.36 ± 1.40 | 94.22 ± 1.37 | 97.68 ± 1.04 |
Fallow_rough_plow | 88.31 ± 7.49 | 98.61 ± 1.57 | 95.42 ± 2.59 | 100.00 ± 0.00 | 93.28 ± 1.27 | 96.45 ± 1.21 | 96.82 ± 1.60 |
Fallow_smooth | 95.60 ± 4.40 | 97.13 ± 3.98 | 96.76 ± 0.33 | 97.31 ± 0.42 | 97.36 ± 0.58 | 97.62 ± 0.70 | 96.93 ± 1.46 |
Stubble | 99.74 ± 0.63 | 99.81 ± 0.49 | 99.36 ± 0.63 | 99.37 ± 0.56 | 99.78 ± 0.17 | 99.82 ± 0.05 | 99.87 ± 0.12 |
Celery | 97.42 ± 2.03 | 99.62 ± 0.58 | 99.45 ± 0.35 | 99.48 ± 0.27 | 99.47 ± 0.12 | 99.49 ± 0.59 | 98.66 ± 0.49 |
Grapes_untrained | 88.48 ± 3.13 | 81.13 ± 8.86 | 96.76 ± 1.42 | 97.16 ± 0.67 | 95.26 ± 1.21 | 96.12 ± 2.48 | 98.03 ± 3.37 |
Soil_vineyard_develop | 98.96 ± 1.70 | 98.96 ± 2.30 | 97.39 ± 1.35 | 99.37 ± 0.54 | 99.36 ± 1.10 | 99.38 ± 0.34 | 95.42 ± 3.91 |
Corn_senesced_green_weeds | 95.20 ± 5.57 | 91.58 ± 3.72 | 93.82 ± 4.48 | 99.21 ± 0.93 | 99.81 ± 0.25 | 99.22 ± 0.64 | 99.86 ± 0.52 |
Lettuce_romaine_4wk | 77.33 ± 8.43 | 83.38 ± 28.67 | 98.02 ± 0.83 | 99.86 ± 0.43 | 98.85 ± 0.12 | 99.92 ± 0.13 | 95.89 ± 1.14 |
Lettuce_romaine_5wk | 89.66 ± 9.04 | 98.72 ± 3.02 | 96.12 ± 0.65 | 99.45 ± 0.57 | 92.63 ± 0.64 | 99.82 ± 1.08 | 99.92 ± 1.89 |
Lettuce_romaine_6wk | 87.92 ± 9.43 | 95.30 ± 5.09 | 78.30 ± 6.34 | 78.81 ± 6.87 | 95.42 ± 0.72 | 96.61 ± 1.46 | 95.42 ± 1.79 |
Lettuce_romaine_7wk | 90.53 ± 8.71 | 96.95 ± 3.15 | 97.02 ± 0.85 | 96.50 ± 1.45 | 93.78 ± 1.03 | 94.77 ± 1.12 | 97.44 ± 3.35 |
Vineyard_untrained | 81.78 ± 11.24 | 79.72 ± 7.47 | 71.30 ± 10.29 | 74.63 ± 8.27 | 95.62 ± 2.34 | 95.89 ± 1.01 | 96.55 ± 1.56 |
Vineyard_vertical_trellis | 61.12 ± 14.38 | 95.37 ± 2.84 | 91.96 ± 2.48 | 97.65 ± 3.81 | 93.79 ± 4.96 | 92.70 ± 5.23 | 97.12 ± 6.67 |
Parameters | 193 K | 370.3 K | 27.5 M | 7.6 M | 101.8 K | 61.6 K | 47.1 K |
Method | CNN | SSRN | ResNet | DesNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
OA (%) | 85.29 ± 1.45 | 80.66 ± 2.91 | 81.61 ± 1.23 | 87.77 ± 1.17 | 88.12 ± 1.42 | 89.52 ± 0.98 | 89.32 ± 1.27 |
AA (%) | 69.94 ± 4.83 | 65.35 ± 6.85 | 66.71 ± 5.64 | 72.13 ± 4.76 | 73.05 ± 6.29 | 74.25 ± 4.04 | 74.42 ± 7.26 |
K × 100 | 82.70 ± 1.75 | 77.83 ± 3.37 | 78.02 ± 1.84 | 85.34 ± 1.49 | 85.87 ± 1.94 | 87.52 ± 1.59 | 87.58 ± 1.53 |
Alfalfa | 35.95 ± 35.52 | 12.46 ± 22.28 | 25.39 ± 46.34 | 46.72 ± 41.47 | 36.63 ± 42.27 | 38.63 ± 45.42 | 38.32 ± 39.62 |
Corn–notill | 82.49 ± 14.94 | 79.05 ± 10.59 | 87.37 ± 4.26 | 96.21 ± 5.36 | 86.02 ± 4.93 | 89.90 ± 2.21 | 89.95 ± 3.83 |
Corn–mintill | 64.88 ± 14.94 | 73.92 ± 16.97 | 62.38 ± 18.94 | 62.86 ± 22.63 | 79.63 ± 19.54 | 79.07 ± 26.98 | 67.46 ± 17.62 |
Corn | 45.67 ± 20.51 | 38.96 ± 21.69 | 51.37 ± 26.43 | 49.06 ± 20.52 | 54.19 ± 38.16 | 72.49 ± 18.64 | 68.27 ± 39.48 |
Grass–pasture | 58.91 ± 16.72 | 75.46 ± 16.00 | 67.39 ± 20.23 | 58.33 ± 23.84 | 74.63 ± 18.92 | 74.82 ± 24.85 | 68.63 ± 29.94 |
Grass–trees | 89.89 ± 8.48 | 94.84 ± 5.20 | 87.57 ± 5.42 | 94.41 ± 4.26 | 94.89 ± 4.37 | 89.63 ± 3.62 | 86.62 ± 7.76 |
Grass–pasture–mowed | 48.26 ± 42.07 | 33.70 ± 43.84 | 18.54 ± 43.36 | 34.83 ± 44.32 | 56.38 ± 45.29 | 69.73 ± 43.74 | 73.84 ± 56.16 |
Hay–windrowed | 92.54 ± 7.28 | 99.53 ± 0.69 | 89.54 ± 4.73 | 92.41 ± 2.62 | 98.05 ± 0.75 | 98.34 ± 1.37 | 96.35 ± 3.64 |
Oats | 24.29 ± 31.89 | 8.48 ± 13.13 | 35.84 ± 35.74 | 52.63 ± 31.41 | 13.86 ± 26.95 | 19.92 ± 43.87 | 21.92 ± 43.82 |
Soybean–notill | 86.89 ± 4.65 | 75.52 ± 8.84 | 80.13 ± 2.65 | 86.35 ± 2.73 | 91.64 ± 3.23 | 90.42 ± 3.25 | 94.87 ± 2.78 |
Soybean–mintill | 92.72 ± 1.57 | 82.76 ± 6.90 | 85.75 ± 5.83 | 90.61 ± 2.84 | 92.76 ± 5.54 | 87.50 ± 7.53 | 94.36 ± 1.35 |
Soybean–clean | 68.58 ± 16.91 | 68.04 ± 19.14 | 71.36 ± 22.75 | 85.32 ± 19.31 | 72.27 ± 21.93 | 71.89 ± 19.21 | 73.05 ± 16.95 |
Wheat | 92.99 ± 13.36 | 95.41 ± 8.34 | 86.73 ± 2.56 | 71.39 ± 3.75 | 94.02 ± 5.85 | 83.21 ± 4.86 | 82.83 ± 8.42 |
Woods | 98.89 ± 1.28 | 96.38 ± 3.15 | 93.79 ± 2.65 | 95.36 ± 2.79 | 97.46 ± 1.38 | 97.62 ± 1.74 | 95.49 ± 1.35 |
Buildings–grass–trees | 75.44 ± 13.80 | 60.64 ± 19.60 | 64.52 ± 32.28 | 71.16 ± 21.84 | 53.72 ± 31.57 | 54.37 ± 27.46 | 68.42 ± 32.49 |
Stone–steel–towers | 57.62 ± 26.55 | 50.40 ± 42.30 | 59.72 ± 32.48 | 66.39 ± 27.48 | 72.68 ± 27.54 | 71.42 ± 16.29 | 70.34 ± 29.67 |
Parameters | 191.4 K | 364.2 K | 27.4 M | 7.6 M | 38.4 K | 53.6 K | 55.1 K |
Method | CNN | SSRN | ResNet | DesNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
OA (%) | 95.98 ± 0.83 | 96.77 ± 0.83 | 95.76 ± 0.72 | 95.92 ± 1.52 | 96.04 ± 0.90 | 97.43 ± 1.13 | 97.56 ± 1.69 |
AA (%) | 94.07 ± 1.39 | 94.70 ± 0.86 | 95.12 ± 1.58 | 94.24 ± 1.46 | 94.01 ± 1.19 | 96.21 ± 0.61 | 96.41 ± 1.24 |
K × 100 | 95.53 ± 0.95 | 96.41 ± 0.93 | 95.28 ± 0.63 | 95.46 ± 0.73 | 95.64 ± 1.10 | 97.07 ± 0.77 | 97.28 ± 1.15 |
Scrub | 99.72 ± 0.29 | 97.72 ± 2.16 | 99.16 ± 1.04 | 97.89 ± 1.78 | 94.42 ± 1.54 | 93.28 ± 2.31 | 96.22 ± 1.37 |
Willow swamp | 82.18 ± 9.58 | 90.35 ± 9.95 | 95.40 ± 3.63 | 83.12 ± 5.02 | 83.50 ± 5.26 | 86.37 ± 6.18 | 88.64 ± 7.31 |
Cp hammock | 92.80 ± 7.60 | 95.88 ± 2.52 | 93.42 ± 2.57 | 90.53 ± 1.48 | 89.84 ± 6.45 | 90.91 ± 5.35 | 94.53 ± 6.34 |
Slash pine | 78.38 ± 7.19 | 91.68 ± 5.97 | 95.23 ± 3.29 | 84.94 ± 3.35 | 95.03 ± 3.91 | 96.18 ± 4.67 | 94.84 ± 5.39 |
Oak/broadleaf | 91.57 ± 7.51 | 81.88 ± 9.97 | 95.76 ± 4.27 | 83.86 ± 1.58 | 85.27 ± 2.57 | 96.38 ± 1.39 | 87.58 ± 2.58 |
Hardwood | 88.02 ± 8.54 | 91.55 ± 8.02 | 97.38 ± 2.56 | 97.96 ± 2.56 | 88.47 ± 4.84 | 96.67 ± 2.12 | 97.55 ± 1.79 |
Swamp | 96.64 ± 7.55 | 89.91 ± 10.76 | 89.27 ± 5.73 | 100.00 ± 0.00 | 98.67 ± 1.46 | 98.00 ± 1.23 | 100.00 ± 0.00 |
Graminoid marsh | 95.81 ± 5.42 | 96.96 ± 4.31 | 84.37 ± 8.26 | 98.12 ± 2.49 | 97.95 ± 2.62 | 98.31 ± 1.08 | 96.00 ± 4.57 |
Spartina marsh | 98.46 ± 3.19 | 88.86 ± 0.70 | 96.23 ± 2.14 | 99.58 ± 1.12 | 100.00 ± 0.00 | 98.23 ± 1.77 | 100.00 ± 0.00 |
Cattail marsh | 100.00 ± 0.00 | 98.16 ± 3.77 | 94.96 ± 1.50 | 93.37 ± 1.48 | 92.45 ± 1.23 | 99.65 ± 0.68 | 98.56 ± 2.49 |
Salt marsh | 100.00 ± 0.00 | 98.28 ± 2.83 | 96.25 ± 1.77 | 96.50 ± 1.94 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 |
Mud flats | 99.33 ± 1.07 | 99.97 ± 1.63 | 99.13 ± 0.92 | 99.25 ± 1.84 | 98.69 ± 1.09 | 99.36 ± 0.69 | 99.44 ± 0.56 |
Water | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 |
Parameters | 178.9 K | 327.2 K | 26.7 M | 7.5 M | 100.4 K | 69.4 K | 60.1 K |
Method | CNN | SSRN | ResNet | DesNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
Ranking (OA) | 2.5 | 2.5 | 1.75 | 3.5 | 4.75 | 6.5 | 6.5 |
Ranking (AA) | 2 | 3.5 | 2.5 | 3.5 | 3.75 | 6.25 | 6.5 |
Ranking (Kappa) | 2.5 | 1.75 | 1.75 | 4 | 5 | 6.5 | 6.5 |
Method | CNN | SSRN | ResNet | DesNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
OA (%) | 83.12 ± 2.69 | 82.46 ± 2.58 | 79.93 ± 0.93 | 82.99 ± 1.32 | 86.05 ± 1.43 | 86.70 ± 1.26 | 87.13 ± 0.76 |
AA (%) | 76.35 ± 2.46 | 78.63 ± 3.59 | 73.82 ± 1.57 | 76.25 ± 1.46 | 79.50 ± 1.78 | 81.46 ± 1.34 | 81.98 ± 1.45 |
K × 100 | 78.42 ± 2.83 | 78.46 ± 3.48 | 73.35 ± 1.86 | 77.33 ± 1.83 | 81.44 ± 1.47 | 81.94 ± 1.39 | 82.92 ± 1.14 |
Asphalt | 79.12 ± 2.69 | 83.45 ± 2.34 | 84.12 ± 2.58 | 86.71 ± 1.78 | 93.99 ± 1.96 | 91.31 ± 1.67 | 94.14 ± 2.79 |
Meadows | 85.24 ± 1.53 | 93.75 ± 2.73 | 94.35 ± 2.21 | 95.50 ± 1.93 | 95.00 ± 1.36 | 95.74 ± 1.38 | 95.70 ± 1.39 |
Gravel | 53.14 ± 16.58 | 59.84 ± 18.43 | 80.37 ± 8.35 | 71.29 ± 9.68 | 51.91 ± 17.43 | 50.17 ± 15.72 | 50.63 ± 13.23 |
Trees | 83.57 ± 5.18 | 82.69 ± 6.54 | 78.45 ± 11.35 | 85.92 ± 8.69 | 87.17 ± 5.82 | 88.60 ± 5.35 | 89.37 ± 5.24 |
Painted metal sheets | 86.95 ± 5.51 | 89.95 ± 2.41 | 74.38 ± 3.15 | 71.76 ± 5.28 | 94.81 ± 3.78 | 99.58 ± 1.05 | 97.34 ± 3.57 |
Bare soil | 62.32 ± 3.58 | 69.78 ± 10.73 | 52.50 ± 2.33 | 52.52 ± 2.56 | 66.44 ± 3.23 | 66.10 ± 4.59 | 67.31 ± 5.35 |
Bitumen | 63.74 ± 16.72 | 62.32 ± 14.42 | 61.77 ± 1.97 | 70.78 ± 2.59 | 57.68 ± 1.37 | 71.26 ± 1.64 | 65.80 ± 1.46 |
Self-blocking bricks | 75.63 ± 7.44 | 71.85 ± 18.51 | 59.82 ± 1.34 | 65.39 ± 1.68 | 72.10 ± 2.31 | 71.73 ± 1.75 | 80.07 ± 2.73 |
Shadows | 97.49 ± 19.27 | 98.84 ± 2.24 | 78.62 ± 11.24 | 86.38 ± 11.72 | 96.45 ± 15.73 | 98.73 ± 7.94 | 97.51 ± 8.47 |
Method | CNN | SSRN | ResNet | DesNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
OA (%) | 80.36 ± 1.69 | 81.21 ± 1.58 | 83.13 ± 0.97 | 81.99 ± 1.46 | 84.41 ± 1.53 | 88.83 ± 1.36 | 89.75 ± 1.72 |
AA (%) | 82.86 ± 1.57 | 84.62 ± 2.58 | 80.37 ± 1.48 | 81.58 ± 1.39 | 87.21 ± 1.83 | 88.90 ± 1.62 | 90.53 ± 1.43 |
K × 100 | 78.62 ± 1.72 | 80.37 ± 1.49 | 81.20 ± 1.24 | 79.91 ± 1.34 | 82.69 ± 1.67 | 87.54 ± 1.38 | 88.58 ± 1.84 |
Brocoli_green_weeds_1 | 92.46 ± 3.49 | 73.63 ± 4.7 | 58.93 ± 3.57 | 38.13 ± 19.45 | 60.22 ± 8.29 | 86.19 ± 8.84 | 97.50 ± 7.39 |
Brocoli_green_weeds_2 | 92.73 ± 6.73 | 91.52 ± 1.58 | 93.03 ± 1.29 | 96.65 ± 1.38 | 98.66 ± 1.52 | 99.55 ± 1.35 | 99.36 ± 1.20 |
Fallow | 86.63 ± 4.68 | 82.84 ± 20.58 | 47.17 ± 10.34 | 79.07 ± 2.47 | 87.06 ± 1.39 | 76.72 ± 1.67 | 84.81 ± 1.38 |
Fallow_rough_plow | 92.52 ± 7.57 | 91.37 ± 1.82 | 93.26 ± 2.93 | 92.61 ± 1.58 | 97.39 ± 1.97 | 96.65 ± 1.29 | 99.60 ± 1.56 |
Fallow_smooth | 87.43 ± 4.83 | 92.59 ± 3.68 | 94.28 ± 1.23 | 93.59 ± 1.48 | 94.58 ± 1.73 | 90.78 ± 3.72 | 92.22 ± 2.39 |
Stubble | 93.52 ± 0.38 | 93.69 ± 0.60 | 97.85 ± 1.46 | 96.84 ± 1.44 | 99.34 ± 1.26 | 98.85 ± 1.37 | 99.37 ± 1.05 |
Celery | 93.34 ± 2.85 | 94.77 ± 0.55 | 97.70 ± 1.26 | 97.52 ± 1.27 | 98.87 ± 1.24 | 99.07 ± 0.95 | 99.82 ± 0.73 |
Grapes_untrained | 70.42 ± 3.58 | 73.37 ± 8.34 | 78.54 ± 1.96 | 75.71 ± 1.84 | 65.87 ± 5.38 | 86.60 ± 2.59 | 84.70 ± 4.45 |
Soil_vineyard_develop | 95.34 ± 1.73 | 94.74 ± 2.37 | 95.10 ± 1.68 | 96.53 ± 1.72 | 97.01 ± 1.37 | 98.64 ± 1.46 | 97.33 ± 3.80 |
Corn_senesced_green_weeds | 72.74 ± 5.79 | 74.64 ± 3.75 | 89.46 ± 2.57 | 87.41 ± 1.25 | 88.36 ± 3.21 | 91.19 ± 2.52 | 89.42 ± 1.26 |
Lettuce_romaine_4wk | 68.63 ± 8.70 | 84.73 ± 28.82 | 38.58 ± 2.67 | 34.33 ± 8.39 | 71.34 ± 1.42 | 63.45 ± 3.73 | 77.94 ± 1.58 |
Lettuce_romaine_5wk | 82.57 ± 9.52 | 88.62 ± 3.47 | 98.37 ± 1.36 | 98.89 ± 1.67 | 97.03 ± 1.45 | 97.50 ± 1.28 | 88.96 ± 6.96 |
Lettuce_romaine_6wk | 84.97 ± 9.23 | 95.30 ± 5.68 | 98.53 ± 6.34 | 98.57 ± 1.52 | 88.77 ± 2.71 | 84.97 ± 3.49 | 88.95 ± 2.02 |
Lettuce_romaine_7wk | 82.47 ± 8.78 | 83.89 ± 3.29 | 70.77 ± 1.46 | 79.33 ± 1.52 | 96.68 ± 1.14 | 93.24 ± 1.38 | 79.60 ± 2.13 |
Vineyard_untrained | 54.69 ± 11.44 | 57.72 ± 7.45 | 72.91 ± 10.58 | 72.76 ± 2.62 | 74.32 ± 2.37 | 68.77 ± 1.48 | 74.45 ± 1.36 |
Vineyard_vertical_trellis | 74.27 ± 14.83 | 79.57 ± 2.85 | 61.53 ± 2.59 | 67.44 ± 4.19 | 79.86 ± 4.85 | 90.24 ± 5.94 | 94.45 ± 6.48 |
Method | CNN | SSRN | ResNet | DenseNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
OA (%) | 58.11 ± 1.94 | 61.83 ± 1.62 | 58.46 ± 1.23 | 58.59 ± 1.46 | 68.23 ± 1.73 | 70.61 ± 0.73 | 69.89 ± 1.48 |
AA (%) | 45.23 ± 6.93 | 49.24 ± 6.84 | 40.57 ± 6.64 | 47.28 ± 7.37 | 54.63 ± 6.52 | 56.47 ± 5.73 | 60.80 ± 7.42 |
K × 100 | 52.14 ± 1.46 | 55.25 ± 3.47 | 51.35 ± 1.84 | 52.77 ± 2.74 | 63.55 ± 1.42 | 66.90 ± 1.27 | 66.79 ± 1.73 |
Class | CNN | SSRN | ResNet | DenseNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
Alfalfa | 13.55 ± 35.45 | 12.78 ± 22.61 | 13.95 ± 42.35 | 20.23 ± 43.28 | 14.78 ± 49.57 | 21.74 ± 43.63 | 24.78 ± 39.35 |
Corn-notill | 32.47 ± 14.73 | 35.84 ± 10.48 | 34.70 ± 3.73 | 24.93 ± 17.53 | 53.29 ± 4.93 | 50.83 ± 5.52 | 65.20 ± 3.53 |
Corn-mintill | 40.54 ± 14.48 | 42.53 ± 16.68 | 26.06 ± 29.25 | 34.55 ± 13.63 | 40.40 ± 29.52 | 49.75 ± 26.42 | 51.59 ± 17.24 |
Corn | 5.41 ± 20.53 | 30.96 ± 21.41 | 15.13 ± 12.25 | 37.39 ± 29.53 | 34.46 ± 38.74 | 53.08 ± 24.65 | 40.17 ± 39.51 |
Grass-pasture | 55.74 ± 16.53 | 57.96 ± 16.38 | 31.89 ± 32.46 | 65.19 ± 17.45 | 76.42 ± 18.27 | 68.62 ± 23.53 | 81.86 ± 29.96 |
Grass-trees | 88.52 ± 8.92 | 82.94 ± 5.79 | 95.06 ± 3.43 | 97.19 ± 1.37 | 90.04 ± 4.74 | 92.88 ± 3.74 | 92.77 ± 7.82 |
Grass-pasture-mowed | 0.00 ± 0.00 | 8.92 ± 43.92 | 0.00 ± 0.00 | 0.00 ± 0.00 | 8.58 ± 45.93 | 2.84 ± 13.36 | 16.42 ± 56.51 |
Hay-windrowed | 89.92 ± 7.71 | 93.74 ± 1.79 | 86.06 ± 1.85 | 96.06 ± 1.94 | 97.88 ± 1.79 | 95.49 ± 1.52 | 90.04 ± 3.84 |
Oats | 0.00 ± 0.00 | 1.68 ± 13.68 | 0.00 ± 0.00 | 0.00 ± 0.00 | 8.80 ± 36.95 | 0.00 ± 0.00 | 22.00 ± 23.20 |
Soybean-notill | 49.62 ± 4.48 | 57.05 ± 8.64 | 25.35 ± 4.75 | 36.95 ± 7.43 | 75.41 ± 3.67 | 64.65 ± 4.52 | 65.33 ± 2.83 |
Soybean-mintill | 69.62 ± 1.70 | 69.84 ± 6.68 | 78.70 ± 3.85 | 66.44 ± 9.42 | 75.32 ± 5.52 | 81.96 ± 7.94 | 66.91 ± 1.37 |
Soybean-clean | 31.31 ± 16.49 | 33.94 ± 19.54 | 24.47 ± 25.31 | 46.13 ± 13.35 | 38.20 ± 21.64 | 37.73 ± 19.47 | 40.45 ± 16.93 |
Wheat | 78.51 ± 13.72 | 78.81 ± 8.89 | 54.03 ± 4.49 | 69.40 ± 7.42 | 42.54 ± 5.52 | 75.41 ± 4.63 | 83.32 ± 8.62 |
Woods | 88.52 ± 1.69 | 85.06 ± 3.19 | 87.56 ± 2.57 | 85.54 ± 1.82 | 87.88 ± 1.31 | 95.96 ± 1.42 | 92.76 ± 1.73 |
Buildings-grass-trees | 23.42 ± 13.62 | 30.85 ± 19.28 | 30.13 ± 23.17 | 31.78 ± 27.48 | 48.92 ± 31.62 | 41.54 ± 27.96 | 54.71 ± 27.63 |
Stone-steel-towers | 55.41 ± 26.37 | 64.96 ± 42.35 | 46.03 ± 23.48 | 42.70 ± 21.47 | 73.76 ± 27.26 | 71.12 ± 16.93 | 94.62 ± 22.36 |
Method | CNN | SSRN | ResNet | DenseNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
OA (%) | 68.46 ± 1.56 | 69.52 ± 1.72 | 68.19 ± 1.83 | 73.57 ± 1.74 | 74.41 ± 1.37 | 82.50 ± 1.47 | 83.77 ± 1.74 |
AA (%) | 65.46 ± 1.46 | 68.35 ± 1.47 | 59.59 ± 1.94 | 64.10 ± 1.64 | 66.41 ± 1.76 | 74.98 ± 1.85 | 76.55 ± 1.87 |
K × 100 | 64.31 ± 1.68 | 68.61 ± 1.41 | 64.01 ± 1.84 | 70.65 ± 1.79 | 71.46 ± 1.86 | 80.49 ± 1.28 | 81.96 ± 1.85 |
Class | CNN | SSRN | ResNet | DenseNet | Auto-CNN | PSO-Net | CPSO-Net |
---|---|---|---|---|---|---|---|
Scrub | 91.72 ± 2.31 | 93.72 ± 2.57 | 69.69 ± 1.64 | 79.61 ± 1.94 | 91.11 ± 1.84 | 98.40 ± 1.49 | 96.54 ± 2.97 |
Willow swamp | 68.18 ± 9.35 | 71.28 ± 9.51 | 37.73 ± 3.74 | 33.76 ± 5.64 | 40.41 ± 5.75 | 76.63 ± 7.96 | 80.25 ± 6.24 |
Cp hammock | 70.80 ± 7.04 | 72.94 ± 2.79 | 73.86 ± 2.92 | 73.44 ± 2.53 | 65.69 ± 6.74 | 74.04 ± 6.87 | 73.71 ± 5.29 |
Slash pine | 23.24 ± 7.52 | 26.90 ± 5.47 | 20.88 ± 10.38 | 32.26 ± 12.45 | 27.62 ± 11.63 | 38.02 ± 15.37 | 56.42 ± 4.73 |
Oak/broadleaf | 43.84 ± 7.15 | 46.67 ± 4.73 | 43.69 ± 4.90 | 36.85 ± 24.74 | 63.81 ± 2.47 | 42.86 ± 8.95 | 39.38 ± 11.33 |
Hardwood | 22.29 ± 5.49 | 21.72 ± 6.68 | 20.63 ± 7.92 | 23.50 ± 5.63 | 52.40 ± 4.62 | 63.24 ± 1.36 | 52.21 ± 2.73 |
Swamp | 62.69 ± 7.69 | 69.27 ± 5.36 | 61.30 ± 3.17 | 68.36 ± 2.65 | 70.29 ± 1.35 | 76.38 ± 3.52 | 62.67 ± 1.48 |
Graminoid marsh | 75.48 ± 5.73 | 76.40 ± 4.38 | 75.56 ± 8.25 | 78.14 ± 1.45 | 45.86 ± 12.65 | 56.60 ± 6.42 | 83.26 ± 1.34 |
Spartina marsh | 59.91 ± 3.29 | 64.92 ± 1.52 | 58.60 ± 2.47 | 66.45 ± 1.86 | 95.77 ± 1.42 | 98.38 ± 1.69 | 93.42 ± 1.83 |
Cattail marsh | 76.15 ± 2.83 | 76.31 ± 3.27 | 71.47 ± 1.82 | 78.01 ± 1.89 | 41.59 ± 7.73 | 70.89 ± 4.84 | 84.00 ± 2.63 |
Salt marsh | 81.16 ± 2.37 | 86.27 ± 2.63 | 82.62 ± 1.94 | 87.41 ± 1.28 | 97.42 ± 1.45 | 96.43 ± 1.95 | 98.52 ± 1.85 |
Mud flats | 80.82 ± 1.48 | 82.82 ± 1.47 | 58.75 ± 1.82 | 80.73 ± 1.38 | 72.73 ± 1.09 | 84.21 ± 1.76 | 83.82 ± 3.43 |
Water | 94.72 ± 3.52 | 99.38 ± 1.52 | 99.89 ± 1.23 | 94.78 ± 2.53 | 98.64 ± 1.24 | 98.66 ± 1.96 | 90.95 ± 1.74 |
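The PSO-Net and CPSO-Net columns in the tables above are architectures found by particle swarm optimization. For readers unfamiliar with the optimizer underlying both methods, the following is a minimal sketch of the canonical global-best PSO update (velocity and position), demonstrated on a toy continuous function. The parameter values (w = 0.7, c1 = c2 = 1.5, swarm size, bounds) are common defaults chosen for illustration, not the settings used in this paper, and the sphere objective stands in for the classification-accuracy fitness the paper actually optimizes.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(f, dim, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Canonical global-best PSO minimizing f over a box-constrained domain."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # particle positions
    v = np.zeros((n_particles, dim))                   # particle velocities
    pbest = x.copy()                                   # personal best positions
    pbest_f = np.array([f(p) for p in x])              # personal best fitness
    g = pbest[np.argmin(pbest_f)].copy()               # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                     # position update
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# Toy usage: minimize the sphere function sum(p_i^2), optimum 0 at the origin.
best, val = pso_minimize(lambda p: np.sum(p**2), dim=3)
```

In the paper's setting, each particle position instead encodes a candidate cell architecture, and evaluating `f` means training (PSO-Net) or inheriting SuperNet weights for (CPSO-Net) the decoded network and measuring its accuracy.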
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Liu, X.; Zhang, C.; Cai, Z.; Yang, J.; Zhou, Z.; Gong, X. Continuous Particle Swarm Optimization-Based Deep Learning Architecture Search for Hyperspectral Image Classification. Remote Sens. 2021, 13, 1082. https://doi.org/10.3390/rs13061082