Deep and Machine Learning Image Classification of Coastal Wetlands Using Unpiloted Aircraft System Multispectral Images and Lidar Datasets
"> Figure 1
<p>Workflow recommended in this paper for traditional machine learning techniques and deep learning algorithms.</p> "> Figure 2
<p>Study site, located at Wolf Branch creek coastal nature preserve in West Florida.</p> "> Figure 3
<p>Mangrove species in the area and black short mangrove.</p> "> Figure 4
<p>Native grass species naturally occurring as well as used in coastal restoration.</p> "> Figure 5
<p>Invasive species present in the Wolf Branch Creek nature reserve.</p> "> Figure 6
<p>Study area preparation and data collection (<b>A</b>,<b>B</b>) Ground control points close up views, (<b>C</b>) Ground control points distribution within the study area, enumerated from M1 to M12, (<b>D</b>) GNSS survey, receiver occupying a ground control point, (<b>E</b>) Micasense RedEdge-MX sensor mounted on the “Inspire 2” UAS.</p> "> Figure 7
<p>Study area UAS imagery. (<b>A</b>) Natural color visualization of the study area; the area demarcated by a blue rectangle is displayed in (<b>B</b>,<b>C</b>), (<b>B</b>) Zoomed-in color infrared visualization of area demarcated in (<b>A</b>), (<b>C</b>) Zoomed-in natural color visualization of the demarcated area in (<b>A</b>).</p> "> Figure 8
<p>Image segmentation results (<b>A</b>) polygons delimitating the segmented objects (<b>B</b>) RGB orthoimage, (<b>C</b>) polygons used as training samples during the model training process.</p> "> Figure 9
<p>Training tiles (<b>A</b>) mask showing the three classes identified and labeled in (<b>B</b>) the sample image tile. Class 0 represents the background.</p> "> Figure 10
<p>Majority filter applied to a classified image. (<b>A</b>) Classified image before filtering. (<b>B</b>) Segmented objects used to apply the majority filter. (<b>C</b>) Filtered image produced by labeling each object by its majority class.</p> "> Figure 11
<p>Vegetation maps corresponding to the highest overall accuracies achieved by each of the four classification algorithms for each experiment.</p> "> Figure 12
<p>Overall accuracy of all conducted trials.</p> "> Figure 13
<p>F1 scores organized by experiment, classification method, and band combination, classes 1 to 9.</p> "> Figure 14
<p>F1 scores organized by experiment, classification method and band combination, classes 10 to 17.</p> "> Figure A1
<p>Experiment: EXP1-OSB, producer and user accuracy organized by class and classification method, classes 1 to 6.</p> "> Figure A2
<p>Experiment: EXP1-OSB, producer and user accuracy organized by class and classification method, classes 7 to 12.</p> "> Figure A3
<p>Experiment: EXP1-OSB, producer and user accuracy organized by class and classification method, classes 13 to 17.</p> "> Figure A4
<p>Experiment: EXP2-SB_AB_CHM, producer and user accuracy organized by class and classification method, classes 1 to 6.</p> "> Figure A5
<p>Experiment: EXP2-SB_AB_CHM, producer and user accuracy organized by class and classification method, classes 7 to 12.</p> "> Figure A6
<p>Experiment: EXP2-SB_AB_CHM, producer and user accuracy organized by class and classification method, classes 13 to 17.</p> "> Figure A7
<p>Experiment: EXP3-SB_ UAS_CHM, producer and user accuracy organized by class and classification method., classes 1 to 6.</p> "> Figure A8
<p>Experiment: EXP3-SB_ UAS_CHM, producer and user accuracy organized by class and classification method., classes 7 to 12.</p> "> Figure A9
<p>Experiment: EXP3-SB_ UAS_CHM, producer and user accuracy organized by class and classification method., classes 13 to 17.</p> "> Figure A10
<p>Confusion matrix for the best results obtained in Exp1-OSB, using the Random Forest classifier (the Blue_Green_Red_RE_NIR band combination).</p> "> Figure A11
<p>Confusion matrix for the best results obtained in Exp1-OSB, using the Support Vector Machine classifier (the Blue_Green_Red_RE_NIR band combination).</p> "> Figure A12
<p>Confusion matrix for the best results obtained in Exp1-OSB, using the Deep Learning classifier with the DeepLabV3 architecture (the Blue_Green_Red band combination).</p> "> Figure A13
<p>Confusion matrix for the best results obtained in Exp1-OSB, using the Deep Learning classifier with the U-Net architecture (the Blue_Green_Red_RE band combination).</p> "> Figure A14
<p>Confusion matrix for the best results obtained in Exp2-SB_AB_CHM, using the Random Forest classifier (the Blue_Green_Red_NIR_CH band combination).</p> "> Figure A15
<p>Confusion matrix for the best results obtained in Exp2-SB_AB_CHM, using the Support Vector Machine classifier (the Blue_Green_Red_RE_NIR_CH band combination).</p> "> Figure A16
<p>Confusion matrix for the best results obtained in Exp2-SB_AB_CHM, using the Deep Learning classifier with the DeepLabV3 architecture (the Green_Red_RE_CH band combination).</p> "> Figure A17
<p>Confusion matrix for the best results obtained in Exp2-SB_AB_CHM, using the Deep Learning classifier with the U-Net architecture (the Green_Red_RE_CH band combination).</p> "> Figure A18
<p>Confusion matrix for the best results obtained in Exp3-SB_ UAS_CHM, using the Random Forest classifier (the Blue_Green_Red_RE_NIR_CH band combination).</p> "> Figure A19
<p>Confusion matrix for the best results obtained in Exp3-SB_ UAS_CHM, using the Support Vector Machine classifier (the Blue_Green_Red_RE_NIR_CH band combination).</p> "> Figure A20
<p>Confusion matrix for the best results obtained in Exp3-SB_ UAS_CHM, using the DeepLabV3 architecture (the Blue_Green_Red_RE_CH band combination).</p> "> Figure A21
<p>Confusion matrix for the best results obtained in Exp3-SB_ UAS_CHM, using the Deep Learning classifier with the U-Net architecture (the Blue_Green_Red_CH band combination).</p> ">
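Figure 10 describes the object-based majority filter used for post-classification smoothing: every pixel inside a segmented object is relabeled with the object's most frequent class. A minimal NumPy sketch of that idea follows (our illustration, not the authors' code; array names are hypothetical):

```python
import numpy as np

def majority_filter(classified, segments):
    """Relabel every pixel of each segment with the segment's majority class.

    classified: 2-D array of per-pixel class codes.
    segments:   2-D array of segment (object) IDs, same shape.
    """
    filtered = classified.copy()
    for seg_id in np.unique(segments):
        mask = segments == seg_id
        classes, counts = np.unique(classified[mask], return_counts=True)
        filtered[mask] = classes[np.argmax(counts)]  # majority class wins
    return filtered

# Toy example: segment 0 holds pixels [1, 1, 2], so it becomes all 1s.
classified = np.array([[1, 1, 2], [2, 2, 2]])
segments   = np.array([[0, 0, 0], [1, 1, 1]])
print(majority_filter(classified, segments))  # [[1 1 1] [2 2 2]]
```

On large rasters a vectorized grouping of pixels by segment ID would avoid the per-segment Python loop, but the logic is the same.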
Abstract
1. Introduction
Related Works
2. Materials and Methods
2.1. Study Area
2.2. Field Data Acquisition
2.3. Preprocessing of Image and Lidar Datasets
2.4. Band Combinations
2.5. Training Dataset Preparation
2.6. Data Training and Classification
- Machine Learning Classifiers:
- Deep Learning Architectures:
2.7. Post-Classification Filtering and Accuracy Assessment
3. Results
3.1. Overall Accuracy
3.2. Effect of Canopy Height Models
3.3. Invasive Plants Classification
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix A.1. Experiment: EXP1-OSB, Producer and User Accuracy Organized by Class and Classification Method
Appendix A.2. Experiment: EXP2-SB_AB_CHM, Producer and User Accuracy Organized by Class and Classification Method
Appendix A.3. Experiment: EXP3-SB_UAS_CHM, Producer and User Accuracy Organized by Class and Classification Method
Appendix B
Appendix B.1. Confusion Matrices for the Highest Accuracy Results by Classification Method. Experiment 1
Appendix B.2. Confusion Matrices for the Highest Accuracy Results by Classification Method. Experiment 2
Appendix B.3. Confusion Matrices for the Highest Accuracy Results by Classification Method. Experiment 3
References
- Mendelssohn, I.A.; Byrnes, M.R.; Kneib, R.T.; Vittor, B.A. Coastal Habitats of the Gulf of Mexico. In Habitats and Biota of the Gulf of Mexico: Before the Deepwater Horizon Oil Spill; Springer: New York, NY, USA, 2017; pp. 359–640. [Google Scholar] [CrossRef]
- Lefcheck, J.S.; Hughes, B.B.; Johnson, A.J.; Pfirrmann, B.W.; Rasher, D.B.; Smyth, A.R.; Williams, B.L.; Beck, M.W.; Orth, R.J. Are coastal habitats important nurseries? A meta-analysis. Conserv. Lett. 2019, 12, e12645. [Google Scholar] [CrossRef]
- Florida Department of Environmental Protection. Benefits of Mangroves. 13 April 2022. Available online: https://floridadep.gov/water/submerged-lands-environmental-resources-coordination/content/what-mangrove (accessed on 3 August 2022).
- Elisha, O.D.; Felix, M.J. Destruction of Coastal Ecosystems and The Vicious Cycle of Poverty in Niger Delta Region. 2021. Available online: https://www.ikppress.org/index.php/JOGAE/article/view/6602 (accessed on 3 August 2022).
- Kirwan, M.L.; Megonigal, J.P. Tidal wetland stability in the face of human impacts and sea-level rise. Nature 2013, 504, 53–60. [Google Scholar] [CrossRef] [PubMed]
- van Asselen, S.; Verburg, P.H.; Vermaat, J.E.; Janse, J.H. Drivers of Wetland Conversion: A Global Meta-Analysis. PLoS ONE 2013, 8, e81292. [Google Scholar] [CrossRef] [PubMed]
- Lotze, H.K.; Lenihan, H.S.; Bourque, B.J.; Bradbury, R.H.; Cooke, R.G.; Kay, M.C.; Kidwell, S.M.; Kirby, M.X.; Peterson, C.H.; Jackson, J.B.C. Depletion, Degradation, and Recovery Potential of Estuaries and Coastal Seas. Science 2006, 312, 1806–1809. [Google Scholar] [CrossRef] [PubMed]
- Syvitski, J.P.M.; Vörösmarty, C.J.; Kettner, A.J.; Green, P. Impact of Humans on the Flux of Terrestrial Sediment to the Global Coastal Ocean. Science 2005, 308, 376–380. [Google Scholar] [CrossRef] [PubMed]
- McCarthy, M.J.; Colna, K.E.; El-Mezayen, M.M.; Laureano-Rosario, A.; Méndez-Lázaro, P.; Otis, D.B.; Toro-Farmer, G.; Vega-Rodriguez, M.; Muller-Karger, F.E. Satellite Remote Sensing for Coastal Management: A Review of Successful Applications. Environ. Manag. 2017, 60, 323–339. [Google Scholar] [CrossRef] [PubMed]
- Staver, L.W.; Stevenson, J.C.; Cornwell, J.C.; Nidzieko, N.J.; Owens, M.S.; Logan, L.; Kim, C.; Malkin, S.Y. Tidal Marsh Restoration at Poplar Island: II. Elevation Trends, Vegetation Development, and Carbon Dynamics. Wetlands 2020, 40, 1687–1701. [Google Scholar] [CrossRef]
- Bull, D.; Lim, N.; Frank, E. Perceptual improvements for Super-Resolution of Satellite Imagery. In Proceedings of the 2021 36th International Conference on Image and Vision Computing New Zealand (IVCNZ), Hamilton, New Zealand, 9–10 December 2021; pp. 1–6. [Google Scholar] [CrossRef]
- Gray, P.; Ridge, J.T.; Poulin, S.K.; Seymour, A.C.; Schwantes, A.M.; Swenson, J.J.; Johnston, D.W. Integrating Drone Imagery into High Resolution Satellite Remote Sensing Assessments of Estuarine Environments. Remote Sens. 2018, 10, 1257. [Google Scholar] [CrossRef]
- Jeziorska, J. UAS for Wetland Mapping and Hydrological Modeling. Remote Sens. 2019, 11, 1997. [Google Scholar] [CrossRef]
- Griffin, S.; Lasko, K. Using Unmanned Aircraft System (UAS) and Satellite Imagery to Map Aquatic and Terrestrial Vegetation. 2020. Available online: https://doi.org/10.21079/11681/38086 (accessed on 3 August 2022). [CrossRef]
- Thomas, O.; Stallings, C.; Wilkinson, B. Unmanned aerial vehicles can accurately, reliably, and economically compete with terrestrial mapping methods. J. Unmanned Veh. Syst. 2020, 8, 57–74. [Google Scholar] [CrossRef]
- Addo, K.A.; Jayson-Quashigah, P.-N. UAV photogrammetry and 3D reconstruction: Application in coastal monitoring. In Unmanned Aerial Systems; Elsevier: Amsterdam, The Netherlands, 2021; pp. 157–174. [Google Scholar] [CrossRef]
- Makri, D.; Stamatis, P.; Doukari, M.; Papakonstantinou, A.; Vasilakos, C.; Topouzelis, K. Multi-Scale Seagrass Mapping in Satellite Data and the Use of UAS in Accuracy Assessment. 2018. Available online: https://www.spiedigitallibrary.org/conference-proceedings-of-spie/10773/107731T/Multi-scale-seagrass-mapping-in-satellite-data-and-the-use/10.1117/12.2326012.short (accessed on 3 August 2022).
- Suo, C.; McGovern, E.; Gilmer, A. Coastal Dune Vegetation Mapping Using a Multispectral Sensor Mounted on an UAS. Remote Sens. 2019, 11, 1814. [Google Scholar] [CrossRef]
- Pinton, D.; Canestrelli, A.; Angelini, C.; Wilkinson, B.; Ifju, P. Inferring the Spatial Distribution of Vegetation Height and Density in a Mesotidal Salt Marsh From UAV LIDAR Data. AGU Fall Meeting Abstracts. 2019. Available online: https://ui.adsabs.harvard.edu/abs/2019AGUFMEP11E2069P/abstract (accessed on 3 August 2022).
- Turner, I.L.; Harley, M.D.; Drummond, C.D. UAVs for coastal surveying. Coast. Eng. 2016, 114, 19–24. [Google Scholar] [CrossRef]
- Topouzelis, K.; Papakonstantinou, A.; Pavlogeorgatos, G. Coastline Change Detection Using Uav, Remote Sensing, GIS and 3D Reconstruction. 2015. Available online: https://d1wqtxts1xzle7.cloudfront.net/47781906/Abstract_CEMEPE_2015-with-cover-page-v2.pdf?Expires=1659623181&Signature=AYcxjwH4PeKXeJ3LopMhsWGgVNeIg8l6~ttUCQYXx-9L3GNFJFawP1dlc7gJNxrAogPZbeuX9BsTA~kWVVlIka-Z4tDsy6GXI24TfEwKmEAUfkRoIO-wIZJvY4SV6KNCkai7nQY65TjzdPaL2Xk4O6wwMwm8p8cjqOvEoXVLYQ2s99pYVa3tHXPRbkxXbo9zzmPy7epuk1XurWbQj25M1UKrSkLJo2lvgRvdx8Ye2YJAdepB2-cJQCv5MjnDUJjozRhXo0Css4fBlLjCDufR6FFr-cFOBuDgfzPdUGjivswdJt9z-ddfvjIquENbv6hul4-5qQkj2sF5mK2mIEmY3g__&Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA (accessed on 3 August 2022).
- Mancini, F.; Dubbini, M.; Gattelli, M.; Stecchi, F.; Fabbri, S.; Gabbianelli, G. Using Unmanned Aerial Vehicles (UAV) for High-Resolution Reconstruction of Topography: The Structure from Motion Approach on Coastal Environments. Remote Sens. 2013, 5, 6880–6898. [Google Scholar] [CrossRef]
- Long, N.; Millescamps, B.; Guillot, B.; Pouget, F.; Bertin, X. Monitoring the Topography of a Dynamic Tidal Inlet Using UAV Imagery. Remote Sens. 2016, 8, 387. [Google Scholar] [CrossRef]
- Ventura, D.; Bruno, M.; Lasinio, G.J.; Belluscio, A.; Ardizzone, G. A low-cost drone based application for identifying and mapping of coastal fish nursery grounds. Estuar. Coast. Shelf Sci. 2016, 171, 85–98. [Google Scholar] [CrossRef]
- Casella, E.; Collin, A.; Harris, D.; Ferse, S.; Bejarano, S.; Parravicini, V.; Hench, J.L.; Rovere, A. Mapping coral reefs using consumer-grade drones and structure from motion photogrammetry techniques. Coral Reefs 2017, 36, 269–275. [Google Scholar] [CrossRef]
- Cao, J.; Liu, K.; Zhuo, L.; Liu, L.; Zhu, Y.; Peng, L. Combining UAV-based hyperspectral and LiDAR data for mangrove species classification using the rotation forest algorithm. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102414. [Google Scholar] [CrossRef]
- Barbour, T.E.; Sassaman, K.E.; Zambrano, A.M.A.; Broadbent, E.N.; Wilkinson, B.; Kanaski, R. Rare pre-Columbian settlement on the Florida Gulf Coast revealed through high-resolution drone LiDAR. Proc. Natl. Acad. Sci. USA 2019, 116, 23493–23498. [Google Scholar] [CrossRef] [PubMed]
- Pinton, D.; Canestrelli, A.; Wilkinson, B.; Ifju, P.; Ortega, A. A new algorithm for estimating ground elevation and vegetation characteristics in coastal salt marshes from high-resolution UAV-based LiDAR point clouds. Earth Surf. Processes Landf. 2020, 45, 3687–3701. [Google Scholar] [CrossRef]
- Zhu, X.; Meng, L.; Zhang, Y.; Weng, Q.; Morris, J. Tidal and Meteorological Influences on the Growth of Invasive Spartina alterniflora: Evidence from UAV Remote Sensing. Remote Sens. 2019, 11, 1208. [Google Scholar] [CrossRef]
- Abeysinghe, T.; Simic Milas, A.; Arend, K.; Hohman, B.; Reil, P.; Gregory, A.; Vázquez-Ortega, A. Mapping Invasive Phragmites australis in the Old Woman Creek Estuary Using UAV Remote Sensing and Machine Learning Classifiers. Remote Sens. 2019, 11, 1380. [Google Scholar] [CrossRef]
- Baena, S.; Moat, J.; Whaley, O.; Boyd, D.S. Identifying species from the air: UAVs and the very high resolution challenge for plant conservation. PLoS ONE 2017, 12, e0188714. [Google Scholar] [CrossRef] [PubMed]
- Pix4D. Reflectance Map vs Orthomosaic. 2022. Available online: https://support.pix4d.com/hc/en-us/articles/202739409-Reflectance-map-vs-orthomosaic (accessed on 3 August 2022).
- Liu, T.; Abd-Elrahman, A.; Morton, J.; Wilhelm, V.L. Comparing fully convolutional networks, random forest, support vector machine, and patch-based deep convolutional neural networks for object-based wetland mapping using images from small unmanned aircraft system. GIScience Remote Sens. 2018, 55, 243–264. [Google Scholar] [CrossRef]
- Liu, T.; Abd-Elrahman, A. An Object-Based Image Analysis Method for Enhancing Classification of Land Covers Using Fully Convolutional Networks and Multi-View Images of Small Unmanned Aerial System. Remote Sens. 2018, 10, 457. [Google Scholar] [CrossRef]
- Chust, G.; Galparsoro, I.; Borja, Á.; Franco, J.; Uriarte, A. Coastal and estuarine habitat mapping, using LIDAR height and intensity and multi-spectral imagery. Estuar. Coast. Shelf Sci. 2008, 78, 633–643. [Google Scholar] [CrossRef]
- Schmid, K.A.; Hadley, B.C.; Wijekoon, N. Vertical Accuracy and Use of Topographic LIDAR Data in Coastal Marshes. J. Coast. Res. 2011, 275, 116–132. [Google Scholar] [CrossRef]
- Green, D.R.; Gregory, B.J.; Karachok, A.R. Unmanned Aerial Remote Sensing; CRC Press: Boca Raton, FL, USA, 2020. [Google Scholar] [CrossRef]
- Adade, R.; Aibinu, A.M.; Ekumah, B.; Asaana, J. Unmanned Aerial Vehicle (UAV) applications in coastal zone management—A review. Environ. Monit. Assess. 2021, 193, 154. [Google Scholar] [CrossRef]
- Zhu, X.; Hou, Y.; Weng, Q.; Chen, L. Integrating UAV optical imagery and LiDAR data for assessing the spatial relationship between mangrove and inundation across a subtropical estuarine wetland. ISPRS J. Photogramm. Remote Sens. 2019, 149, 146–156. [Google Scholar] [CrossRef]
- Xu, X.; Song, X.; Li, T.; Shi, Z.; Pan, B. Deep Autoencoder for Hyperspectral Unmixing via Global-Local Smoothing. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–16. [Google Scholar] [CrossRef]
- Sheykhmousa, M.; Mahdianpari, M.; Ghanbari, H.; Mohammadimanesh, F.; Ghamisi, P.; Homayouni, S. Support Vector Machine Versus Random Forest for Remote Sensing Image Classification: A Meta-Analysis and Systematic Review. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 6308–6325. [Google Scholar] [CrossRef]
- Pan, B.; Qu, Q.; Xu, X.; Shi, Z. Structure–Color Preserving Network for Hyperspectral Image Super-Resolution. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–12. [Google Scholar] [CrossRef]
- Kwenda, C.; Gwetu, M.; Dombeu, J.V.F. Machine Learning Methods for Forest Image Analysis and Classification: A Survey of the State of the Art. IEEE Access 2022, 10, 45290–45316. [Google Scholar] [CrossRef]
- Koetz, B.; Morsdorf, F.; van der Linden, S.; Curt, T.; Allgöwer, B. Multi-source land cover classification for forest fire management based on imaging spectrometry and LiDAR data. For. Ecol. Manag. 2008, 256, 263–271. [Google Scholar] [CrossRef]
- Gigović, L.; Pourghasemi, H.R.; Drobnjak, S.; Bai, S. Testing a New Ensemble Model Based on SVM and Random Forest in Forest Fire Susceptibility Assessment and Its Mapping in Serbia’s Tara National Park. Forests 2019, 10, 408. [Google Scholar] [CrossRef]
- Ahmad, A.M.; Minallah, N.; Ahmed, N.; Ahmad, A.M.; Fazal, N. Remote Sensing Based Vegetation Classification Using Machine Learning Algorithms. In Proceedings of the 2019 International Conference on Advances in the Emerging Computing Technologies (AECT), Al Madinah Al Munawwarah, Saudi Arabia, 10 February 2020; pp. 1–6. [Google Scholar] [CrossRef]
- Wang, H.; Han, D.; Mu, Y.; Jiang, L.; Yao, X.; Bai, Y.; Lu, Q.; Wang, F. Landscape-level vegetation classification and fractional woody and herbaceous vegetation cover estimation over the dryland ecosystems by unmanned aerial vehicle platform. Agric. For. Meteorol 2019, 278, 107665. [Google Scholar] [CrossRef]
- Ahmed, O.S.; Shemrock, A.; Chabot, D.; Dillon, C.; Williams, G.; Wasson, R.; Franklin, S.E. Hierarchical land cover and vegetation classification using multispectral data acquired from an unmanned aerial vehicle. Int. J. Remote Sens. 2017, 38, 2037–2052. [Google Scholar] [CrossRef]
- Erinjery, J.J.; Singh, M.; Kent, R. Mapping and assessment of vegetation types in the tropical rainforests of the Western Ghats using multispectral Sentinel-2 and SAR Sentinel-1 satellite imagery. Remote Sens. Environ. 2018, 216, 345–354. [Google Scholar] [CrossRef]
- Macintyre, P.D.; van Niekerk, A.; Dobrowolski, M.P.; Tsakalos, J.L.; Mucina, L. Impact of ecological redundancy on the performance of machine learning classifiers in vegetation mapping. Ecol. Evol. 2018, 8, 6728–6737. [Google Scholar] [CrossRef]
- Rommel, E.; Giese, L.; Fricke, K.; Kathöfer, F.; Heuner, M.; Mölter, T.; Deffert, P.; Asgari, M.; Näthe, P.; Dzunic, F.; et al. Very High-Resolution Imagery and Machine Learning for Detailed Mapping of Riparian Vegetation and Substrate Types. Remote Sens. 2022, 14, 954. [Google Scholar] [CrossRef]
- Zagajewski, B.; Kluczek, M.; Raczko, E.; Njegovec, A.; Dabija, A.; Kycko, M. Comparison of Random Forest, Support Vector Machines, and Neural Networks for Post-Disaster Forest Species Mapping of the Krkonoše/Karkonosze Transboundary Biosphere Reserve. Remote Sens. 2021, 13, 2581. [Google Scholar] [CrossRef]
- Uehara, T.D.T.; Corrêa, S.P.L.P.; Quevedo, R.P.; Körting, T.S.; Dutra, L.V.; Rennó, C.D. Landslide Scars Detection using Remote Sensing and Pattern Recognition Techniques: Comparison Among Artificial Neural Networks, Gaussian Maximum Likelihood, Random Forest, and Support Vector Machine Classifiers. Rev. Bras. De Cartogr. 2020, 72, 665–680. Available online: http://www.seer.ufu.br/index.php/revistabrasileiracartografia/article/view/54037/30208 (accessed on 3 August 2022). [CrossRef]
- Alzubaidi, L.; Zhang, J.; Humaidi, A.J.; Al-Dujaili, A.; Duan, Y.; Al-Shamma, O.; Santamaría, J.; Fadhel, M.A.; Al-Amidie, M.; Farhan, L. Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. J. Big Data 2021, 8, 53. [Google Scholar] [CrossRef] [PubMed]
- Bengio, Y.; Courville, A.; Vincent, P. Representation Learning: A Review and New Perspectives. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1798–1828. [Google Scholar] [CrossRef]
- Li, B.; Pi, D. Network representation learning: A systematic literature review. Neural Comput. Appl. 2020, 32, 16647–16679. [Google Scholar] [CrossRef]
- Pedergnana, M.; Marpu, P.R.; Mura, M.D.; Benediktsson, J.A.; Bruzzone, L. A Novel Technique for Optimal Feature Selection in Attribute Profiles Based on Genetic Algorithms. IEEE Trans. Geosci. Remote Sens. 2013, 51, 3514–3528. [Google Scholar] [CrossRef]
- Novack, T.; Esch, T.; Kux, H.; Stilla, U. Machine Learning Comparison between WorldView-2 and QuickBird-2-Simulated Imagery Regarding Object-Based Urban Land Cover Classification. Remote Sens. 2011, 3, 2263–2282. [Google Scholar] [CrossRef]
- Topouzelis, K.; Psyllos, A. Oil spill feature selection and classification using decision tree forest on SAR image data. ISPRS J. Photogramm. Remote Sens. 2012, 68, 135–143. [Google Scholar] [CrossRef]
- Melgani, F.; Bruzzone, L. Classification of hyperspectral remote sensing images with support vector machines. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1778–1790. [Google Scholar] [CrossRef]
- Dozier, H.; Gaffney, J.F.; McDonald, S.K.; Johnson, E.R.R.L.; Shilling, D.G. Cogongrass in the United States: History, Ecology, Impacts, and Management. Weed Technol. 1998, 12, 737–743. [Google Scholar] [CrossRef]
- Kauffman, J.B.; Donato, D. Protocols for the Measurement, Monitoring and Reporting of Structure, Biomass and Carbon Stocks in Mangrove Forests; Center for International Forestry Research (CIFOR): Bogor, Indonesia, 2012. [Google Scholar] [CrossRef]
- Topcon Corporation. Hiper Lite +. January 2004. Available online: https://www.lengemann.us/pdf/HiPerLitePlus_Broch_REVC.pdf (accessed on 3 August 2022).
- Velodyne. Velodyne LiDAR HDL-32E User’s Manual. 2021. Available online: https://velodynelidar.com/downloads/ (accessed on 3 August 2022).
- Wilkinson, B.; Lassiter, H.A.; Abd-Elrahman, A.; Carthy, R.R.; Ifju, P.; Broadbent, E.; Grimes, N. Geometric Targets for UAS Lidar. Remote Sens. 2019, 11, 3019. [Google Scholar] [CrossRef]
- Agisoft LLC. Agisoft Metashape User Manual: Professional Edition, Version 1.8. 2022. Available online: https://www.agisoft.com/pdf/metashape-pro_1_8_en.pdf (accessed on 3 August 2022).
- Westoby, M.J.; Brasington, J.; Glasser, N.F.; Hambrey, M.J.; Reynolds, J.M. ‘Structure-from-Motion’ photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology 2012, 179, 300–314. [Google Scholar] [CrossRef]
- Triggs, B.; McLauchlan, P.F.; Hartley, R.I.; Fitzgibbon, A.W. Bundle Adjustment—A Modern Synthesis. In International Workshop on Vision Algorithms; Springer: Berlin/Heidelberg, Germany, 2000; pp. 298–372. [Google Scholar] [CrossRef]
- Wolf, P.R.; Dewitt, B.A.; Wilkinson, B.E. Elements of Photogrammetry with Applications in GIS, 4th ed.; McGraw-Hill Education: New York, NY, USA, 2014; Available online: https://www.accessengineeringlibrary.com/content/book/9780071761123 (accessed on 3 August 2022).
- Hexagon © 2022. Inertial Explorer®. January 2022. Available online: https://novatel.com/products/waypoint-post-processing-software/inertial-explorer (accessed on 3 August 2022).
- Phoenix LiDAR Systems. Learn about the Latest Spatial Explorer 6.0 Features. January 2021. Available online: https://www.phoenixlidar.com/software/ (accessed on 3 August 2022).
- Rapidlasso GmbH. LAStools-Efficient LiDAR Processing Software. January 2022. Available online: https://rapidlasso.com/LAStools/ (accessed on 3 August 2022).
- Blue Marble Geographics®. Global Mapper Pro®. 2022. Available online: https://www.bluemarblegeo.com/global-mapper-pro/ (accessed on 3 August 2022).
- Dewberry. Dewberry to Collect and Process Lidar Data for Southwest Florida Water Management District. 2021. Available online: https://www.dewberry.com/insights-news/article/2017/05/16/dewberry-to-collect-and-process-lidar-data-for-southwest-florida-water-management-district (accessed on 3 August 2022).
- Fukunaga, K.; Hostetler, L. The estimation of the gradient of a density function, with applications in pattern recognition. IEEE Trans. Inf. Theory 1975, 21, 32–40. [Google Scholar] [CrossRef]
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional Networks for Biomedical Image Segmentation. In Medical Image Computing and Computer-Assisted Intervention 2015; Navab, N., Hornegger, J., Wells, W.M., Frangi, A.F., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 234–241. [Google Scholar] [CrossRef]
- Chen, L.-C.; Papandreou, G.; Kokkinos, I.; Murphy, K.; Yuille, A.L. DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 40, 834–848. [Google Scholar] [CrossRef] [PubMed]
- Zhao, H.; Shi, J.; Qi, X.; Wang, X.; Jia, J. Pyramid Scene Parsing Network. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
- Heydari, S.S.; Mountrakis, G. Meta-analysis of deep neural networks in remote sensing: A comparative study of mono-temporal classification to support vector machines. ISPRS J. Photogramm. Remote Sens. 2019, 152, 192–210. [Google Scholar] [CrossRef]
- Ma, L.; Liu, Y.; Zhang, X.; Ye, Y.; Yin, G.; Johnson, B.A. Deep learning in remote sensing applications: A meta-analysis and review. ISPRS J. Photogramm. Remote Sens. 2019, 152, 166–177. [Google Scholar] [CrossRef]
- Mohammadimanesh, F.; Salehi, B.; Mahdianpari, M.; Gill, E.; Molinier, M. A new fully convolutional neural network for semantic segmentation of polarimetric SAR imagery in complex land cover ecosystem. ISPRS J. Photogramm. Remote Sens. 2019, 151, 223–236. [Google Scholar] [CrossRef]
- Hsu, C.W.; Chang, C.C.; Lin, C.J. A Practical Guide to Support Vector Classification. 2003. Available online: https://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf (accessed on 3 August 2022).
- Ponraj, A. Decision Trees. DevSkrol. July 2020. Available online: https://devskrol.com/2020/07/26/random-forest-how-random-forest-works/ (accessed on 3 August 2022).
- Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
- David, D. Random Forest Classifier Tutorial: How to Use Tree-Based Algorithms for Machine Learning. freeCodeCamp. 2020. Available online: https://www.freecodecamp.org/news/how-to-use-the-tree-based-algorithm-for-machine-learning/ (accessed on 3 August 2022).
- Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
- Hsu, C.-W.; Lin, C.-J. A comparison of methods for multiclass support vector machines. IEEE Trans. Neural Netw. 2002, 13, 415–425. [Google Scholar] [CrossRef]
- Chandra, M.A.; Bedi, S.S. Survey on SVM and their application in image classification. Int. J. Inf. Technol. 2018, 13, 1–11. [Google Scholar] [CrossRef]
- Zhou, Z.; Siddiquee, M.M.R.; Tajbakhsh, N.; Liang, J. UNet++: A Nested U-Net Architecture for Medical Image Segmentation. In Deep Learning in Medical Image Analysis and Multimodal Learning for Clinical Decision Support; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2018; pp. 3–11. [Google Scholar] [CrossRef]
- Feng, R.; Gu, J.; Qiao, Y.; Dong, C. Suppressing Model Overfitting for Image Super-Resolution Networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Long Beach, CA, USA, 16–17 June 2019. [Google Scholar]
- Buda, M.; Maki, A.; Mazurowski, M.A. A systematic study of the class imbalance problem in convolutional neural networks. Neural Netw. 2018, 106, 249–259. [Google Scholar] [CrossRef] [PubMed]
- Tsang, S.-H. Review: DeepLabv3—Atrous Convolution (Semantic Segmentation). Towards Data Science. 19 January 2019. Available online: https://towardsdatascience.com/review-deeplabv3-atrous-convolution-semantic-segmentation-6d818bfd1d74 (accessed on 3 August 2022).
- Chen, L.-C.; Papandreou, G.; Schroff, F.; Adam, H. Rethinking Atrous Convolution for Semantic Image Segmentation. arXiv 2017, arXiv:1706.05587. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar] [CrossRef]
- Canziani, A.; Paszke, A.; Culurciello, E. An Analysis of Deep Neural Network Models for Practical Applications. arXiv 2016, arXiv:1605.07678. [Google Scholar]
- Michel, J.; Youssefi, D.; Grizonnet, M. Stable Mean-Shift Algorithm and Its Application to the Segmentation of Arbitrarily Large Remote Sensing Images. IEEE Trans. Geosci. Remote Sens. 2015, 53, 952–964. [Google Scholar] [CrossRef]
- Congalton, R.G. Accuracy assessment and validation of remotely sensed and other spatial information. Int. J. Wildland Fire 2001, 10, 321–328. [Google Scholar] [CrossRef]
- Barsi, Á.; Kugler, Z.; László, I.; Szabó, G.; Abdulmutalib, H.M. Accuracy Dimensions in Remote Sensing. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 3. Available online: https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLII-3/61/2018/isprs-archives-XLII-3-61-2018.pdf (accessed on 3 August 2022). [CrossRef]
- Opitz, J.; Burst, S. Macro F1 and Macro F1. arXiv 2019, arXiv:1911.03347. [Google Scholar]
- Chicco, D.; Jurman, G. The advantages of the Matthews correlation coefficient (MCC) over F1 score and accuracy in binary classification evaluation. BMC Genom. 2020, 21, 6. [Google Scholar] [CrossRef]
- Bhatnagar, S.; Gill, L.; Ghosh, B. Drone Image Segmentation Using Machine and Deep Learning for Mapping Raised Bog Vegetation Communities. Remote Sens. 2020, 12, 2602. [Google Scholar] [CrossRef]
- Li, G.; Han, W.; Huang, S.; Ma, W.; Ma, Q.; Cui, X. Extraction of Sunflower Lodging Information Based on UAV Multi-Spectral Remote Sensing and Deep Learning. Remote Sens. 2021, 13, 2721. [Google Scholar] [CrossRef]
- Zhao, X.; Yuan, Y.; Song, M.; Ding, Y.; Lin, F.; Liang, D.; Zhang, D. Use of Unmanned Aerial Vehicle Imagery and Deep Learning UNet to Extract Rice Lodging. Sensors 2019, 19, 3859. [Google Scholar] [CrossRef] [PubMed]
- Baron, J.; Hill, D.J.; Elmiligi, H. Combining image processing and machine learning to identify invasive plants in high-resolution images. Int. J. Remote Sens. 2018, 39, 5099–5118. [Google Scholar] [CrossRef]
- Tichý, L.; Chytrý, M.; Landucci, F. GRIMP: A machine-learning method for improving groups of discriminating species in expert systems for vegetation classification. J. Veg. Sci. 2019, 30, 5–17. [Google Scholar] [CrossRef]
- Gao, Y.; Gan, Y.; Qi, L.; Zhou, H.; Dong, X.; Dong, J. A Perception-Inspired Deep Learning Framework for Predicting Perceptual Texture Similarity. IEEE Trans. Circuits Syst. Video Technol. 2020, 30, 3714–3726. [Google Scholar] [CrossRef]
- Xu, Y.; Zhu, H.; Hu, C.; Liu, H.; Cheng, Y. Deep learning of DEM image texture for landform classification in the Shandong area, China. Front. Earth Sci. 2021, 1–16. [Google Scholar] [CrossRef]
- University of Redlands. Brazilian Peppertree. University of Redlands Tree Website, 2022. Available online: https://sites.redlands.edu/trees/species-accounts/brazilianpeppertree/ (accessed on 3 August 2022).
- Virginia Tech Dept. of Forest Resources and Environmental Conservation. Brazilian Peppertree. Virginia Tech Dendrology, 2021. Available online: http://dendro.cnre.vt.edu/dendrology/syllabus/factsheet.cfm?ID=704 (accessed on 3 August 2022).
- University of Florida; IFAS; Geosciences Department, Boise State University; Entomology and Nematology Department. Interagency Brazilian Peppertree (Schinus terebinthifolius) Management Plan for Florida, 2nd ed.; University of Florida: Gainesville, FL, USA, 2006; Available online: https://ipm.ifas.ufl.edu/pdfs/BPmanagPlan.pdf (accessed on 3 August 2022).
- Vanderhoff, E.N.; Rentsch, J.D. A Review of Avian Dispersal of Non-Native and Invasive Plants in the Southeastern United States. Castanea 2022, 86, 225–244. [Google Scholar] [CrossRef]
- Zweig, C.L.; Burgess, M.A.; Percival, H.F.; Kitchens, W.M. Use of Unmanned Aircraft Systems to Delineate Fine-Scale Wetland Vegetation Communities. Wetlands 2015, 35, 303–309. [Google Scholar] [CrossRef]
- Mishra, N.; Mainali, K.; Shrestha, B.; Radenz, J.; Karki, D. Species-Level Vegetation Mapping in a Himalayan Treeline Ecotone Using Unmanned Aerial System (UAS) Imagery. ISPRS Int. J. Geo-Inf. 2018, 7, 445. [Google Scholar] [CrossRef]
- Sandino, J.; Gonzalez, F.; Mengersen, K.; Gaston, K.J. UAVs and Machine Learning Revolutionising Invasive Grass and Vegetation Surveys in Remote Arid Lands. Sensors 2018, 18, 605. [Google Scholar] [CrossRef] [PubMed]
- Durgan, S.D.; Zhang, C.; Duecaster, A.; Fourney, F.; Su, H. Unmanned Aircraft System Photogrammetry for Mapping Diverse Vegetation Species in a Heterogeneous Coastal Wetland. Wetlands 2020, 40, 2621–2633. [Google Scholar] [CrossRef]
- Al-Najjar, H.A.H.; Kalantar, B.; Pradhan, B.; Saeidi, V.; Halin, A.A.; Ueda, N.; Mansor, S. Land Cover Classification from fused DSM and UAV Images Using Convolutional Neural Networks. Remote Sens. 2019, 11, 1461. [Google Scholar] [CrossRef]
- Correll, M.D.; Hantson, W.; Hodgman, T.P.; Cline, B.B.; Elphick, C.S.; Shriver, W.G.; Tymkiw, E.L.; Olsen, B.J. Fine-Scale Mapping of Coastal Plant Communities in the Northeastern USA. Wetlands 2019, 39, 17–28. [Google Scholar] [CrossRef]
- Zhou, R.; Yang, C.; Li, E.; Cai, X.; Yang, J.; Xia, Y. Object-Based Wetland Vegetation Classification Using Multi-Feature Selection of Unoccupied Aerial Vehicle RGB Imagery. Remote Sens. 2021, 13, 4910. [Google Scholar] [CrossRef]
Land Cover Type or Vegetation Species | Class Identifier | Scientific Name | Description |
---|---|---|---|
Brazilian peppertree | 1 | (Schinus terebinthifolia) | Invasive evergreen shrub or small tree. |
Cabbage palmetto | 2 | (Sabal palmetto) | Native species of palmetto. |
Water | 3 | | Water from any source: sea intrusion, rainwater filling muddy car tracks, brackish water ponds, or tidal creeks. |
Black mangrove | 4 | (Avicennia germinans) | Black mangrove above 0.6 m in height. (In forested mangroves, seedlings are defined as individual trees <1.37 m in height [62].) |
Exposed sand | 5 | | Exposed sand, usually white or light brown. |
Organic brown soil | 6 | | Organic brown soil, usually dark brown or light black loam. |
Cogongrass | 7 | (Imperata cylindrica) | Very aggressive perennial invasive grass species; light green when healthy, brownish when stressed. |
Black needle rush | 8 | (Juncus roemerianus) | Flowering Juncus, native to North America and distributed along the Gulf Coast (since it is neither a bush nor a tree, it was included among the grasses for this study). |
Seashore dropseed | 9 | (Sporobolus virginicus) | Grass. Seashore dropseed not submerged, growing in higher and drier areas away from water bodies. |
Cordgrass | 10 | (Spartina spartinae) | Grass commonly found in marshes and tidal mud flats. |
Dead vegetation | 11 | | Exterminated or naturally dead vegetation of all types. |
Red mangrove | 12 | (Rhizophora mangle) | Considered a native; grows as a shrub or a tree up to 60 feet tall in tidal swamps. |
Short mangrove | 13 | (Avicennia germinans) | Short black mangrove plants of low height, with a stem that is long relative to their branches. |
Leucaena | 14 | (Leucaena leucocephala) | Fast-growing, invasive evergreen shrub or tree with a height of up to 20 m. |
Submerged seashore dropseed | 15 | (Sporobolus virginicus) | Grass. Seashore dropseed dampened or submerged, growing close to water bodies where soils are saturated with water or where the plant is rooted but lives floating on shallow water. Under simple visual inspection, this grass appears healthier than the same species growing in dry areas. |
Shepherd's needles | 16 | (Bidens alba) | Healthy, unidentified green grasses and bushes of various species that cover the areas between bigger, well-defined patches of vegetation and land cover types. |
Broomsedge mixed | 17 | (Andropogon virginicus) | Broomsedge grass mixed with numerous, less dominant, brown-colored grasses and small bushes that cover the areas between bigger, well-defined patches of vegetation and other land cover types. |
Flight Mission Parameters |
---|---|
Altitude AGL | 40 m |
Flight speed | 4.17 m/s |
Forward overlap | 80% |
Cross overlap | 80% |
Ground sample distance | 2.78 cm/pixel |
Time between capture | 1.28 s |
Distance between tracks | 7.11 m |
Mission flight time | 20 min (8 acres) |
Number of images (total, 5 bands) | 4420 |
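As a consistency check (our arithmetic, not part of the paper), the trigger interval and track spacing above follow from the ground sample distance, image dimensions, and 80% overlaps; a short sketch, assuming the 1280-pixel image axis is oriented across track, as the listed values imply:

```python
# Verify the flight parameters against the 1280 x 960 px sensor at 2.78 cm/px.
gsd = 0.0278                   # ground sample distance, m/pixel
along_px, cross_px = 960, 1280 # image dimensions along and across track
overlap = 0.80                 # forward and side overlap

footprint_along = along_px * gsd            # ~26.7 m on the ground
footprint_cross = cross_px * gsd            # ~35.6 m on the ground
photo_spacing = footprint_along * (1 - overlap)   # ~5.3 m between exposures
trigger_interval = photo_spacing / 4.17           # ~1.28 s, matches the table
track_spacing = footprint_cross * (1 - overlap)   # ~7.1 m, matches the table
print(trigger_interval, track_spacing)
```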
Band Name | Center Wavelength (nm) | Full Width at Half Maximum (FWHM) (nm) |
---|---|---|
Blue | 475 | 20 |
Green | 560 | 20 |
Red | 668 | 10 |
Near IR | 840 | 40 |
Red Edge | 717 | 10 |
Camera Features | |
---|---|
Ground sampling distance | 8.2 cm/pixel at 120 m (above ground level, AGL) |
Lens focal length (mm) | 5.5 |
Lens horizontal field of view (HFOV) (degrees) | 47.2 |
Imager size (mm) | 4.8 × 3.6 |
Imagery resolution (pixels) | 1280 × 960 |
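The listed ground sampling distance is consistent with the imager geometry above; a worked check under the usual pinhole model (our derivation, not the vendor's), with pixel pitch taken as imager width over pixel count, flying height H, and focal length f:

```latex
\mathrm{GSD} = \frac{p \, H}{f}
  = \frac{(4.8\,\mathrm{mm}/1280\,\mathrm{px}) \times 120\,\mathrm{m}}{5.5\,\mathrm{mm}}
  \approx 8.2\ \mathrm{cm/px}
```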
Flight Mission and Sensor Parameters | |
---|---|
UAS flight altitude (AGL) | 40 m |
UAS flight speed | 6 m/s |
Sensor measurement range | Up to 100 m |
Sensor vertical field of view | 41.33° |
Sensor angular resolution (vertical) | 1.33° |
Sensor angular resolution (horizontal/azimuth) | 0.08°–0.33° |
Sensor field of view (horizontal) | 360° |
Sensor horizontal beam divergence | 2.79 milliradian |
Sensor vertical beam divergence | 1.395 milliradian |
Number of returns recorded by the sensor per pulse | 2 returns |
Number of returns recorded by the sensor per second | 695,000 returns/s (single return mode); 1,390,000 returns/s (double return mode) |
Beam footprint at 65 m | 18.1 × 9.1 cm (horizontal × vertical) |
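The beam footprint row follows from the divergence rows under the small-angle approximation, footprint ≈ range × divergence (our check, not a vendor formula):

```latex
65\,\mathrm{m} \times 2.79\,\mathrm{mrad} \approx 18.1\,\mathrm{cm}
\;(\text{horizontal}),\qquad
65\,\mathrm{m} \times 1.395\,\mathrm{mrad} \approx 9.1\,\mathrm{cm}
\;(\text{vertical})
```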
Experiment | Description | Abbreviation |
---|---|---|
1 | Only spectral band combinations. | (Exp1-OSB) |
2 | Same spectral band combinations as in experiment 1 plus airborne low-density lidar products and photogrammetry-based DSM. | (Exp2-SB_AB_CHM) |
3 | Same spectral band combinations as in experiment 1 plus high-density UAS lidar products (DSM and DTM). | (Exp3-SB_UAS_CHM) |
Spectral Band Combinations for Exp1-OSB |
---|---|
B_G_R_RE_NIR | Composite of all five bands acquired by the multispectral sensor: blue, green, red, red edge, and near infrared. Representative sensors include the MicaSense RedEdge-MX (AgEagle Sensor Systems Inc., 2022) and the Sentera 6X (SENTERA, 2022). |
B_G_R | Composite containing the visible blue, green, and red bands traditionally captured by most low-cost consumer cameras. Emulates the use of built-in generic photographic UAV-mounted cameras. |
G_R_RE, G_R_NIR, B_G_R_NIR, B_G_R_RE | Emulate the use of relatively inexpensive cameras (compared to full multispectral sensors) designed for specialized purposes with specific band combinations, built to customers' specifications. |
Band Combinations for Exp2-SB_AB_CHM and Exp3-SB_UAS_CHM |
---|---|
For the two experiments that include canopy height models, the six band combinations remain the same as above, except for the addition of the CHM band. |
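For Exp2 and Exp3, a canopy height model band (CHM = DSM − DTM) is appended to each spectral composite. A minimal sketch of that preparation step, assuming co-registered single-band GeoTIFFs on a common grid (file names are hypothetical, and rasterio is our choice of library, not necessarily the authors'):

```python
import numpy as np
import rasterio

def read_band(path):
    # Read the single band of a one-band raster as a 2-D array.
    with rasterio.open(path) as src:
        return src.read(1)

# Canopy height model: digital surface model minus digital terrain model.
chm = read_band("dsm.tif") - read_band("dtm.tif")

# Append the CHM to the five spectral bands, e.g. the B_G_R_RE_NIR composite.
spectral = [read_band(f) for f in
            ("blue.tif", "green.tif", "red.tif", "rededge.tif", "nir.tif")]
stack = np.stack(spectral + [chm]).astype("float32")

# Reuse the DSM's georeferencing for the six-band output composite.
with rasterio.open("dsm.tif") as src:
    profile = src.profile
profile.update(count=stack.shape[0], dtype="float32")

with rasterio.open("b_g_r_re_nir_chm.tif", "w", **profile) as dst:
    dst.write(stack)
```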
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Gonzalez-Perez, A.; Abd-Elrahman, A.; Wilkinson, B.; Johnson, D.J.; Carthy, R.R. Deep and Machine Learning Image Classification of Coastal Wetlands Using Unpiloted Aircraft System Multispectral Images and Lidar Datasets. Remote Sens. 2022, 14, 3937. https://doi.org/10.3390/rs14163937