Study of a High Spectral Resolution Hyperspectral LiDAR in Vegetation Red Edge Parameters Extraction
Figure 2. Schematic diagram of a tunable hyperspectral LiDAR system based on AOTF.
Figure 3. The supercontinuum laser source and its power density against wavelength.
Figure 4. Employed filter devices. (a) AOTF, (b) LCTF.
Figure 5. Transmittance of the laser beam expander.
Figure 6. Reflectivity of the reflector.
Figure 7. Four different plants employed in the lab experiment. (a) Dracaena, (b) Aloe, (c) Rubber plant, (d) Radermachera.
Figure 8. First derivative of the spectral reflectance versus wavelength for dracaena green and yellow leaves, measured by the AOTF-HSL and the SVC spectrometer. (a) Dracaena green leaf, (b) Dracaena yellow leaf.
Figure 9. First derivative of the spectral reflectance versus wavelength for aloe green and yellow leaves, measured by the AOTF-HSL and the SVC spectrometer. (a) Aloe green leaf, (b) Aloe yellow leaf.
Figure 10. Rubber plant green leaf.
Figure 11. Radermachera green leaf.
Abstract
1. Introduction
- This paper presents a more universal and applicable HSL with high spectral resolution for obtaining vegetation spectral profiles, and three different RE position (REP) extraction methods are applied to the acquired HSL spectral profiles for the first time;
- This paper is only a first step in using the high-spectral-resolution HSL for vegetation index detection, and it may motivate the estimation of other vegetation parameters or biochemical contents with this instrument.
2. Materials and Methods
2.1. AOTF-HSL Design and Components
2.1.1. SC Source
2.1.2. AOTF vs. LCTF
2.1.3. Collimator
2.1.4. Reflector
2.2. REP Extraction Methods
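The REP extraction methods compared in the results tables are abbreviated LFPIT, LET, and FRS. As a minimal sketch, two of them can be written down directly, assuming the conventional formulations: the first-derivative (FRS) method takes the wavelength of the maximum first derivative of reflectance within the red edge region, and the linear four-point interpolation technique follows the classic Guyot–Baret scheme using bands at 670, 700, 740, and 780 nm. The search window, band positions, and the exact variants used in the paper are assumptions here, and the LET (two-line intersection of first-derivative points) is omitted because its band choices vary between implementations.

```python
import numpy as np

def rep_frs(wl, refl, lo=680.0, hi=760.0):
    """First-derivative (FRS) method: REP is the wavelength at which the
    first derivative of the reflectance spectrum peaks in the red edge
    window [lo, hi] (window bounds are an assumption)."""
    d = np.gradient(refl, wl)                  # finite-difference derivative
    mask = (wl >= lo) & (wl <= hi)
    return wl[mask][np.argmax(d[mask])]

def rep_lfpit(r670, r700, r740, r780):
    """Linear four-point interpolation: the red edge inflection reflectance
    is taken as the mean of R670 and R780, and REP is interpolated linearly
    between 700 and 740 nm."""
    r_re = (r670 + r780) / 2.0
    return 700.0 + 40.0 * (r_re - r700) / (r740 - r700)

# Synthetic sigmoid red edge with inflection near 718 nm, sampled at 2 nm
# (the resolution targeted for future work in the discussion).
wl = np.arange(500.0, 1001.0, 2.0)
refl = 0.05 + 0.45 / (1.0 + np.exp(-(wl - 718.0) / 12.0))
bands = {w: refl[np.where(wl == w)[0][0]] for w in (670.0, 700.0, 740.0, 780.0)}
rep_d = rep_frs(wl, refl)
rep_l = rep_lfpit(bands[670.0], bands[700.0], bands[740.0], bands[780.0])
```

Both estimators land near the true inflection on this synthetic spectrum; on real leaf spectra they diverge, which is exactly the method-to-method spread the tables in Section 3.2 quantify.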
3. Results
3.1. REP, RE Slope and REA Measurement Results and Analysis
3.2. Comparison of REP Results from Different Calculating Methods
4. Discussion
- Green leaves have more uniform spectral reflectivity across their surface, since the contents affecting the "Red Edge" related parameters are distributed evenly over them; in contrast, yellow leaves have uneven distributions of these contents, as Figure 7a,b show, so the reflectivity varies across different parts of a yellow leaf;
- As described in Section 2, the hardware design, the optics system, and the measurement distance determine the diameter of the laser pulse sampling footprint, which is approximately 1 cm in this experiment with a field of view (FOV) of 0.2 mrad. The sampled area of the spectrometer is larger (a footprint of 5.5 cm radius with a 25° FOV), and the larger area covered by the spectrometer includes parts with different reflectivity due to the non-uniformity of the yellow leaves.
- An HSL with finer spectral resolution is anticipated to improve performance in vegetation index or parameter extraction; ultimately, the HSL should reach a spectral resolution similar to that of the reference SVC spectrometer. Better spectral resolution would make vegetation content estimation more accurate and the measurements more comparable. HSL spectral profiles covering the 500–1000 nm band with 2 nm resolution are planned for future work; such resolution is more likely to produce reliable results for vegetation-related applications.
- The influence of the spectral resolution on REP or other vegetation parameters is not investigated in this paper, being limited by the hardware design. A 10 nm spectral resolution is employed here, determined by the LiDAR raw-measurement processing capacity; in fact, the spectral resolution of the AOTF-HSL can be adjusted from 2 nm to 10 nm, so exploring the influence of spectral resolution on the performance of the above REP extraction methods would be of great interest;
- REP is one of the most important indicators for vegetation health monitoring, but other vegetation indices also describe vegetation growth and content; more work will be carried out on extracting these indices with the HSL.
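The footprint mismatch noted above can be checked with simple beam geometry. The sketch below assumes an illustrative measurement distance of 0.25 m, which is not stated in this section; it is chosen because it reproduces the quoted 5.5 cm spectrometer footprint radius for a 25° FOV, and it shows that at such a range the 0.2 mrad laser divergence contributes almost nothing, so the ~1 cm laser footprint is set mainly by the exit beam diameter.

```python
import math

# Assumed measurement distance (not stated in the text): chosen so the
# spectrometer footprint matches the quoted 5.5 cm radius.
DISTANCE_M = 0.25

def divergence_spread(distance_m, full_angle_rad):
    """Footprint diameter added purely by beam divergence at a distance."""
    return 2.0 * distance_m * math.tan(full_angle_rad / 2.0)

# Spectrometer: 25 degree full-angle FOV -> footprint radius at 0.25 m.
spec_radius = DISTANCE_M * math.tan(math.radians(25.0) / 2.0)

# Laser: 0.2 mrad divergence adds only tens of micrometres at this range.
laser_spread = divergence_spread(DISTANCE_M, 0.2e-3)
```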
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
| Parameter | AOTF | LCTF (VariSpec VIS) |
|---|---|---|
| Spectral range | 430–1450 nm | 400–720 nm |
| Response time | 10 μs | 50 ms |
| Spectral resolution | 2–10 nm | 7, 10, or 20 nm |
| REP (nm) | AOTF-HSL | SVC | Difference |
|---|---|---|---|
| Dracaena Green Leaf | 725 | 718.85 | 6.15 (0.85%) |
| Dracaena Yellow Leaf | 695 | 685.55 | 9.45 (1.3%) |
| Aloe Green Leaf | 725 | 718.85 | 6.15 (0.85%) |
| Aloe Yellow Leaf | 715 | 690.9 | 24.1 (3.4%) |
| Rubber Green Leaf | 725 | 725.45 | 0.45 (0.06%) |
| Radermachera Green Leaf | 715 | 718.85 | 3.85 (0.53%) |
| RE slope | AOTF-HSL | SVC | Difference |
|---|---|---|---|
| Dracaena Green Leaf | 0.95 | 1.07 | 0.12 (11.2%) |
| Dracaena Yellow Leaf | 1.44 | 1.03 | 0.41 (39.8%) |
| Aloe Green Leaf | 1.29 | 1.3 | 0.01 (0.8%) |
| Aloe Yellow Leaf | 1.1 | 0.59 | 0.51 (86%) |
| Rubber Green Leaf | 1.09 | 1.17 | 0.08 (6.8%) |
| Radermachera Green Leaf | 1 | 1.07 | 0.07 (6.5%) |
| REA | AOTF-HSL | SVC | Difference |
|---|---|---|---|
| Dracaena Green Leaf | 47.94 | 51.36 | 3.42 (6.7%) |
| Dracaena Yellow Leaf | 36.67 | 26.34 | 10.33 (39.2%) |
| Aloe Green Leaf | 52.54 | 51.1 | 1.43 (2.8%) |
| Aloe Yellow Leaf | 41.63 | 31.53 | 10.1 (32.0%) |
| Rubber Green Leaf | 44 | 43.68 | 0.32 (0.7%) |
| Radermachera Green Leaf | 43.39 | 42.01 | 1.38 (3.2%) |
| REP (nm), Dracaena green leaf | AOTF-HSL | SVC | Difference |
|---|---|---|---|
| LFPIT | 717.80 | 722.32 | 4.52 (0.63%) |
| LET | 709.42 | 712.24 | 2.82 (0.40%) |
| FRS | 725 | 718.85 | 6.15 (0.85%) |

| REP (nm), Dracaena yellow leaf | AOTF-HSL | SVC | Difference |
|---|---|---|---|
| LFPIT | 636.79 | 692.43 | 55.64 (8.7%) |
| LET | 628.65 | 679.91 | 51.26 (8.2%) |
| FRS | 695 | 685.55 | 9.45 (1.3%) |

| REP (nm), Aloe green leaf | AOTF-HSL | SVC | Difference |
|---|---|---|---|
| LFPIT | 718.68 | 715.13 | −3.55 (0.49%) |
| LET | 720.32 | 719.95 | −0.37 (0.05%) |
| FRS | 725 | 718.85 | 6.15 (0.85%) |

| REP (nm), Aloe yellow leaf | AOTF-HSL | SVC | Difference |
|---|---|---|---|
| LFPIT | 715.61 | 674.27 | −41.34 (5.8%) |
| LET | 721.12 | 669.01 | −52.11 (7.2%) |
| FRS | 715 | 690.9 | 24.1 (3.4%) |

| REP (nm), Rubber green leaf | AOTF-HSL | SVC | Difference |
|---|---|---|---|
| LFPIT | 712.63 | 748.79 | 36.16 (5.1%) |
| LET | 722.60 | 726.10 | 3.5 (0.5%) |
| FRS | 725 | 725.45 | 0.45 (0.06%) |

| REP (nm), Radermachera green leaf | AOTF-HSL | SVC | Difference |
|---|---|---|---|
| LFPIT | 712.14 | 709.25 | −2.89 (0.4%) |
| LET | 717.91 | 718.67 | 0.76 (0.1%) |
| FRS | 715 | 718.85 | 3.85 (0.53%) |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Jiang, C.; Chen, Y.; Wu, H.; Li, W.; Zhou, H.; Bo, Y.; Shao, H.; Song, S.; Puttonen, E.; Hyyppä, J. Study of a High Spectral Resolution Hyperspectral LiDAR in Vegetation Red Edge Parameters Extraction. Remote Sens. 2019, 11, 2007. https://doi.org/10.3390/rs11172007