Quantifying Efficacy and Limits of Unmanned Aerial Vehicle (UAV) Technology for Weed Seedling Detection as Affected by Sensor Resolution
Figure 1. (A) Unmanned aerial vehicle (UAV), model microdrone MD4-1000, with the visible-light camera attached, flying over the sunflower crop in the early season; (B) Tetracam multispectral camera; and (C) Spectralon® panel placed in the middle of the field to calibrate the spectral data.
Figure 2. UAV images collected over the sunflower field at 40 m on three different dates in the early season (top) and associated on-ground photographs (bottom).
Figure 3. Flowchart of the OBIA procedure applied for crop-row classification and weed detection.
Figure 4. Examples of four sampling frames showing: (1) correct classification; (2) underestimation of weeds; (3) false negative errors (i.e., no detection of weeds); and (4) false positive errors (i.e., overestimation of weeds) in three scenarios: (A) on-ground photographs; (B) manual classification of observed data; and (C) image classification performed by the OBIA algorithm.
Figure 5. Details of the image spatial resolution captured by the visible-light camera (1) and the multispectral camera (2) at: (A) 40 m altitude; and (B) 100 m altitude.
Figure 6. Individual UAV images collected with the multispectral (1) and the visible-light (2) cameras at: (A) 40 m; and (B) 100 m altitude. The yellow squares compare the area covered by the images from each camera at both flight altitudes. The sunlight effect (light spot) observed in 1-B and 2-B was minimized after applying vegetation indices (see Section 2.3).
Figure 7. Image classified into weeds (red), sunflower crop rows (green) and bare soil (brown) using a UAV flying at 40 m altitude with: (A) the visible-light camera (ExG index); and (B) the multispectral camera (NDVI index).
Abstract
1. Introduction
2. Experimental Section
2.1. Study Site
2.2. UAV Flights: Camera, Altitudes and Dates
2.3. OBIA Algorithm
- (a) Field segmentation into sub-plots: The algorithm segmented the UAV images into small plots of a customised size to address the spatial and spectral variability of the crop field. In our case, sub-plots of 5 × 5 m were selected and sequentially analysed.
- (b) Sub-plot segmentation into objects: The image sub-plots were sub-segmented with the multi-resolution algorithm implemented in eCognition to create multi-pixel objects representing the elements of the field, i.e., crop and weed plants (vegetation objects) and soil background (bare soil objects). Segmentation is a bottom-up region-merging process based on band weights and on five parameters (scale, colour, shape, smoothness and compactness) defined by the operator. After visually testing several segmentation outputs, the selected values were 10, 0.9, 0.1, 0.5 and 0.5 for scale, colour, shape, smoothness and compactness, respectively. Within the range of spatial resolutions (a few centimetres) studied in this investigation, this segmentation setting was adequate for all the studied scenarios. However, this issue merits further investigation aimed at optimising the segmentation setting as affected by the crop pattern (e.g., crop-row separation) and image spatial resolution [17]. The resulting objects contained new contextual and morphological information (e.g., orientation, position, size and shape) that was used in the next phases of the classification process.
- (c) Vegetation object discrimination: After segmentation, the first step in the classification process was to discriminate the vegetation objects from the bare soil objects. Two spectral indices were used: (1) the Excess Green index (ExG, Equation (1)) for the visible-light camera [21,22]; and (2) the Normalised Difference Vegetation Index (NDVI, Equation (2)) for the multispectral camera [23]. The indices were calculated as follows:

  ExG = 2g − r − b, (1)

  where r, g and b are the normalised red, green and blue bands, and

  NDVI = (NIR − R)/(NIR + R), (2)

  where NIR and R are the near-infrared and red bands, respectively.
- (d) Crop-row classification: Once the vegetation objects were discriminated, the crop-row structure was classified in three steps: (1) estimation of the crop-row orientation; (2) image gridding into stripes following the crop-row orientation; and (3) crop-row classification. First, crop-row orientation was determined by an iterative process in which the image was repeatedly segmented into stripes at different angles (from 0° to 180°, in 1° increments); the selected orientation was the one in which the stripes contained the highest percentage of vegetation objects. Next, a new segmentation level (i.e., upper level) was created above the previous multi-resolution one (i.e., lower level), in which the image was segmented into a mesh of stripes aligned with the selected crop-row orientation angle. Finally, the stripes in the upper segmentation level with the highest percentage of vegetation objects in the lower segmentation level were classified as crop rows, following the criteria described in [25]. In this process, after a stripe was classified as a crop row, the known separation distance between rows (0.7 m in sunflower) was used to mask the neighbouring stripes within this distance, which avoided classifying areas with high weed infestation as crop rows.
- (e) Weed and crop discrimination: Once the crop rows were classified, the remaining stripes in the upper segmentation level were classified as crop-row buffer (stripes in contact with the crop rows) or non-crop area. Next, the hierarchical relationship between the upper and lower segmentation levels was used to discriminate crop from weeds. The vegetation objects (in the lower segmentation level) located under the crop rows or under the non-crop area (in the upper segmentation level) were classified as sunflower or as weeds, respectively. The remaining vegetation objects, located under the buffer area, were classified by a minimum-spectral-distance criterion, i.e., an unclassified vegetation object was assigned to the sunflower or weed class according to its spectral similarity to the surrounding sunflower or weed objects, respectively.
- (f) Weed coverage assessment: A vector file containing 30 geo-referenced sampling frames, 1 × 1 m in size, was overlaid on the classified image to calculate the relative area of each class, i.e., sunflower, weeds and bare soil, in every frame. Weed coverage was determined as the percentage of pixels classified as weed per unit of ground surface. Information derived from these frames was used for validation purposes, as explained in the next section.
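Step (c) can be sketched in code. The fragment below is a minimal Python illustration, not the authors' eCognition implementation: it computes the ExG index of Equation (1) per pixel and derives a vegetation/bare-soil threshold with Otsu's method [16], used here as an illustrative automatic thresholding approach.

```python
import numpy as np

def excess_green(rgb):
    """ExG index of an RGB image (H x W x 3): chromatic coordinates are
    normalised so that r + g + b = 1, then ExG = 2g - r - b."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0            # avoid division by zero on black pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2.0 * g - r - b

def otsu_threshold(values, bins=256):
    """Otsu's method on a 1-D array of index values: pick the threshold
    that maximises the between-class variance of the histogram."""
    hist, edges = np.histogram(values, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    p = hist.astype(np.float64) / hist.sum()
    w0 = np.cumsum(p)                  # class-0 (below threshold) weight
    m = np.cumsum(p * centers)         # cumulative mean
    mt = m[-1]                         # global mean
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    var_between = np.zeros_like(w0)
    var_between[valid] = (mt * w0[valid] - m[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(var_between)]
```

Pixels whose ExG exceeds the returned threshold would be treated as vegetation objects; the same scheme applies to NDVI for the multispectral camera.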
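The iterative orientation search of step (d) can likewise be sketched. This is a simplified stand-in, assuming a plain boolean pixel mask rather than eCognition objects: instead of re-segmenting the image at every angle, it projects the vegetation pixels onto stripes at each candidate angle (0° to 180°, 1° increments) and keeps the angle whose stripes concentrate the most vegetation.

```python
import numpy as np

def row_orientation(mask, stripe_px=10, angles=range(0, 180)):
    """Estimate crop-row orientation (degrees) from a boolean vegetation
    mask. For each candidate angle, vegetation pixels are projected onto
    the axis perpendicular to hypothetical rows at that angle and binned
    into stripes of width `stripe_px`; the score is the fraction of
    vegetation falling in the densest stripe, so aligned rows win."""
    ys, xs = np.nonzero(mask)
    best_angle, best_score = 0, -np.inf
    for deg in angles:
        theta = np.deg2rad(deg)
        # signed distance of each vegetation pixel from a line at `deg`
        proj = xs * np.sin(theta) - ys * np.cos(theta)
        bins = np.floor(proj / stripe_px).astype(int)
        counts = np.bincount(bins - bins.min())
        score = counts.max() / len(xs)
        if score > best_score:
            best_angle, best_score = deg, score
    return best_angle
```

In the study itself the stripes are objects in an upper segmentation level and the score is the percentage of vegetation objects per stripe; the projection trick above is only a compact approximation of that search.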
2.4. Evaluation of OBIA Algorithm Performance
3. Results and Discussion
3.1. Image Spatial Resolution and Covered Area as Affected by Flight Altitude
| Flight Altitude | Pixel Size (cm): Visible-Light Camera | Pixel Size (cm): Multispectral Camera | Covered Area (ha): Visible-Light Camera | Covered Area (ha): Multispectral Camera |
|---|---|---|---|---|
| 40 m | 1.52 | 2.16 | 0.28 | 0.06 |
| 60 m | 2.28 | 3.27 | 0.63 | 0.14 |
| 80 m | 3.04 | 4.33 | 1.13 | 0.25 |
| 100 m | 3.81 | 5.41 | 1.77 | 0.38 |
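The pixel size in the table grows linearly with flight altitude and the covered area with its square. A minimal Python sketch of this relationship follows; the sensor pixel counts (4032 × 3024 for the visible-light camera, 1280 × 1024 for the multispectral one) are illustrative assumptions consistent with the table, not specifications taken from this study.

```python
# Per-camera ground sample distance (GSD) per metre of altitude, anchored
# to the 40 m row of the table, plus assumed sensor pixel counts.
CAMERAS = {
    "visible":       {"gsd_per_m": 1.52 / 40, "px": (4032, 3024)},
    "multispectral": {"gsd_per_m": 2.16 / 40, "px": (1280, 1024)},
}

def footprint(camera, altitude_m):
    """Return (pixel size in cm, covered area in ha) at a given altitude."""
    c = CAMERAS[camera]
    gsd_cm = c["gsd_per_m"] * altitude_m        # GSD is linear in altitude
    width_m = c["px"][0] * gsd_cm / 100.0       # image footprint width (m)
    height_m = c["px"][1] * gsd_cm / 100.0      # image footprint height (m)
    return gsd_cm, width_m * height_m / 10_000.0  # m^2 -> ha
```

Under these assumptions the function reproduces the tabulated pixel sizes and covered areas to within rounding, which makes explicit the trade-off discussed in the text: doubling the altitude doubles the pixel size but quadruples the area covered per image.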
3.2. Accuracy Assessment on Classification of Crop-Rows
3.3. Weed Discrimination as Affected by Camera, Date and Flight Altitude
| Flight Date (Phenological Stage) | Weed Presence | Flight Altitude (m) | Visible-Light (ExG): Correct (%) | Under-Est. (%) | False − (%) | False + (%) | Multispectral (NDVI): Correct (%) | Under-Est. (%) | False − (%) | False + (%) |
|---|---|---|---|---|---|---|---|---|---|---|
| Date 1–44 DAS (4 true leaves) | Weed | 40 | 71 | 5 | 14 | 10 | 71 | 10 | 19 | - |
| | | 60 | 43 | 10 | 43 | 4 | 62 | 10 | 28 | - |
| | | 80 | 29 | 10 | 48 | 13 | 57 | 10 | 33 | - |
| | | 100 | 19 | 10 | 24 | 47 | 43 | 14 | 29 | 14 |
| | No weed | 40 | 100 | - | - | - | 100 | - | - | - |
| | | 60 | 100 | - | - | - | 100 | - | - | - |
| | | 80 | 44 | - | - | 56 | 100 | - | - | - |
| | | 100 | 33 | - | - | 67 | 100 | - | - | - |
| Date 2–50 DAS (5–6 true leaves) | Weed | 40 | 77 | 5 | 14 | 4 | 91 | 9 | - | - |
| | | 60 | 64 | 18 | 14 | 4 | 62 | 19 | 19 | - |
| | | 80 | 45 | 14 | 27 | 14 | 55 | 9 | 36 | - |
| | | 100 | 45 | 5 | 9 | 41 | 50 | 14 | 27 | 9 |
| | No weed | 40 | 88 | - | - | 12 | 100 | - | - | - |
| | | 60 | 88 | - | - | 12 | 100 | - | - | - |
| | | 80 | 63 | - | - | 37 | 100 | - | - | - |
| | | 100 | 37 | - | - | 63 | 88 | - | - | 12 |
| Date 3–57 DAS (7–8 true leaves) | Weed | 40 | 68 | 14 | 18 | - | 60 | 7 | 33 | - |
| | | 60 | 68 | 14 | 18 | - | 50 | 9 | 36 | 5 |
| | | 80 | 50 | - | 14 | 36 | 41 | 9 | 36 | 14 |
| | | 100 | 27 | - | 41 | 32 | 41 | - | 45 | 14 |
| | No weed | 40 | 88 | - | - | 12 | 100 | - | - | - |
| | | 60 | 100 | - | - | - | 88 | - | - | 12 |
| | | 80 | 63 | - | - | 37 | 88 | - | - | 12 |
| | | 100 | 63 | - | - | 37 | 100 | - | - | - |
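Percentages like those tabulated above can be produced by labelling each sampling frame from its observed and OBIA-detected weed coverage and tallying the labels. The sketch below is illustrative: the labelling rules, including the `under_ratio` threshold, are assumptions rather than the paper's exact evaluation criteria.

```python
from collections import Counter

def frame_outcome(observed, detected, under_ratio=0.5):
    """Label one 1 x 1 m sampling frame from observed vs. detected weed
    coverage (%). Rules are illustrative: a frame with no observed weeds
    is correct only if none are detected (else a false positive); an
    infested frame with no detection is a false negative; detection well
    below the observed level counts as under-estimation."""
    if observed == 0:
        return "correct" if detected == 0 else "false_positive"
    if detected == 0:
        return "false_negative"
    if detected < under_ratio * observed:
        return "under_estimated"
    return "correct"

def tally(pairs):
    """Percentage of frames per outcome, one table cell group per call."""
    counts = Counter(frame_outcome(o, d) for o, d in pairs)
    n = len(pairs)
    return {k: round(100 * v / n) for k, v in counts.items()}
```

Applied per camera, date and altitude to the 30 validation frames, such a routine yields one row of the table at a time.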
4. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
1. FAO. FAOSTAT. Available online: http://faostat3.fao.org/ (accessed on 10 August 2014).
2. MAGRAMA. Annual Agrarian Statistics. Available online: http://www.magrama.gob.es/es/estadistica (accessed on 23 May 2014).
3. Jurado-Expósito, M.; López-Granados, F.; García-Torres, L.; García-Ferrer, A.; Sánchez de la Orden, M.; Atenciano, S. Multi-species weed spatial variability and site-specific management maps in cultivated sunflower. Weed Sci. 2003, 51, 319–328.
4. Jurado-Expósito, M.; López-Granados, F.; Peña-Barragán, J.M.; García-Torres, L. A digital elevation model to aid geostatistical mapping of weeds in sunflower crops. Agron. Sustain. Dev. 2009, 29, 391–400.
5. Peña-Barragán, J.M.; López-Granados, F.; Jurado-Expósito, M.; García-Torres, L. Mapping Ridolfia segetum patches in sunflower crop using remote sensing. Weed Res. 2007, 47, 164–172.
6. Shaw, D.R. Translation of remote sensing data into weed management decisions. Weed Sci. 2005, 53, 264–273.
7. Torres-Sánchez, J.; López-Granados, F.; de Castro, A.I.; Peña-Barragán, J.M. Configuration and specifications of an unmanned aerial vehicle (UAV) for early site-specific weed management. PLoS One 2013, 8, e58210.
8. Lelong, C.C.D.; Burger, P.; Jubelin, G.; Roux, B.; Labbé, S.; Baret, F. Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots. Sensors 2008, 8, 3557–3585.
9. López-Granados, F. Weed detection for site-specific weed management: Mapping and real-time approaches. Weed Res. 2011, 51, 1–11.
10. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113.
11. Yu, Q.; Gong, P.; Clinton, N.; Biging, G.; Kelly, M.; Schirokauer, D. Object-based detailed vegetation classification with airborne high spatial resolution remote sensing imagery. Photogramm. Eng. Remote Sens. 2006, 72, 799–811.
12. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16.
13. De Castro, A.I.; López-Granados, F.; Jurado-Expósito, M. Broad-scale cruciferous weed patch classification in winter wheat using QuickBird imagery for in-season site-specific control. Precis. Agric. 2013, 14, 392–413.
14. Peña, J.M.; Torres-Sánchez, J.; de Castro, A.I.; Kelly, M.; López-Granados, F. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS One 2013, 8, e77151.
15. Montalvo, M.; Guerrero, J.M.; Romeo, J.; Emmi, L.; Guijarro, M.; Pajares, G. Automatic expert system for weeds/crops identification in images from maize fields. Expert Syst. Appl. 2013, 40, 75–82.
16. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66.
17. Torres-Sánchez, J.; López-Granados, F.; Peña, J.M. An automatic object-based method for optimal thresholding in UAV images: Application for vegetation detection in herbaceous crops. Comput. Electron. Agric. 2015, under review.
18. Meier, U. Growth Stages of Mono- and Dicotyledonous Plants. BBCH Monograph; Federal Biological Research Centre for Agriculture and Forestry: Berlin/Braunschweig, Germany, 2001.
19. Hunt, E.R., Jr.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.T.; McCarty, G.W. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2010, 2, 290–305.
20. Peña-Barragán, J.M.; Kelly, M.; de Castro, A.I.; López-Granados, F. Object-based approach for crop row characterization in UAV images for site-specific weed management. In Proceedings of the 4th GEOBIA, Rio de Janeiro, Brazil, 8 May 2012; pp. 426–430.
21. Tellaeche, A.; BurgosArtizzu, X.P.; Pajares, G.; Ribeiro, A.; Fernández-Quintanilla, C. A new vision-based approach to differential spraying in precision agriculture. Comput. Electron. Agric. 2008, 60, 144–155.
22. Woebbecke, D.M.; Meyer, G.E.; von Bargen, K.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. Am. Soc. Agric. Eng. 1995, 38, 259–269.
23. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. In Proceedings of the 3rd Earth Resources Technology Satellite-1 Symposium, Washington, DC, USA, 10–14 December 1973; Freden, S.C., Mercanti, E.P., Becker, M.A., Eds.; NASA SP-351: Washington, DC, USA; Volume 1, pp. 309–317.
24. Jackson, R.D.; Huete, A.R. Interpreting vegetation indices. Prev. Vet. Med. 1991, 11, 185–200.
25. Guerrero, J.M.; Guijarro, M.; Montalvo, M.; Romeo, J.; Emmi, L.; Ribeiro, A.; Pajares, G. Automatic expert system based on images for accuracy crop row detection in maize fields. Expert Syst. Appl. 2013, 40, 656–664.
26. Gómez-Candón, D.; de Castro, A.I.; López-Granados, F. Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precis. Agric. 2014, 15, 44–56.
27. Hengl, T. Finding the right pixel size. Comput. Geosci. 2006, 32, 1283–1298.
28. Gibson, K.D.; Dirks, R.; Medlin, C.S.; Johnston, L. Detection of weed species in soybean using multispectral digital images. Weed Technol. 2004, 18, 742–749.
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Peña, J.M.; Torres-Sánchez, J.; Serrano-Pérez, A.; De Castro, A.I.; López-Granados, F. Quantifying Efficacy and Limits of Unmanned Aerial Vehicle (UAV) Technology for Weed Seedling Detection as Affected by Sensor Resolution. Sensors 2015, 15, 5609-5626. https://doi.org/10.3390/s150305609