WO2021229248A1 - Method for detection and classification of biotic-abiotic stress in crops from thermal photographs using artificial intelligence - Google Patents
- Publication number
- WO2021229248A1 WO2021229248A1 PCT/GR2021/000024 GR2021000024W WO2021229248A1 WO 2021229248 A1 WO2021229248 A1 WO 2021229248A1 GR 2021000024 W GR2021000024 W GR 2021000024W WO 2021229248 A1 WO2021229248 A1 WO 2021229248A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixels
- crop
- classification
- thermal
- coloring
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 31
- 238000001514 detection method Methods 0.000 title claims description 14
- 230000036579 abiotic stress Effects 0.000 title description 4
- 238000013473 artificial intelligence Methods 0.000 title description 4
- 238000004040 coloring Methods 0.000 claims abstract description 16
- 238000013527 convolutional neural network Methods 0.000 claims abstract description 16
- 238000003064 k means clustering Methods 0.000 claims abstract description 5
- 230000001755 vocal effect Effects 0.000 claims abstract description 4
- 238000012545 processing Methods 0.000 claims description 12
- 239000002689 soil Substances 0.000 claims description 5
- 230000008569 process Effects 0.000 claims description 4
- 230000002902 bimodal effect Effects 0.000 claims description 2
- 238000010276 construction Methods 0.000 claims description 2
- 238000006243 chemical reaction Methods 0.000 claims 1
- 238000000638 solvent extraction Methods 0.000 claims 1
- 235000013311 vegetables Nutrition 0.000 claims 1
- 238000012800 visualization Methods 0.000 abstract 1
- 241000196324 Embryophyta Species 0.000 description 13
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Chemical compound O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 description 6
- 239000002028 Biomass Substances 0.000 description 5
- 230000003595 spectral effect Effects 0.000 description 4
- 230000035882 stress Effects 0.000 description 4
- 208000005156 Dehydration Diseases 0.000 description 3
- 241001237728 Precis Species 0.000 description 3
- 238000011161 development Methods 0.000 description 3
- IJGRMHOSHXDMSA-UHFFFAOYSA-N Atomic nitrogen Chemical compound N#N IJGRMHOSHXDMSA-UHFFFAOYSA-N 0.000 description 2
- CURLTUGMZLYLDI-UHFFFAOYSA-N Carbon dioxide Chemical compound O=C=O CURLTUGMZLYLDI-UHFFFAOYSA-N 0.000 description 2
- 241000209219 Hordeum Species 0.000 description 2
- 235000007340 Hordeum vulgare Nutrition 0.000 description 2
- 238000013459 approach Methods 0.000 description 2
- 229910002092 carbon dioxide Inorganic materials 0.000 description 2
- 229930002875 chlorophyll Natural products 0.000 description 2
- 235000019804 chlorophyll Nutrition 0.000 description 2
- ATNHDLDRLWWWCB-AENOIHSZSA-M chlorophyll a Chemical compound C1([C@@H](C(=O)OC)C(=O)C2=C3C)=C2N2C3=CC(C(CC)=C3C)=[N+]4C3=CC3=C(C=C)C(C)=C5N3[Mg-2]42[N+]2=C1[C@@H](CCC(=O)OC\C=C(/C)CCC[C@H](C)CCC[C@H](C)CCCC(C)C)[C@H](C)C2=C5 ATNHDLDRLWWWCB-AENOIHSZSA-M 0.000 description 2
- 238000007405 data analysis Methods 0.000 description 2
- 230000018109 developmental process Effects 0.000 description 2
- 201000010099 disease Diseases 0.000 description 2
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 description 2
- 238000003384 imaging method Methods 0.000 description 2
- 230000008635 plant growth Effects 0.000 description 2
- 230000005855 radiation Effects 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 238000012552 review Methods 0.000 description 2
- 230000035945 sensitivity Effects 0.000 description 2
- OKTJSMMVPCPJKN-UHFFFAOYSA-N Carbon Chemical compound [C] OKTJSMMVPCPJKN-UHFFFAOYSA-N 0.000 description 1
- 241001092142 Molina Species 0.000 description 1
- 241000209140 Triticum Species 0.000 description 1
- 235000021307 Triticum Nutrition 0.000 description 1
- 241000204362 Xylella fastidiosa Species 0.000 description 1
- 230000004075 alteration Effects 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 230000004790 biotic stress Effects 0.000 description 1
- 238000009529 body temperature measurement Methods 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 229910052799 carbon Inorganic materials 0.000 description 1
- 239000001569 carbon dioxide Substances 0.000 description 1
- 238000000701 chemical imaging Methods 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 238000009313 farming Methods 0.000 description 1
- 230000004907 flux Effects 0.000 description 1
- 230000012010 growth Effects 0.000 description 1
- 238000003709 image segmentation Methods 0.000 description 1
- 208000015181 infectious disease Diseases 0.000 description 1
- 238000003331 infrared imaging Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 229910052757 nitrogen Inorganic materials 0.000 description 1
- 230000004962 physiological condition Effects 0.000 description 1
- 230000008121 plant development Effects 0.000 description 1
- 238000003672 processing method Methods 0.000 description 1
- 238000011002 quantification Methods 0.000 description 1
- 238000004171 remote diagnosis Methods 0.000 description 1
- 230000029058 respiratory gaseous exchange Effects 0.000 description 1
- 230000001932 seasonal effect Effects 0.000 description 1
- 238000001228 spectrum Methods 0.000 description 1
- 208000024891 symptom Diseases 0.000 description 1
- 230000008685 targeting Effects 0.000 description 1
- 230000005068 transpiration Effects 0.000 description 1
- 238000001429 visible spectrum Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N25/00—Investigating or analyzing materials by the use of thermal means
- G01N25/20—Investigating or analyzing materials by the use of thermal means by investigating the development of heat, i.e. calorimetry, e.g. by measuring specific heat, by measuring thermal conductivity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/48—Thermography; Techniques using wholly visual means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
- G06F18/24133—Distances to prototypes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/28—Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/762—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
- G06V10/763—Non-hierarchical techniques, e.g. based on statistics of modelling distributions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30128—Food products
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Medical Informatics (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Probability & Statistics with Applications (AREA)
- Remote Sensing (AREA)
- Molecular Biology (AREA)
- Quality & Reliability (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Bioinformatics & Computational Biology (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The thermal image is converted to grayscale (Fig. 1) and the Otsu method is applied (Fig. 2). The Otsu class of pixels representing the crop is found by selecting the class whose corresponding pixels in the aligned digital photo contain the most shades of green (Fig. 3). From the thermal image metadata, an array containing the coordinates and temperature of each pixel of the selected class is constructed, and k-Means clustering is applied to group the temperatures. Each pixel is assigned a specific color depending on the group it belongs to (Fig. 4). Then, the coloring and coordinates of the pixels representing the highest temperatures are selected, and this pseudo-coloring is placed on the corresponding (same coordinates) pixels of the digital photo (Fig. 5). The pseudo-colored digital photo is fed to a trained Convolutional Neural Network, which classifies stressed sub-areas within the crop with an outline and a verbal label (Fig. 6, C1 stress).
Description
DESCRIPTION
METHOD FOR DETECTION AND CLASSIFICATION OF BIOTIC-ABIOTIC STRESS IN CROPS FROM THERMAL PHOTOGRAPHS USING ARTIFICIAL INTELLIGENCE.
The present invention relates to a method for detection and classification of biotic-abiotic stress in crops from thermal photographs using Artificial Intelligence, and in particular to an automated method of thermal photograph processing and classification using Convolutional Neural Networks (CNNs), in order to construct digital photographs of the crop in standardized pseudo-color, with demarcated contours of the sub-areas and a verbal representation of the stress classification.
Due to the rapid technological development in recent years, various sensors have been developed that can be used to collect data in crops. These include the digital camera (RGB), the spectral sensor, Light Detection and Ranging (LiDAR), and the thermal camera. Some of their applications are the measurement of plant height, biomass, Leaf Area Index (LAI), and other physiological characteristics (Rahaman et al., 2015. Advanced phenotyping and phenotype data analysis for the study of plant growth and development. Front. Plant Sci. 6.; Zhang and Kovacs J.M., 2012. The application of small unmanned aerial systems for precision agriculture: a review. Precis. Agric. 13, 693-712).
The use of digital cameras (RGB) is more common compared to other sensors because of their low cost, light weight, and simple data processing. Their disadvantages are low radiometric resolution and lack of proper calibration (Ballesteros et al., 2014. Applications of georeferenced high-resolution images obtained with unmanned aerial vehicles. Part I: Description of image acquisition and processing. Precis. Agric. 15, 579-592.; Bendig et al., 2014. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 6, 10395-10412.). They can be used for the rapid acquisition of color photographs to calculate the height of the crop, the LAI, and the color of the leaves, so that damaged "dry" leaves can be detected through already developed photo-processing algorithms. However, this approach lags in obtaining phenotype information and crop characteristics because the digital camera cannot capture the non-visible spectrum (Al Hiary et al., 2011a. Fast and Accurate Detection and Classification of Plant Diseases. Int. J. Comput. Appl. 17, 31-38; Singh and Misra, 2017b. Detection of plant leaf diseases using image segmentation and soft computing techniques. Inf. Process. Agric. 4, 41-49.).
Spectral sensors have the ability to receive radiation in the visible and non- visible spectrum so they can be used to obtain the phenotype of a crop (Candiago et al., 2015. Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images. Remote Sens. 7, 4026-4047;). Their main disadvantages are complex data processing and sensitivity to weather conditions (Colomina and Molina, 2014. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 92, 79-97; Nasi et al., 2015. Using UAV-Based Photogrammetry and Hyperspectral Imaging for Mapping Bark Beetle Damage at Tree-Level. Remote Sens. 7, 15467-15493; Thorp et al., 2015. Proximal hyperspectral sensing and data analysis approaches for field- based plant phenomics. Comput. Electron. Agric. 118, 225-236; Zarco-Tejada et al., 2013. Spatio-temporal patterns of chlorophyll fluorescence and physiological and structural indices acquired from hyperspectral imagery as compared with carbon fluxes measured with eddy covariance. Remote Sens. Environ. 133, 102-115.).
The Light Detection and Ranging (LiDAR) sensor determines ranges by targeting an object with a laser and uses photoelectric detection. It can be used to measure biomass and plant height. Its advantage is the effective acquisition of the high-precision horizontal and vertical structure of the vegetation. Its disadvantages are the high cost of acquisition and the large amounts of data to be processed (Ota et al., 2015. Above ground Biomass Estimation Using Structure from Motion Approach with Aerial Photographs in a Seasonal Tropical Forest. Forests 6, 3882-3898.; Wallace et al., 2012. Development of a UAV-LiDAR System with Application to Forest Inventory. Remote Sens. 4, 1519-1543.).
The methodology for analyzing and monitoring chlorophyll content, LAI, and leaf nitrogen content using remote sensing is fully developed (Ballesteros et al., 2014. Applications of georeferenced high-resolution images obtained with unmanned aerial vehicles. Part I: Description of image acquisition and processing. Precis. Agric. 15, 579-592.; Bendig et al., 2014. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 6, 10395-10412.). Therefore, accurate plant growth information can be obtained, because the leaves' spectral characteristics are directly related to the above indices.
The thermal camera uses infrared radiation. Therefore, it can be used to measure the canopy temperature of a crop, the rate at which water vapor leaves the leaves, and the rate at which carbon dioxide (CO2) enters them (Baluja et al., 2012. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrig. Sci. 30, 511-522.). In this way, it is possible to determine the growth status of the crop indirectly. Disadvantages governing the use of the thermal camera are its sensitivity to weather conditions and the difficulty of eliminating the soil's effect (Gonzalez-Dugo et al., 2015. Using High-Resolution Hyperspectral and Thermal Airborne Imagery to Assess Physiological Condition in the Context of Wheat Phenotyping. Remote Sens. 7, 13586-13605.; Jones et al., 2009. Thermal infrared imaging of crop canopies for the remote diagnosis and quantification of plant responses to water stress in the field. Funct. Plant Biol. 36, 978.; Sugiura et al., 2007. Correction of Low-altitude Thermal Images applied to estimating Soil Water Status. Biosyst. Eng. 96, 301-313.).
The treatment method applicable to a thermal image of a crop (over the whole cultivated area, not each plant separately) is the Crop Water Stress Index (CWSI), which determines the water availability of a crop through infrared temperature measurements (thermal photography) of the plant crown. The temperature of the plant's crown is an indicator of the crop's water status because the stomata of the foliage close in response to water depletion, causing a decrease in transpiration and an increase in leaf temperature. On the contrary, adequate water in the soil keeps the stomata open and maintains a high transpiration rate, resulting in a decrease in the foliage's temperature relative to the atmospheric temperature above the crop (Idso et al., 1981. Normalizing the stress-degree-day parameter for environmental variability. Agric. Meteorol. 24, 45-55.). Finally, plants with biotic or abiotic stress have been shown to exhibit higher canopy temperatures than healthy plants (Zarco-Tejada et al., 2012. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 117, 322-337.; Zarco-Tejada et al., 2018. Previsual symptoms of Xylella fastidiosa infection revealed in spectral plant-trait alterations. 7, 432-439.).
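To make the index concrete, a commonly used normalized form of the CWSI is sketched below. This is a general formulation from the crop thermography literature, not necessarily the exact expression used in the works cited above; the wet and dry reference temperatures are assumed to come from reference surfaces or baseline equations, which are outside the scope of this description.

```latex
% A common normalized form of the Crop Water Stress Index (sketch only):
%   T_canopy : measured crown/canopy temperature (from the thermal image)
%   T_wet    : temperature of a well-watered, fully transpiring reference
%   T_dry    : temperature of a non-transpiring (fully stressed) reference
\[
  \mathrm{CWSI} = \frac{T_{\mathrm{canopy}} - T_{\mathrm{wet}}}{T_{\mathrm{dry}} - T_{\mathrm{wet}}},
  \qquad 0 \le \mathrm{CWSI} \le 1
\]
% Values near 0 indicate a well-watered crop; values near 1 indicate severe water stress.
```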
Considering the above, a method was developed to detect biotic-abiotic stress in crops from thermal photographs using Artificial Intelligence and, in particular, Convolutional Neural Networks. The automated thermal photo processing starts by converting the thermal photo (pseudo-coloring) to grayscale (Fig. 1). Then the OTSU method is used to isolate the background, i.e., the soil (Fig. 2). The OTSU algorithm assumes that the image contains two classes of pixels that follow a bimodal histogram (foreground pixels and background pixels), and then calculates the optimal threshold separating the two classes so that their combined intra-class variance is minimal, or equivalently (because the total variance is constant) so that the inter-class variance is maximal.
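As a minimal sketch of these first two steps (not the patented implementation itself), the grayscale conversion and Otsu thresholding could look as follows in Python with OpenCV; the file name `thermal.png` is a hypothetical placeholder for the exported pseudo-colored thermal photo.

```python
import cv2

# Load the pseudo-colored thermal photo and convert it to grayscale (Fig. 1).
thermal_bgr = cv2.imread("thermal.png")  # hypothetical file name
gray = cv2.cvtColor(thermal_bgr, cv2.COLOR_BGR2GRAY)

# Otsu's method picks the threshold that minimizes the combined intra-class
# variance (equivalently maximizes inter-class variance), splitting the
# bimodal histogram into background (soil) and foreground (crop) pixels (Fig. 2).
threshold, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# `binary` labels each pixel 0 or 255; which of the two classes is the crop
# is decided in the next step using the aligned RGB photo (Fig. 3).
print("Otsu threshold:", threshold)
```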
Finding the class of pixels of the OTSU method that represents the crop is done by comparing the OTSU method's two classes of pixels with the corresponding (same coordinates) pixels of the aligned digital photograph (Fig. 3). More specifically, a comparison is made between the two classes of pixels of the aligned digital photo (RGB), which has been converted to the HSV color model (Hue, Saturation, Value), in order to find the class that has the most pixels in shades of green. After finding the class with the most pixels in shades of green, the corresponding class of thermal photograph pixels is selected. Then, the metadata of the thermal image, i.e., the temperature of each pixel of the class selected by the above procedure, are exported. In this way, a table is constructed which contains the coordinates and the temperature of each pixel.
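A hedged sketch of this step is given below. It assumes the Otsu binary mask and the aligned RGB photo are NumPy arrays of the same size, that the per-pixel temperatures have already been extracted from the thermal image metadata by a camera-specific tool (that extraction is not shown), and that the HSV range used for "shades of green" is an illustrative choice rather than a value specified in the patent.

```python
import cv2
import numpy as np

def crop_pixel_table(binary_mask, rgb_aligned, temperatures):
    """Select the Otsu class representing the crop and build a
    (row, col, temperature) table for its pixels.

    `temperatures` is assumed to be a float array of per-pixel temperatures
    already extracted from the thermal image metadata (camera-specific step,
    not shown here).
    """
    hsv = cv2.cvtColor(rgb_aligned, cv2.COLOR_BGR2HSV)

    # Rough "shades of green" range in OpenCV's HSV space (H in [0, 179]);
    # the exact bounds are an illustrative assumption.
    green = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255)) > 0

    class_a = binary_mask > 0   # Otsu foreground pixels
    class_b = ~class_a          # Otsu background pixels

    # The crop class is the one whose same-coordinate RGB pixels are greener.
    crop = class_a if green[class_a].sum() >= green[class_b].sum() else class_b

    rows, cols = np.nonzero(crop)
    return np.column_stack([rows, cols, temperatures[rows, cols]])
```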
Next, the k-Means clustering method is applied to the data in the table above. The k-Means clustering method groups the pixel temperatures into a specific number of groups. Clustering is the process of organizing patterns (here, temperatures) into groups whose members are similar. Depending on the group it belongs to, a specific color is assigned to each pixel (e.g., pixels belonging to the group with the lowest temperature are assigned dark green). The purpose of using standard pseudo-coloring in crop pixels is to normalize the temperatures of any thermal photograph depicting a crop or crops.
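The clustering and color assignment could be sketched as follows using scikit-learn. The number of groups and the palette running from dark green (coolest) to red (hottest) are illustrative assumptions; the description above only gives dark green for the coldest group as an example.

```python
import numpy as np
from sklearn.cluster import KMeans

def pseudo_color_clusters(pixel_table, n_groups=5):
    """Cluster the per-pixel temperatures into `n_groups` groups and assign a
    fixed color to each group, from dark green (coolest) to red (hottest).

    `pixel_table` is the (row, col, temperature) array built in the previous
    step; the palette and the number of groups are illustrative assumptions.
    """
    temps = pixel_table[:, 2].reshape(-1, 1)
    labels = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit_predict(temps)

    # Order the clusters by mean temperature so index 0 is always the coolest.
    order = np.argsort([temps[labels == k].mean() for k in range(n_groups)])
    rank = {cluster: i for i, cluster in enumerate(order)}

    palette = np.array([[0, 100, 0],     # dark green (coolest group)
                        [154, 205, 50],  # yellow-green
                        [255, 255, 0],   # yellow
                        [255, 140, 0],   # orange
                        [255, 0, 0]])    # red (hottest group)

    colors = palette[[rank[label] for label in labels]]
    return np.hstack([pixel_table[:, :2], colors])  # (row, col, R, G, B)
```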
Using the coordinates and the coloring of each pixel, a new photo is created. The new normalized photo shows only the crop's temperature fluctuation with standard pseudo-coloring (Fig. 4). Then, the coloring and the coordinates of the pixels representing the highest temperatures are selected, and the pseudo-coloring is placed on the corresponding (same coordinates) pixels of the digital photo (Fig. 5). The pseudo-colored digital photo is then fed to a trained Convolutional Neural Network, which classifies the stressed sub-areas within the crop and delimits each stressed region with a contour.
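A sketch of the final overlay and classification step is shown below. The trained network and the class names are placeholders, since the CNN architecture is not specified in this description, and the colored-pixel table is assumed to have already been restricted to the highest-temperature groups.

```python
import numpy as np
import torch

def overlay_and_classify(rgb_aligned, colored_pixels, model, class_names):
    """Paint the standardized pseudo-coloring of the hottest pixel groups onto
    the aligned digital photo (Fig. 5) and classify it with a trained CNN.

    `model` is assumed to be an already-trained torch.nn.Module producing one
    score per stress class; `colored_pixels` is a (row, col, R, G, B) array
    assumed to be restricted to the highest-temperature groups.
    """
    overlay = rgb_aligned.copy()
    rows = colored_pixels[:, 0].astype(int)
    cols = colored_pixels[:, 1].astype(int)
    overlay[rows, cols] = colored_pixels[:, 2:5]  # same-coordinate pixels

    # HWC uint8 image -> NCHW float tensor in [0, 1] for the network.
    x = torch.from_numpy(overlay).permute(2, 0, 1).unsqueeze(0).float() / 255.0
    with torch.no_grad():
        scores = model(x)

    return overlay, class_names[int(scores.argmax())]  # e.g., "C1 stress"
```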
The final result is the digital photograph of standardized pseudo-coloring of the crop with delimited outlines of the sub-areas and a verbal representation of the stress classification (Fig. 6, C1 stress).
Claims
1. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photograph processing using Convolutional Neural Networks - CNNs, which includes the following phases:
a. conversion of the thermal photograph (pseudo-coloring) to grayscale (Fig. 1).
b. use of the OTSU method to isolate the background, which is the soil (Fig. 2). The OTSU algorithm assumes that the image contains two classes of pixels following a bimodal histogram (foreground pixels and background pixels).
c. finding the class of pixels of the OTSU method representing the crop by comparing the OTSU method's two classes of pixels with the corresponding (same coordinates) pixels of the aligned digital photograph (Fig. 3). More specifically, a comparison is made between the two classes of pixels of the aligned digital photo (RGB) converted to the HSV color model (Hue, Saturation, Value), to find the class that has the most pixels in shades of green. After finding the class with the most pixels in shades of green, the corresponding class of thermal photograph pixels is selected.
d. exporting the metadata of the thermal image, i.e., the temperature of each pixel of the class selected by the above procedure.
e. construction of an array that contains the coordinates and the temperature of each pixel.
f. application of the k-means clustering method to the data of the above array. The k-means clustering method groups the pixel temperatures into a specific number of groups. Clustering is the process of grouping similar objects (temperatures) into different groups, or more precisely, the partitioning of a data set into subsets so that the data in each subset are similar according to some defined distance measure.
g. assigning a specific color to each pixel depending on the group it belongs to; for example, the pixel that belongs to the group with the lowest temperature is assigned dark green. The purpose of using standard pseudo-coloring in crop pixels is to normalize the temperatures of any thermal photograph depicting a crop or crops.
h. construction of a new photo using the coordinates and coloring of each pixel. The new normalized photo (Fig. 4) depicts only the crop with standard pseudo-coloring.
i. selection of the coloring and the coordinates of the pixels representing the highest temperatures and placement of the pseudo-color on the corresponding (same coordinates) pixels of the digital photograph (Fig. 5).
j. feeding of the digital pseudo-colored photo to a trained Convolutional Neural Network.
k. classification of stressed sub-areas within the crop with a specific delimitation of the stressed region with a contour.
l. illustration of the digital photograph of standardized pseudo-color of the crop with demarcated contours of the sub-areas and a verbal representation of stress classification (Fig. 6, C1 stress).
2. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photograph processing using Convolutional Neural Networks - CNNs according to claim 1, wherein the crop is a vineyard.
3. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photograph processing using Convolutional Neural Networks - CNNs according to claim 1, wherein the crop is a tree crop.
4. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photograph processing using Convolutional Neural Networks - CNNs according to claim 1, wherein the crop is an industrial crop.
5. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photograph processing using Convolutional Neural Networks - CNNs according to claim 1, wherein the crop is a vegetable crop.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GR20200100243A GR1009898B (en) | 2020-05-12 | 2020-05-12 | Method of detection and evaluation of the biotic-abiotic stress in cultivations via thermal photographs and use of artificial intelligence |
GR20200100243 | 2020-05-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021229248A1 true WO2021229248A1 (en) | 2021-11-18 |
Family
ID=75107708
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GR2021/000024 WO2021229248A1 (en) | 2020-05-12 | 2021-04-28 | Method for detection and classification of biotic - abiotic stress in crops from thermal photographs using artificial intelligence |
Country Status (2)
Country | Link |
---|---|
GR (1) | GR1009898B (en) |
WO (1) | WO2021229248A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115200722A (en) * | 2022-09-16 | 2022-10-18 | 江苏宸洋食品有限公司 | Temperature measuring method and refrigerator car temperature measuring system applying same |
KR102631597B1 (en) * | 2023-06-29 | 2024-02-02 | 주식회사 리플로그 | Strawberry stress index calculation method and cultivation management system using chlorophyll fluorescence value |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1263962A1 (en) * | 2000-02-25 | 2002-12-11 | Avestha Gengraine Technologies PVT Ltd | A process for constructing dna based molecular marker for enabling selection of drought and diseases resistant germplasm screening |
CN101727665B (en) * | 2008-10-27 | 2011-09-07 | 广州飒特电力红外技术有限公司 | Method and device for fusing infrared images and visible light images |
KR101729169B1 (en) * | 2014-10-29 | 2017-05-11 | 서울대학교산학협력단 | Method of diagnosing responses of plants to abiotic stress or herbicide using thermal image |
CN105719304B (en) * | 2016-01-25 | 2018-04-13 | 中山大学 | A kind of flower image dividing method based on Otsu |
CN105678793B (en) * | 2016-02-26 | 2019-01-15 | 浙江大学 | A kind of method of early diagnosis and device of the Prospect on Kiwifruit Bacterial Canker based on image co-registration |
KR101830056B1 (en) * | 2017-07-05 | 2018-02-19 | (주)이지팜 | Diagnosis of Plant disease using deep learning system and its use |
FR3069940B1 (en) * | 2017-08-03 | 2019-09-06 | Universite D'orleans | METHOD AND SYSTEM FOR CARTOGRAPHY OF THE HEALTH CONDITION OF CULTURES |
CN107767364B (en) * | 2017-09-12 | 2021-03-23 | 中国林业科学研究院林业研究所 | Method for accurately extracting temperature of tree canopy based on infrared thermal image |
CN108537777A (en) * | 2018-03-20 | 2018-09-14 | 西京学院 | A kind of crop disease recognition methods based on neural network |
- 2020
  - 2020-05-12 GR GR20200100243A patent/GR1009898B/en active IP Right Grant
- 2021
  - 2021-04-28 WO PCT/GR2021/000024 patent/WO2021229248A1/en active Application Filing
Non-Patent Citations (21)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115200722A (en) * | 2022-09-16 | 2022-10-18 | 江苏宸洋食品有限公司 | Temperature measuring method and refrigerator car temperature measuring system applying same |
KR102631597B1 (en) * | 2023-06-29 | 2024-02-02 | 주식회사 리플로그 | Strawberry stress index calculation method and cultivation management system using chlorophyll fluorescence value |
Also Published As
Publication number | Publication date |
---|---|
GR1009898B (en) | 2021-01-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230292647A1 (en) | System and Method for Crop Monitoring | |
Xu et al. | Multispectral imaging and unmanned aerial systems for cotton plant phenotyping | |
Solano et al. | A methodology based on GEOBIA and WorldView-3 imagery to derive vegetation indices at tree crown detail in olive orchards | |
Onishi et al. | Automatic classification of trees using a UAV onboard camera and deep learning | |
de Oca et al. | The AgriQ: A low-cost unmanned aerial system for precision agriculture | |
CN109325431B (en) | Method and device for detecting vegetation coverage in feeding path of grassland grazing sheep | |
Chauhan et al. | Wheat lodging assessment using multispectral UAV data | |
Ponti et al. | Precision agriculture: Using low-cost systems to acquire low-altitude images | |
Shirzadifar et al. | Field identification of weed species and glyphosate-resistant weeds using high resolution imagery in early growing season | |
WO2021229248A1 (en) | Method for detection and classification of biotic - abiotic stress in crops from thermal photographs using artificial intelligence | |
Izzuddin et al. | Analysis of multispectral imagery from unmanned aerial vehicle (UAV) using object-based image analysis for detection of ganoderma disease in oil palm | |
Tian et al. | Machine learning-based crop recognition from aerial remote sensing imagery | |
Hanapi et al. | A review on remote sensing-based method for tree detection and delineation | |
Preethi et al. | An comprehensive survey on applications of precision agriculture in the context of weed classification, leave disease detection, yield prediction and UAV image analysis | |
Silva et al. | Mapping two competing grassland species from a low-altitude Helium balloon | |
Lobitz et al. | Grapevine remote sensing analysis of phylloxera early stress (GRAPES): remote sensing analysis summary | |
Cimtay et al. | A new vegetation index in short-wave infrared region of electromagnetic spectrum | |
Aziz et al. | Detection of Bacterial Leaf Blight Disease Using RGB-Based Vegetation Indices and Fuzzy Logic | |
Yano et al. | Weed identification in sugarcane plantation through images taken from remotely piloted aircraft (RPA) and KNN classifier | |
Jurišić et al. | The evaluation of the RGB and multispectral camera on the unmanned aerial vehicle (UAV) for the machine learning classification of Maize | |
Yang et al. | Using multispectral imagery and linear spectral unmixing techniques for estimating crop yield variability | |
CN112577954B (en) | Urban green land biomass estimation method | |
López et al. | Multi-Spectral Imaging for Weed Identification in Herbicides Testing | |
Ghasemloo et al. | Vegetation species determination using spectral characteristics and artificial neural network (SCANN) | |
Papić et al. | On Olive Groves Analysis using UAVs |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21726448; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21726448; Country of ref document: EP; Kind code of ref document: A1