Early-Stage Identification of Powdery Mildew Levels for Cucurbit Plants in Open-Field Conditions Based on Texture Descriptors
Figure 1. Proposed methodology for PM damage level detection: collected images are used for feature extraction and selection, a multiclass classification is applied to the selected features, and a performance evaluation is carried out to verify the optimal classification.
Figure 2. Timeline of the sampling days and phenological growth stages used to identify PM damage levels. The phenological stages (S1 to S8) and the sampling days (D1 to D19) serve as the baseline. Four PM damage levels are then defined: T1 for healthy leaves, T2 for leaves with spores in germination, T3 for leaves with first symptoms, and T4 for diseased leaves.
Figure 3. Visual evaluation of cucurbit leaves for the four PM damage levels: (a) T1, healthy leaves; (b) T2, leaves with spores in germination; (c) T3, leaves with first symptoms; and (d) T4, diseased leaves.
Figure 4. Exploration of the leaf by parts for selection of the region of interest (ROI): (a) division of the leaf into the central part (R1), lower right lobe (R2), upper right lobe (R3), upper central lobe (R4), upper left lobe (R5), and lower left lobe (R6); (b) first symptoms at R4.
Figure 5. Preprocessing of the ROI images, starting with the color transformation and separation of color components (CCs): the sample image I(x, y) is the original image; the ROI analysis yields a new RGB sample R(s, t); a contrast adjustment C(p, q) is then applied to obtain the transformed image T(s, t) in the different color spaces (G(i, j), L(i, j), H(i, j), Y(i, j)), followed by separation into color components.
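As a rough illustration of this preprocessing chain (ROI image, contrast adjustment, color transformation, and component separation), the following Python sketch uses OpenCV. The file name and the linear contrast stretch are assumptions for illustration, not the authors' exact procedure.

```python
import cv2
import numpy as np

def stretch_contrast(img):
    """Simple linear contrast adjustment (an assumed stand-in for C(p, q))."""
    img = img.astype(np.float32)
    lo, hi = img.min(), img.max()
    return np.uint8(255 * (img - lo) / (hi - lo + 1e-6))

roi = cv2.imread("roi_sample.png")             # hypothetical ROI image R(s, t), BGR order
roi = stretch_contrast(roi)

gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)   # gray image G(i, j)
hsv  = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)    # HSV image H(i, j)
lab  = cv2.cvtColor(roi, cv2.COLOR_BGR2LAB)    # L*a*b image L(i, j)
ycc  = cv2.cvtColor(roi, cv2.COLOR_BGR2YCrCb)  # YCbCr image Y(i, j)

# Separation into color components (CCs), e.g. the HSV channels
h, s, v = cv2.split(hsv)
```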
Figure 6. Calculation of the GLCM from a gray image with distance d = 1 and angle θ = 0: (a) gray image, (b) gray levels I(x, y), and (c) GLCM with the paired pixels g(i, j).
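A minimal sketch of this computation, assuming scikit-image is available; the 4x4 example image is illustrative and not the one shown in the figure.

```python
import numpy as np
from skimage.feature import graycomatrix  # named greycomatrix in older scikit-image releases

# Small illustrative gray-level image with 4 gray levels
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=np.uint8)

# Pair each pixel with its right-hand neighbour: distance d = 1, angle theta = 0
glcm = graycomatrix(img, distances=[1], angles=[0], levels=4,
                    symmetric=False, normed=False)

print(glcm[:, :, 0, 0])  # g(i, j): counts of co-occurring gray-level pairs
```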
Figure 7. Processed images I(x, y) and U(s, t); color transformation H(s, t); components H1(s, t), H2(s, t), and H3(s, t); and their GLCM matrices G1(i, j), G2(i, j), and G3(i, j) with 255 gray levels.
Figure 8. Feature extraction process applied to the color component images.
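A sketch of how GLCM-based texture descriptors (TDs) can be extracted from one color component, assuming scikit-image; the function name, the choice of d = 1 and θ = 0, and the two hand-coded descriptors are illustrative rather than the authors' exact implementation.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_descriptors(channel, levels=256):
    """Compute a few GLCM texture descriptors for a single color component."""
    glcm = graycomatrix(channel, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    # Descriptors provided directly by scikit-image
    tds = {abbr: graycoprops(glcm, prop)[0, 0]
           for abbr, prop in [("cont", "contrast"), ("diss", "dissimilarity"),
                              ("homo", "homogeneity"), ("ener", "energy"),
                              ("corr", "correlation")]}
    # Two further descriptors computed from the normalized GLCM p(i, j)
    p = glcm[:, :, 0, 0]
    tds["entr"] = -np.sum(p[p > 0] * np.log2(p[p > 0]))              # entropy
    i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    tds["auto"] = np.sum(i * j * p)                                   # autocorrelation
    return tds

# Usage: tds = texture_descriptors(h)  # e.g. h = hue component from the preprocessing step
```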
Figure 9. Feature selection process: a Lilliefors normality test, followed by an analysis of variance (ANOVA) and Tukey's test.
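A minimal sketch of this three-step selection for one texture descriptor, assuming SciPy and statsmodels; the synthetic values for T1 to T4 are placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.diagnostic import lilliefors
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical values of one texture descriptor for the four damage levels T1..T4
rng = np.random.default_rng(0)
t1, t2, t3, t4 = (rng.normal(m, 0.05, 40) for m in (0.20, 0.35, 0.50, 0.65))

# 1) Lilliefors test: is the descriptor normally distributed within each level?
for name, grp in zip(("T1", "T2", "T3", "T4"), (t1, t2, t3, t4)):
    _, p = lilliefors(grp)
    print(f"{name}: p = {p:.3f} ({'normal' if p > 0.05 else 'not normal'})")

# 2) One-way ANOVA: do the damage-level means differ at all?
f_stat, p_anova = f_oneway(t1, t2, t3, t4)
print(f"F = {f_stat:.1f}, p = {p_anova:.4f}")

# 3) Tukey's test: which pairs of damage levels are significantly different?
values = np.concatenate([t1, t2, t3, t4])
labels = np.repeat(["T1", "T2", "T3", "T4"], 40)
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```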
Figure 10. Results of the ANOVA and Tukey's test: (a) mean values of the damage levels for diss-BB; (b) Tukey's test, where the means of the damage levels are significantly different; (c) mean values of the damage levels for auto-A; (d) Tukey's test, where the means of T2, T3, and T4 are equal but significantly different from T1.
Figure 11. Kernel selection in the multiclassification system with feature vectors from different color spaces and the optimal hyperplane: (a) linear kernel in 2D with diss-BB versus cont-V; (b) 3D optimal hyperplane; (c) training and validation data with the error for the SVM T1 versus T2; (d) polynomial kernel in 2D with auto-V versus savg-G; (e) 3D optimal hyperplane; (f) training and validation data with the error for the SVM T3 versus T4; (g) sigmoid kernel in 2D with ener-GG versus dvar-A; (h) 3D optimal hyperplane; (i) training and validation data with the error for the SVM T2 versus T3; (j) radial basis function (RBF) kernel in 2D with diss-Y versus inf1-BB; (k) 3D optimal hyperplane; and (l) training and validation data with the error for the SVM T2 versus T4.
Figure 12. Kernel selection in the multiclassification system with feature vectors from the same color space and the optimal hyperplane: (a) radial basis function kernel in 2D with auto-V versus dent-S; (b) 3D optimal hyperplane; (c) training and validation data with the error for the SVM T3 versus T4; (d) linear kernel in 2D with idmn-G versus diss-G; (e) 3D optimal hyperplane; (f) training and validation data with the error for the SVM T1 versus T2; (g) polynomial kernel in 2D with dvar-CR versus homo-Y; (h) 3D optimal hyperplane; (i) training and validation data with the error for the SVM T2 versus T4; (j) radial basis function kernel in 2D with ener-V versus entr-S; (k) 3D optimal hyperplane; and (l) training and validation data with the error for the SVM T1 versus T2.
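A sketch of how such a kernel comparison can be run for one binary SVM, assuming scikit-learn; the synthetic two-descriptor data, the C value, and the kernel parameters are illustrative, not the values reported in the tables below.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Synthetic 2-D feature vectors standing in for two texture descriptors, e.g. T1 vs T2
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (60, 2)), rng.normal(2.0, 1.0, (60, 2))])
y = np.array([0] * 60 + [1] * 60)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

# Compare the four kernel families on the validation split
for kernel, extra in [("linear", {}), ("poly", {"degree": 4}),
                      ("sigmoid", {}), ("rbf", {"gamma": 0.5})]:
    clf = make_pipeline(StandardScaler(), SVC(kernel=kernel, C=1.0, **extra))
    clf.fit(X_tr, y_tr)
    err = 100.0 * (1.0 - clf.score(X_va, y_va))
    print(f"{kernel:8s} validation error = {err:5.1f} %")
```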
Figure 13. One-versus-one multiclassification method. The main inputs are the support vectors s1, ..., s6, the validation data for each binary classifier M1, ..., M6, and σ. Each block V1, ..., V4 contains the different support vector machines for multiple classification.
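For four damage levels, a one-versus-one scheme trains 4(4-1)/2 = 6 binary SVMs and lets them vote on each sample. A minimal sketch under that assumption, using scikit-learn with synthetic data in place of the texture feature vectors:

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import SVC

# Four synthetic classes standing in for the damage levels T1..T4,
# with six features per sample as in the proposed feature vectors
X, y = make_blobs(n_samples=400, centers=4, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# One binary RBF SVM per pair of classes -> 6 machines in total
ovo = OneVsOneClassifier(SVC(kernel="rbf", gamma=0.5, C=1.0))
ovo.fit(X_tr, y_tr)

print(len(ovo.estimators_), "binary SVMs")   # 6
print("test accuracy:", ovo.score(X_te, y_te))
```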
Figure 14. SVM binary classifiers: (a) test data F1 and SVM-classified data, (b) test data F2 and SVM-classified data, (c) test data F3 and SVM-classified data, (d) test data F4 and SVM-classified data, and (e) test data F5 and SVM-classified data.
Figure 15. SVM binary classifiers with components of the same color space: (a) test data G1 and SVM-classified data, (b) test data G2 and SVM-classified data, (c) test data G3 and SVM-classified data, (d) test data G4 and SVM-classified data, and (e) test data G5 and SVM-classified data.
Abstract
1. Introduction
Literature Review
2. Materials and Methods
2.1. Acquisition
2.2. Proposed Powdery Mildew Damage Levels
2.3. Preprocessing
2.4. Feature Extraction
2.5. Feature Selection
2.6. Formation of the Feature Vectors
2.7. Proposed Multiclass Classification Framework
2.8. Performance Evaluation
3. Results
3.1. Different Color Space Feature Vectors
3.2. Same Color Space Feature Vectors
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
PM | Powdery mildew |
RGB | Red, green, and blue |
HSV | Hue, saturation, and value |
L*a*b | Lightness, green-red chrominance, and blue-yellow chrominance |
YCbCr | Luma component, Cb and Cr chroma components |
ANOVA | Analysis of variance |
ROI | Region of interest |
GLCM | Gray-level co-occurrence matrix |
CC | Color component |
TD | Texture descriptor |
TDs | Texture Descriptors
---|---
auto | Autocorrelation
cont | Contrast
corr | Correlation
cpro | Cluster Prominence
csha | Cluster Shade
diss | Dissimilarity
ener | Energy
entr | Entropy
homo | Homogeneity
maxp | Maximum Probability
sosv | Sum of Squares
savg | Sum Average
svar | Sum Variance
sent | Sum Entropy
dvar | Difference Variance
dent | Difference Entropy
inf1 | Information Measure of Correlation 1
inf2 | Information Measure of Correlation 2
indn | Inverse Difference Normalized
idmn | Inverse Difference Moment Normalized
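The equation column of this table did not survive extraction. For reference, the standard textbook definitions of several of these descriptors, written in terms of the normalized GLCM entries p(i, j), are given below; these are the conventional Haralick-style forms and may differ in notation from the authors' exact equations.

```latex
\begin{align*}
\text{cont} &= \sum_{i,j} (i-j)^2\, p(i,j) &
\text{diss} &= \sum_{i,j} |i-j|\, p(i,j) \\
\text{homo} &= \sum_{i,j} \frac{p(i,j)}{1+(i-j)^2} &
\text{ener} &= \sum_{i,j} p(i,j)^2 \\
\text{entr} &= -\sum_{i,j} p(i,j)\,\log p(i,j) &
\text{auto} &= \sum_{i,j} i\,j\, p(i,j) \\
\text{corr} &= \sum_{i,j} \frac{(i-\mu_i)(j-\mu_j)\, p(i,j)}{\sigma_i\,\sigma_j} &
\text{maxp} &= \max_{i,j}\, p(i,j)
\end{align*}
```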
Columns grouped by color space: Gray: G; RGB: R, GG, BB; HSV: H, S, V; L*a*b: L, A, B; YCbCr: Y, CB, CR.

TDs | G | R | GG | BB | H | S | V | L | A | B | Y | CB | CR
---|---|---|---|---|---|---|---|---|---|---|---|---|---
diss | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0
 | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0
 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 1 | 1
homo | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
idmn | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1
Feature | F Statistic | | | |
---|---|---|---|---|---
ener | 184.7 | a | b | c | d
corr | 174.7 | a | b | c | d
homo | 171.2 | a | b | c | d
corr | 158.6 | a | b | c | d
ener | 143.2 | a | b | c | d
ener | 142.6 | a | b | c | d
dent | 134.5 | a | b | c | d
sosv | 71.4 | a | b | c | d
dvar | 125.5 | a | b | c | d
idmn | 124.4 | a | b | c | d
cpro | 122.4 | a | b | c | d
homo | 119.4 | a | b | c | d
entr | 112.7 | a | b | c | d
homo | 111.3 | a | b | c | d
cont | 109.7 | a | b | c | d
dvar | 109.7 | a | b | c | d
dvar | 105.9 | a | b | c | d
Combination | Vector | Features
---|---|---
Different color space combinations | | auto, dent, svar, savg, sosv, savg
 | | entr, homo, idmn, idmn, dvar, cont
 | | dvar, cont, idmn, idmn, idmn, cont
 | | cont, homo, entr, homo, cpro, idmn
 | | cont, dvar, dent, ener, ener, corr
Same color space combinations | | sent, entr, idmn, cont, diss, inf
 | | dent, entr, auto, svar, sosv, ener
 | | diss, savg, idmn, cont, dvar, ener
 | | diss, homo, corr, idmn, dvar, cont
 | | diss, savg, idmn, cont, dvar, homo
Kernel | SVM | p | R | h | % Error | |
---|---|---|---|---|---|---|---
Linear | - | 433.36 | 1.0 × 10^15 | 0.0 + 90.74 | 2.4 × 10^12 | 17.4 |
Linear | - | 448.28 | 1.1 × 10^15 | 0.0 + 9.53i | 2.6 × 10^12 | 18.6 |
Polynomial | 4 | 3917 | 1.2 × 10^16 | 0.0 + 3.0i | 3105.6 | 30 |
Polynomial | 4 | 3989 | 7.1 × 10^16 | 0.0 + 8.7i | 1801.79 | 20.2 |
Sigmoidal | 1 | 2192.8 | 1096.5 | 0.0 + 1.67i | 500 | 17.6 |
Sigmoidal | 7 | 1614.5 | 8072.3 | 0.0 + 9.1i | 500 | 43.4 |
RBF | 0.5 | 0.9978 | 460.17 | 4.1048 | 461.18 | 0 |
RBF | 0.5 | 0.9977 | 464.95 | 4.1237 | 465.98 | 0 |
RBF | 0.5 | 0.9979 | 493.85 | 4.2359 | 494.87 | 0 |
RBF | 0.5 | 0.9979 | 487.92 | 4.2132 | 488.94 | 0 |
RBF | 1 | 0.9972 | 413.45 | 3.9135 | 414.60 | 0 |
RBF | 1 | 0.9974 | 456.07 | 4.0885 | 457.22 | 0 |
Kernel | SVM | p | R | h | % Error | |
---|---|---|---|---|---|---|---
Linear | - | 478.89 | 1.1 × 10^15 | 0.0 + 9.74i | 2.4 × 10^12 | 16.8 |
Linear | - | 455.57 | 1.0 × 10^15 | 0.0 + 8.59i | 2.2 × 10^12 | 15 |
Polynomial | 6 | 9.95 × 10^18 | 4.97 × 10^21 | 0.0 + 26i | 500 | 16 |
Polynomial | 6 | 9.28 × 10^18 | 1.24 × 10^15 | 0.0 + 98.2i | 0.0001 | 0 |
Sigmoidal | 3 | 2137.45 | 1065.1 | 0.0 + 11.8i | 500 | 17.2 |
Sigmoidal | 3 | 2104.09 | 1052.5 | 0.0 + 11.1i | 500 | 17.8 |
RBF | 1 | 0.9957 | 458.48 | 4.098 | 460.42 | 0 |
RBF | 0.5 | 0.9979 | 469.96 | 4.1434 | 470.95 | 0 |
RBF | 0.5 | 0.9978 | 491.79 | 4.2280 | 492.79 | 0 |
RBF | 0.5 | 0.9979 | 488.64 | 4.2160 | 489.64 | 0 |
RBF | 2 | 0.9799 | 34,676.99 | 25.8 | 35,385.5 | 0 |
RBF | 1 | 0.9962 | 764.31 | 5.1414 | 767.17 | 0 |
Vector | (%) | |||||
---|---|---|---|---|---|---|
93.1 | 0.832 | 0.965 | 0.887 | 0.035 | 85.8 | |
88.4 | 0.700 | 0.945 | 0.811 | 0.055 | 75.1 | |
88.9 | 0.682 | 0.958 | 0.844 | 0.042 | 75.5 | |
90.0 | 0.728 | 0.957 | 0.850 | 0.043 | 78.4 | |
91.2 | 0.754 | 0.964 | 0.875 | 0.036 | 81.0 |
Vector | ||||||
---|---|---|---|---|---|---|
87.3 | 0.625 | 0.956 | 0.824 | 0.044 | 0.711 | |
90.8 | 0.776 | 0.952 | 0.843 | 0.048 | 0.808 | |
94.4 | 0.877 | 0.967 | 0.898 | 0.033 | 0.887 | |
91.4 | 0.752 | 0.968 | 0.887 | 0.032 | 0.814 | |
87.3 | 0.678 | 0.938 | 0.786 | 0.062 | 0.728 |
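The column headers of the two tables above did not survive extraction, but the values are consistent with standard confusion-matrix metrics (accuracy, sensitivity, specificity, precision, false positive rate) plus an overall agreement score. As a minimal sketch of how such metrics and Cohen's kappa are typically computed from a confusion matrix (the function and variable names are illustrative, not the authors' code):

```python
import numpy as np

def binary_metrics(tp, fn, fp, tn):
    """Per-class metrics from the cells of a binary confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision   = tp / (tp + fp)
    fpr         = 1.0 - specificity                 # false positive rate
    accuracy    = (tp + tn) / (tp + fn + fp + tn)
    return accuracy, sensitivity, specificity, precision, fpr

def cohen_kappa(cm):
    """Cohen's kappa from a confusion matrix (rows = true class, columns = predicted)."""
    cm = np.asarray(cm, dtype=float)
    n  = cm.sum()
    po = np.trace(cm) / n                                   # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2   # agreement expected by chance
    return (po - pe) / (1.0 - pe)
```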
b) SVM- | Classified | % Correct | ||||
71 | 6 | 0 | 2 | 79 | 89.87 | |
2 | 9 | 0 | 0 | 11 | 81.82 | |
0 | 0 | 12 | 0 | 12 | 100.00 | |
2 | 0 | 0 | 2 | 4 | 50.00 | |
Test data | 75 | 15 | 12 | 4 | 106 | |
% Correct | 94.67 | 60.00 | 100.00 | 50.00 | 88.68 | |
d) SVM- | Classified | % Correct | ||||
46 | 5 | 2 | 2 | 55 | 83.64 | |
4 | 7 | 0 | 0 | 11 | 63.64 | |
3 | 0 | 7 | 0 | 10 | 70.00 | |
2 | 0 | 0 | 17 | 19 | 89.47 | |
Test data | 55 | 12 | 9 | 19 | 95 | |
% Correct | 83.64 | 58.33 | 77.78 | 89.47 | 81.05 | |
f) SVM- | Classified | % Correct | ||||
92 | 4 | 1 | 1 | 98 | 93.88 | |
4 | 5 | 3 | 0 | 12 | 41.67 | |
2 | 1 | 0 | 0 | 3 | 0.00 | |
3 | 0 | 0 | 6 | 9 | 66.67 | |
Test data | 101 | 10 | 4 | 7 | 122 | |
% Correct | 91.09 | 50.00 | 0.00 | 85.71 | 84.43 | |
b) SVM- | Classified | % Correct | ||||
74 | 4 | 2 | 1 | 81 | 91.36 | |
3 | 3 | 1 | 0 | 7 | 42.86 | |
1 | 1 | 6 | 0 | 8 | 75.00 | |
1 | 0 | 2 | 8 | 11 | 72.73 | |
Test data | 79 | 8 | 11 | 9 | 107 | |
% Correct | 93.67 | 37.50 | 54.55 | 88.89 | 85.05 | |
d) SVM- | Classified | % Correct | ||||
86 | 3 | 2 | 0 | 91 | 94.51 | |
0 | 0 | 0 | 0 | 0 | 0.00 | |
6 | 0 | 0 | 0 | 6 | 0.00 | |
3 | 0 | 0 | 12 | 15 | 80.00 | |
Test data | 95 | 3 | 2 | 12 | 112 | |
% Correct | 90.53 | 0.00 | 0.00 | 100.00 | 87.50 | |
Time | 5–6 ms |
Vector | Features | Accuracy | Kappa | % Correct
---|---|---|---|---
 | auto, dent, svar, savg, sosv, savg | 0.931 | 0.7874 | 88.68
 | cont, dvar, dent, ener, ener, corr | 0.912 | 0.7841 | 87.50
 | diss, savg, idmn, cont, dvar, ener | 0.944 | 0.7638 | 89.76
 | diss, homo, corr, idmn, dvar, cont | 0.914 | 0.7835 | 88.68
b) SVM- | Classified | % Correct | ||||
56 | 3 | 2 | 1 | 62 | 90.32 | |
0 | 2 | 0 | 0 | 2 | 100.00 | |
6 | 0 | 6 | 0 | 12 | 50.00 | |
1 | 0 | 3 | 11 | 15 | 73.33 | |
Test data | 63 | 5 | 11 | 12 | 91 | |
% Correct | 88.89 | 40.00 | 54.55 | 91.67 | 82.42 | |
d) SVM- | Classified | % Correct | ||||
82 | 5 | 4 | 2 | 93 | 88.17 | |
2 | 9 | 1 | 0 | 12 | 75.00 | |
2 | 0 | 1 | 0 | 3 | 33.33 | |
2 | 0 | 0 | 5 | 7 | 71.43 | |
Test data | 88 | 14 | 6 | 7 | 115 | |
% Correct | 93.18 | 64.29 | 16.67 | 71.43 | 84.35 | |
b) SVM- | Classified | % Correct | ||||
101 | 0 | 2 | 0 | 103 | 98.06 | |
4 | 4 | 4 | 0 | 12 | 33.33 | |
0 | 0 | 0 | 0 | 0 | 0.00 | |
3 | 0 | 0 | 9 | 12 | 75.00 | |
Test data | 108 | 4 | 6 | 9 | 127 | |
% Correct | 93.52 | 100.00 | 0.00 | 100.00 | 89.76 |
d) SVM- | Classified | % Correct | ||||
90 | 0 | 0 | 0 | 90 | 100.00 | |
6 | 4 | 0 | 0 | 10 | 40.00 | |
4 | 0 | 0 | 0 | 4 | 0.00 | |
2 | 0 | 0 | 0 | 2 | 0.00 | |
Test data | 102 | 4 | 0 | 0 | 106 |
% Correct | 88.24 | 100.00 | 0.00 | 0.00 | 88.68 |
f) SVM- | Classified | % Correct | | | |
67 | 3 | 0 | 2 | 72 | 93.06 | |
9 | 7 | 3 | 0 | 19 | 36.84 | |
3 | 2 | 5 | 0 | 10 | 50.00 | |
3 | 2 | 0 | 20 | 25 | 80.00 | |
Test data | 82 | 14 | 8 | 22 | 126 | |
% Correct | 81.71 | 50.00 | 62.50 | 90.91 | 78.57 |
Time | 5–6 ms |