Inventions and Innovation in Smart Sensing Technologies for Agriculture

A special issue of Inventions (ISSN 2411-5134). This special issue belongs to the section "Inventions and Innovation in Design, Modeling and Computing Methods".

Deadline for manuscript submissions: closed (31 January 2025) | Viewed by 10517

Special Issue Editor


Dr. Xinqing Xiao
Guest Editor
College of Engineering, China Agricultural University, Beijing 100083, China
Interests: smart sensing; smart agriculture; Internet of Things; energy harvesting sensing; self-powered sensing; battery-free sensing; food monitoring

Special Issue Information

Dear Colleagues,

Agriculture plays a crucial role in ensuring global food security and sustainable development. Smart sensing technologies have emerged as powerful tools to enhance productivity, optimize resource utilization, and improve overall agricultural practices. By gathering cutting-edge research and technological advancements in smart sensing, this Special Issue, “Inventions and Innovation in Smart Sensing Technologies for Agriculture”, aims to contribute to the advancement of precision agriculture, sustainability, and productivity in the agricultural sector.

The Special Issue aims to cover a wide range of topics including, but not limited to, the following:

  • Design and development of novel smart sensors for monitoring soil conditions, crop health, irrigation, fertilization, and agri-food quality management.
  • Wireless sensor networks and Internet of Things (IoT) applications in precision agriculture for real-time data collection, analysis, and decision-making.
  • Remote sensing techniques and satellite imagery for crop monitoring, yield estimation, and land management.
  • Integration of advanced technologies such as robotics, drones, and artificial intelligence in smart sensing systems for automated farming operations.
  • Smart sensing technologies for monitoring and controlling environmental factors such as temperature, humidity, light, and air quality in greenhouses and controlled environments.
  • Use of smart sensing technologies for pest and disease detection, early warning systems, and plant protection strategies.
  • Innovative approaches for data analytics, modeling, and predictive algorithms to optimize agricultural practices and improve resource efficiency.
  • Field trials, case studies, and practical applications of smart sensing technologies in different agricultural sectors including crop production, livestock management, aquaculture, agroforestry, and the agri-food supply chain.

Authors are encouraged to submit their high-quality research and innovation contributions to this Special Issue, with a focus on the development and application of smart sensing technologies in agriculture.

Dr. Xinqing Xiao
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Inventions is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • smart sensing technologies
  • precision agriculture
  • wireless sensor networks
  • inventions and innovation

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies is available on the MDPI website.

Published Papers (5 papers)


Research

24 pages, 6291 KiB  
Article
Internet of Things Smart Beehive Network: Homogeneous Data, Modeling, and Forecasting the Honey Robbing Phenomenon
by Igor Kurdin and Aleksandra Kurdina
Inventions 2025, 10(2), 23; https://doi.org/10.3390/inventions10020023 - 3 Mar 2025
Viewed by 219
Abstract
The role of experimental data and the use of IoT-based monitoring systems are gaining broader significance in research on bees across several aspects: bees as global pollinators, as biosensors, and as examples of swarm intelligence. This increases the demands on monitoring systems to obtain homogeneous, continuous, and standardized experimental data, which can be used for machine learning, enabling models to be trained on new online data. However, the continuous operation of monitoring systems introduces new risks, particularly the cumulative impact of electromagnetic radiation on bees and their behavior. This highlights the need to balance IoT energy consumption, functionality, and continuous monitoring. We present a novel IoT-based bee monitoring system architecture that has been operating continuously for several years, using solar energy only. The negative impact of IoT electromagnetic fields is minimized, while ensuring homogeneous and continuous data collection. We obtained experimental data on the adverse phenomenon of honey robbing, which involves elements of swarm intelligence. We demonstrate how this phenomenon can be predicted and illustrate the interactions between bee colonies and the influence of solar radiation. The use of criteria for detecting honey robbing will help to reduce the spread of diseases and positively contribute to the sustainable development of precision beekeeping.
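
The abstract does not reproduce the authors' detection criteria, so the following is only an illustrative heuristic, not the paper's algorithm: honey robbing might be flagged as a sustained, unusually fast drop in hive weight. The function name, window size, and threshold are all assumptions.

```python
import statistics

def robbing_alert(weights, window=12, drop_threshold=0.5):
    """Flag a possible honey-robbing event from hive-scale readings.

    weights: hourly hive weights in kg (most recent last).
    window: number of recent readings to inspect.
    drop_threshold: kg lost over the window that triggers an alert.

    Robbing shows up as a rapid, sustained weight loss that is much
    faster than normal daily consumption.
    """
    if len(weights) < 2 * window:
        return False
    recent = weights[-window:]
    baseline = statistics.mean(weights[-2 * window:-window])
    loss = baseline - statistics.mean(recent)
    steadily_falling = all(a >= b for a, b in zip(recent, recent[1:]))
    return loss > drop_threshold and steadily_falling

# Example: a hive losing ~0.1 kg/h for half a day after a normal day.
readings = [40.0 - 0.02 * t for t in range(24)] + [39.5 - 0.1 * t for t in range(12)]
print(robbing_alert(readings))  # True
```
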
Figures:
  • Figure 1: Smart beehive.
  • Figure 2: Smart hive: (a) front view; (b) back view; (c) exploded view.
  • Figure 3: Smart beehive scale.
  • Figure 4: Smart hive sensors: (a) internal temperature and humidity sensor; (b) IoT block with external temperature and humidity sensor.
  • Figure 5: Smart apiary.
  • Figure 6: View of query to database server, biological information.
  • Figure 7: View of query to database server, technical information.
  • Figure 8: Periods of daily activity of honey bees: 1. night-time; 2. departure of foragers; 3. return of foragers.
  • Figure 9: Data visualization according to food availability: (a) abundant food, where the bee colony successfully accumulates reserves; (b) scarce food, where the colony consumes almost everything the foragers collect; (c) insufficient food, where foragers are unable to provide adequate food for the colony.
  • Figure 10: Weight change of beehives (6–27 September 2023).
  • Figure 11: Weight changes of hive 883 (6–24 September 2023).
  • Figure 12: Temperature inside and outside hive 883.
  • Figure 13: Temperature and solar panel voltage, hive 883.
  • Figure 14: Cumulative weight changes (18–24 September 2023).
16 pages, 4592 KiB  
Article
Enhancing Tractor Stability and Safety through Individual Actuators in Active Suspension
by Jinho Son, Yeongsu Kim, Seokho Kang and Yushin Ha
Inventions 2024, 9(2), 29; https://doi.org/10.3390/inventions9020029 - 6 Mar 2024
Viewed by 2119
Abstract
Tractor overturning accidents are a prominent safety concern in agriculture, and many studies have been conducted to prevent them. The rollover protective structures and seat belts currently installed on tractors cannot keep the tractor from overturning in the first place. In this study, the posture of a tractor was controlled by installing individual actuators at each wheel. The overturning angles of the tractor equipped with actuators were compared with those of a tractor without actuators. Overturning angles were evaluated in all directions by rotating the tractor in 15° increments from 0° to 345°, and the actuator height suited to each posture was controlled through an equation established from the tractor posture. Posture control using the actuators improved the overturning angles noticeably. This study proposes that tractors operating on irregular and sloping terrain be equipped with individual actuators, which can help prevent rollover accidents and improve safety and driving stability.
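
The paper derives the required actuator heights from a rotation of the wheel positions; the sketch below is a minimal geometric illustration of that idea. The wheel coordinates, function names, and the simple "level to the highest wheel" rule are assumptions, not the authors' control logic.

```python
import numpy as np

def actuator_heights(wheel_xy, roll_deg, pitch_deg, max_stroke=200.0):
    """Extension (mm) each wheel actuator needs to level the chassis.

    wheel_xy: wheel name -> (x, y) position in the chassis frame (mm).
    roll_deg, pitch_deg: measured chassis attitude in degrees.
    max_stroke: actuator stroke limit (mm); the paper also studies a
    200 mm limit condition.
    """
    r, p = np.radians(roll_deg), np.radians(pitch_deg)
    # Rotation about x (roll) followed by rotation about y (pitch).
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(r), -np.sin(r)],
                   [0, np.sin(r), np.cos(r)]])
    Ry = np.array([[np.cos(p), 0, np.sin(p)],
                   [0, 1, 0],
                   [-np.sin(p), 0, np.cos(p)]])
    R = Ry @ Rx
    # Height of each wheel contact point on the tilted chassis.
    z = {w: (R @ np.array([x, y, 0.0]))[2] for w, (x, y) in wheel_xy.items()}
    top = max(z.values())
    # Extend every actuator up to the highest wheel (clamped to the stroke).
    return {w: round(min(top - zi, max_stroke), 1) for w, zi in z.items()}

wheels = {"FL": (-600, 900), "FR": (600, 900), "RL": (-700, -900), "RR": (700, -900)}
print(actuator_heights(wheels, roll_deg=5.0, pitch_deg=3.0))
```
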
Figures:
  • Figure 1: 3D model of a tractor for CoG.
  • Figure 2: Schematic diagram of the rotation of the tractor: (a) rotated condition; (b) tilting condition.
  • Figure 3: Schematic view of the tractor for the mathematical model: (a) static condition; (b) tilting condition.
  • Figure 4: Proof of the rotational matrix equation.
  • Figure 5: Secondary plane coordinate system for the tractor wheel position.
  • Figure 6: Projected figure for each wheel actuator height.
  • Figure 7: Enlarged picture of the projected figure.
  • Figure 8: Schematic diagram of the hydraulic actuator.
  • Figure 9: Individual actuator control logic for maintaining the horizontal attitude of the tractor.
  • Figure 10: Each wheel's x, y coordinates in the secondary plane coordinate systems: (a) x′_RR, y′_RR; (b) x′_RL, y′_RL; (c) x′_FR, y′_FR; (d) x′_FL, y′_FL; (e) x′_cog, y′_cog.
  • Figure 11: Actuator height for each wheel entry angle: (a) non-limited condition; (b) 200 mm limit condition, where RR, RL, FR, and FL are the rear-right, rear-left, front-right, and front-left actuator heights.
  • Figure 12: Simulation results for the rollover angle of the tractor, where RL, RR, FL, and FR are the rear-left, rear-right, front-left, and front-right wheels.
33 pages, 4009 KiB  
Article
Enhancing Smart Agriculture Monitoring via Connectivity Management Scheme and Dynamic Clustering Strategy
by Fariborz Ahmadi, Omid Abedi and Sima Emadi
Inventions 2024, 9(1), 10; https://doi.org/10.3390/inventions9010010 - 5 Jan 2024
Viewed by 2070
Abstract
The evolution of agriculture towards a modern, intelligent system is crucial for achieving sustainable development and ensuring food security. In this context, leveraging the Internet of Things (IoT) stands as a pivotal strategy to enhance both crop quantity and quality while effectively managing natural resources such as water and fertilizer. Wireless sensor networks, the backbone of IoT-based smart agricultural infrastructure, gather ecosystem data and transmit them to sinks and drones. However, challenges persist, notably in network connectivity, energy consumption, and network lifetime, particularly when facing supernode and relay node failures. This paper introduces an innovative approach to address these challenges within heterogeneous wireless sensor network-based smart agriculture. The proposed solution comprises a novel connectivity management scheme and a dynamic clustering method realized by five distributed algorithms. The first and second algorithms focus on path collection, establishing connections between each node and m supernodes via k disjoint paths to ensure network robustness. The third and fourth algorithms sustain network connectivity during node and supernode failures by adjusting transmission powers and dynamically clustering agriculture sensors based on residual energy. The fifth algorithm solves a dominating set problem to strategically position a subset of relay nodes as migration points for mobile supernodes, balancing the network's energy depletion. The suggested solution demonstrates superior performance in connectivity, failure tolerance, load balancing, and network lifetime, ensuring optimal agricultural outcomes.
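
As a rough sketch of the residual-energy clustering idea, the greedy toy below lets the highest-energy unclaimed node absorb its unclaimed neighbors. The data structures and the election rule are assumptions for illustration, not the paper's algorithms.

```python
def elect_cluster_heads(nodes, neighbors, energy):
    """Greedy residual-energy clustering for a sensor field.

    nodes: iterable of node ids.
    neighbors: dict node -> set of nodes within radio range.
    energy: dict node -> residual energy (J).

    Cluster-head duty rotates toward nodes with the most energy left,
    which is the general idea behind dynamic clustering by residual
    energy, though not this paper's exact rule.
    """
    heads, assigned = {}, set()
    for n in sorted(nodes, key=lambda n: energy[n], reverse=True):
        if n in assigned:
            continue
        members = {n} | (neighbors[n] - assigned)
        heads[n] = members
        assigned |= members
    return heads

nodes = ["a", "b", "c", "d", "e"]
nbrs = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
        "d": {"c", "e"}, "e": {"d"}}
energy = {"a": 4.1, "b": 2.0, "c": 3.5, "d": 5.0, "e": 1.2}
print(elect_cluster_heads(nodes, nbrs, energy))
# e.g. {'d': {'c', 'd', 'e'}, 'a': {'a', 'b'}}
```
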
Figures:
  • Figure 1: Smart agriculture based on a heterogeneous wireless sensor network.
  • Figure 2: A flowchart of the proposed method.
  • Figure 3: Redundant control message: (a) initial topology; (b) disjoint paths of nodes D and E; (c) disjoint paths of nodes D and E after the pathinfo message has been sent by node B; (d) disjoint paths of nodes D and E after the pathinfo message has been sent by node D.
  • Figure 4: Percentage of failed nodes when supernode disconnectivity occurs: (a) k = 2, Sr = 3%; (b) k = 2, Sr = 5%; (c) k = 3, Sr = 3%; (d) k = 3, Sr = 5%; (e) k = 4, Sr = 3%; (f) k = 5, Sr = 5%.
  • Figure 5: Percentage of failed nodes when k-vertex supernode disconnectivity occurs (panel settings as in Figure 4).
  • Figure 6: Comparison of supernode connectivity lifetime in DPV, ADPV, and KDPMS (panel settings as in Figure 4).
  • Figure 7: Comparison of k-vertex supernode connectivity lifetime in DPV, ADPV, and KDPMS (panel settings as in Figure 4).
  • Figure 8: Comparison of supernode failure tolerance in DPV, ADPV, and KDPMS: (a) N = 300; (b) N = 350; (c) N = 400; (d) N = 450; (e) N = 500; (f) N = 550.
  • Figure 9: Number of restored connections in ADPV and KDPMS: (a) k = 2; (b) k = 3; (c) k = 4.
  • Figure 10: Number of message transmissions in the KDPMS and MADPV algorithms.
  • Figure 11: Lifetime comparison of the KDPMS and MADPV algorithms.
24 pages, 5149 KiB  
Article
Early-Stage Identification of Powdery Mildew Levels for Cucurbit Plants in Open-Field Conditions Based on Texture Descriptors
by Claudia Angélica Rivera-Romero, Elvia Ruth Palacios-Hernández, Osbaldo Vite-Chávez and Iván Alfonso Reyes-Portillo
Inventions 2024, 9(1), 8; https://doi.org/10.3390/inventions9010008 - 3 Jan 2024
Cited by 2 | Viewed by 2631
Abstract
Constant monitoring is necessary for powdery mildew prevention in field crops because, as a fungal disease, it modifies the green pigments of the leaves and is responsible for production losses. There is therefore a need for solutions that assure early disease detection, enabling proactive control and management of the disease. The methodology currently used for the identification of powdery mildew disease uses RGB leaf images to detect damage levels. In the early stage of the disease, no symptoms are visible, yet this is the point at which the disease can still be controlled before symptoms appear. This study proposes the implementation of a support vector machine to identify powdery mildew on cucurbit plants using RGB images and color transformations. First, we use an image dataset covering five growing seasons in different locations and under natural light conditions. Twenty-two texture descriptors computed from the gray-level co-occurrence matrix (GLCM) are used as the main features. The proposed damage levels are 'healthy leaves', 'leaves in the fungal germination phase', 'leaves with first symptoms', and 'diseased leaves'. The implementation reveals that the accuracy in the L*a*b* color space is higher than that obtained using the combined components, with an accuracy of 94% and a Cohen's kappa of 0.7638.
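
A GLCM-plus-SVM pipeline of this kind is straightforward to prototype. The sketch below shows its general shape with a reduced descriptor set and synthetic ROIs standing in for the paper's labeled leaf images; the function names and data are assumptions, not the authors' code.

```python
# pip install scikit-image scikit-learn  (assumed environment)
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # spelled greyco* in older versions
from sklearn.svm import SVC

def glcm_features(gray_img, distances=(1,), angles=(0,)):
    """A few GLCM texture descriptors for an 8-bit grayscale leaf ROI.

    The paper uses 22 descriptors over several color components; this
    sketch computes five common ones as an illustration.
    """
    glcm = graycomatrix(gray_img, distances=distances, angles=angles,
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "dissimilarity", "homogeneity", "energy", "correlation"]
    return np.array([graycoprops(glcm, p).mean() for p in props])

# Hypothetical training setup: X rows are per-ROI feature vectors,
# y holds damage levels T1..T4 (here random images, purely illustrative).
rng = np.random.default_rng(0)
rois = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(8)]
X = np.stack([glcm_features(r) for r in rois])
y = np.array([1, 1, 2, 2, 3, 3, 4, 4])
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:2]))
```
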
Figures:
  • Figure 1: Proposed methodology for PM damage-level detection: image collection feeds feature extraction and selection, multiclassification operates on the classification results, and a performance evaluation verifies the optimal classification.
  • Figure 2: Timeline of the sampling days and phenological growth stages used to identify PM damage levels. The phenological stages (S1 to S8) and sampling days (D1 to D19) provide the basic information, from which four PM damage levels are defined: T1 (healthy leaves), T2 (leaves with spores in germination), T3 (leaves with first symptoms), and T4 (diseased leaves).
  • Figure 3: Visual evaluation of cucurbit leaves at the four PM damage levels: (a) T1, healthy leaves; (b) T2, leaves with spores in germination; (c) T3, leaves with first symptoms; (d) T4, diseased leaves.
  • Figure 4: Exploration of the leaf by parts for selection of the region of interest (ROI): (a) division of the leaf into central part (R1), lower right lobe (R2), upper right lobe (R3), upper central lobe (R4), upper left lobe (R5), and lower left lobe (R6); (b) first symptoms at R4.
  • Figure 5: Preprocessing of the ROI images, starting with color transformation and separation of color components: the original image I(x, y) is cropped to an RGB ROI sample R(s, t), contrast-adjusted to C(p, q), and transformed into T(s, t) in the different color spaces (G(i, j), L(i, j), H(i, j), Y(i, j)), which are then separated into color components.
  • Figure 6: Calculation of the GLCM in a gray image with distance d = 1 and angle θ = 0: (a) gray image; (b) gray levels I(x, y); (c) GLCM with the paired pixels g(i, j).
  • Figure 7: Processed images I(x, y) and U(s, t); color transformation H(s, t); components H1(s, t), H2(s, t), and H3(s, t); and their GLCM matrices G1(i, j), G2(i, j), and G3(i, j) with 255 gray levels.
  • Figure 8: Process of feature extraction through the color-component images.
  • Figure 9: Feature selection process, consisting of a Lilliefors test, an analysis of variance, and Tukey's test.
  • Figure 10: Results of the ANOVA and Tukey's test: (a) mean values of the damage levels for diss-BB; (b) Tukey's test, where the damage-level means are significantly different; (c) mean values of the damage levels for auto-A; (d) Tukey's test, where the means of T2, T3, and T4 are equal but significantly different from T1.
  • Figure 11: Kernel selection in the multiclassification system with feature vectors in different color spaces, showing for each case the 2D feature pair, the 3D optimal hyperplane, and the training/validation error: linear kernel (diss_BB vs. cont_V, SVM T1 vs. T2), polynomial kernel (auto_V vs. savg_G, SVM T3 vs. T4), sigmoidal kernel (ener_GG vs. dvar_A, SVM T2 vs. T3), and RBF kernel (diss_Y vs. inf1_BB, SVM T2 vs. T4).
  • Figure 12: Kernel selection with components of the same color space, again showing 2D feature pairs, 3D optimal hyperplanes, and training/validation errors: RBF kernel (auto_V vs. dent_S, SVM T3 vs. T4), linear kernel (idmn_G vs. diss_G, SVM T1 vs. T2), polynomial kernel (dvar_CR vs. homo_Y, SVM T2 vs. T4), and RBF kernel (ener_V vs. entr_S, SVM T1 vs. T2).
  • Figure 13: One-versus-one multiclassification method. The main inputs are the support vectors s1, ..., s6, the validation data for each binary classifier M1, ..., M6, and σ; each block V1, ..., V4 contains the support vector machines for multiple classification.
  • Figure 14: SVM binary classifiers: (a–e) test data F1 to F5 with the corresponding SVM-classified data.
  • Figure 15: SVM binary classifiers with components of the same color space: (a–e) test data G1 to G5 with the corresponding SVM-classified data.
24 pages, 15079 KiB  
Article
Sensing Spontaneous Combustion in Agricultural Storage Using IoT and ML
by Umar Farooq Shafi, Imran Sarwar Bajwa, Waheed Anwar, Hina Sattar, Shabana Ramzan and Aqsa Mahmood
Inventions 2023, 8(5), 122; https://doi.org/10.3390/inventions8050122 - 26 Sep 2023
Cited by 4 | Viewed by 2535
Abstract
The combustion of agricultural storage represents a major hazard to the safety and quality preservation of crops during lengthy storage times. Stored cotton is considered especially prone to combustion for several reasons, such as heat from microbial growth, exothermic and endothermic reactions in the stored material, and extreme weather conditions in storage areas. Combustion not only increases the chances of a large fire outbreak in the long run, but may also cause cotton quality attributes, such as color, staple length, and seed quality, to deviate from their normal ranges. Combustion is difficult to detect, monitor, and control. The Internet of Things (IoT) offers efficient and reliable solutions for numerous research problems in agriculture, healthcare, business analytics, and industrial manufacturing; in the agricultural domain, it provides applications for crop monitoring, warehouse protection, the prevention of crop diseases, and crop yield maximization. In the current research, we investigate spontaneous combustion inside storage, identify its primary causes, and propose an efficient IoT and machine learning (ML)-based solution for the early sensing of combustion so that cotton quality can be maintained during long storage times. The proposed system provides real-time sensing of combustion-causing factors through an IoT-based circuit and predicts combustion using an efficient artificial neural network (ANN) model, verified through a series of experiments. The ANN model showed a 99.8% accuracy rate with 95–98% correctness and 97–99% completeness. The proposed solution is highly effective in detecting combustion and enables storage owners to become aware of combustion hazards in a timely manner, so they can improve storage conditions to preserve cotton quality in the long run.
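
As an illustrative sketch of such a sensing-to-prediction pipeline: the three-feature input (temperature, moisture, methane) follows the paper's figures, but the synthetic data, labeling rule, and small network below are assumptions, not the authors' trained model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 500
temp = rng.uniform(15, 60, n)      # °C inside the cotton stack
moisture = rng.uniform(5, 40, n)   # % moisture from a soil-moisture-type sensor
methane = rng.uniform(0, 300, n)   # ppm from an MQ-9-class gas sensor

# Synthetic label: combustion risk when heat, damp, and gas co-occur.
y = ((temp > 45) & (moisture > 20) & (methane > 150)).astype(int)
X = np.column_stack([temp, moisture, methane])

scaler = StandardScaler().fit(X)
ann = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
ann.fit(scaler.transform(X), y)

# A new sensor reading is scaled the same way before prediction.
reading = scaler.transform([[52.0, 28.0, 210.0]])
print("combustion risk" if ann.predict(reading)[0] else "normal")
```
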
Figures:
  • Figure 1: Design architecture of the deep learning combustion predictor.
  • Figure 2: The components of the IoT circuit: (a) soil moisture sensor; (b) MQ-9 gas sensor; (c) DHT-11 temperature and humidity sensor; (d) circuit diagram.
  • Figure 3: User app.
  • Figure 4: System flowchart.
  • Figure 5: Implemented ANN model.
  • Figure 6: ANN working algorithm steps.
  • Figure 7: Training and validation accuracy and loss graph.
  • Figure 8: Compiled ANN.
  • Figure 9: Prediction graph.
  • Figure 10: ANN prediction precision and recall.
  • Figure 11: ANN prediction results.
  • Figure 12: Confusion matrix.
  • Figure 13: Dataset plotting.
  • Figure 14: Plotting dataset factors: (a) temperature and combustion; (b) moisture and combustion; (c) methane and combustion; (d) heatmap.
  • Figure 15: Features' effects on the negative class using deepSHAP.
  • Figure 16: Features' role in predicting the positive class using deepSHAP.