Electronics 2021, 10, 1400
Article
Design and Implementation of a Hydroponic Strawberry
Monitoring and Harvesting Timing Information Supporting
System Based on Nano AI-Cloud and IoT-Edge
Sun Park * and JongWon Kim
Artificial Intelligence Graduate School, Gwangju Institute of Science and Technology, 123 Cheomdangwagi-ro,
Buk-gu, Gwangju 61005, Korea; jongwon@gist.ac.kr
* Correspondence: sunpark@gist.ac.kr; Tel.: +82-62-715-6380
Abstract: The strawberry market in South Korea is the largest market among horticultural crops. Strawberry cultivation in South Korea changed from field cultivation to facility cultivation in order to increase production. However, the decrease in production manpower due to aging is increasing the demand for the automation of strawberry cultivation. Predicting the harvest of strawberries is an important research topic, as strawberry production requires the most manpower for harvest. In addition, the growing environment has a great influence on strawberry production, as hydroponic cultivation of strawberries is increasing. In this paper, we design and implement an integrated system that monitors strawberry hydroponic environmental data and determines when to harvest with the concept of IoT-Edge-AI-Cloud. The proposed monitoring system collects, stores and visualizes strawberry growing environment data. The proposed harvest decision system classifies the strawberry maturity level in images using a deep learning algorithm. The monitoring and analysis results are visualized in an integrated interface, which provides a variety of basic data for strawberry cultivation. Even if the strawberry cultivation area increases, the proposed system can be easily and flexibly expanded based on a virtualized container with the concept of IoT-Edge-AI-Cloud. The monitoring system was verified by monitoring a hydroponic strawberry environment for 4 months. In addition, the harvest decision system was verified using strawberry pictures acquired from Smart Berry Farm.

Keywords: strawberry; cultivation environment monitoring; harvest decision; IoT edge; AI cloud; deep learning; hydroponic

Citation: Park, S.; Kim, J. Design and Implementation of a Hydroponic Strawberry Monitoring and Harvesting Timing Information Supporting System Based on Nano AI-Cloud and IoT-Edge. Electronics 2021, 10, 1400. https://doi.org/10.3390/electronics10121400

Academic Editors: Lei Shu, Simon Pearson, Ye Liu, Mohamed Amine Ferrag and Leandros Maglaras
help farmers to control ripening progress. In general, the detection of all types of dis-
eases and evaluation of the ripening stage of strawberries are carried out through manual
inspection and evaluation according to the personal experience of the farmer. Manual
identification of mature strawberries for harvesting is time- and labor-intensive. At the
same time, strawberry production has to overcome unfavorable agricultural conditions,
such as water shortages, changes in the growing environment and climate change. Thus,
improving various strawberry farming practices with innovative technologies can enhance
the strategic advantage of agricultural production.
In order to overcome such obstacles to strawberry farming, this paper designs and
develops a system capable of monitoring the strawberry cultivation environment in real
time and supporting decision makers with enhanced information about harvesting timing.
The proposed system is designed to collect, store and analyze strawberry environmental
data and photos with the concept of IoT-Edge-AI-Cloud [3,4]. The proposed monitoring
system collects 13 types of related cultivation environment data using the IoT-Edge module.
The collected environmental data are stored and visualized in a nano-sized private cloud-
based database server and visualization server, respectively. The proposed harvest decision
system classifies strawberry objects according to their maturity level using a classification
model based on a deep learning YOLO (You Only Look Once) algorithm in a nano-sized
private AI-Cloud-based analysis server. The IoT-Edge device of the proposed system
can be implemented cost-effectively based on Arduino and Raspberry Pi. The proposed
system can also be easily and flexibly expanded via container virtualization based on AI-Cloud, even if the strawberry farm area to which it is applied increases. In other
words, the number of container servers based on virtualization can easily be increased
whenever necessary. The system can efficiently manage strawberry cultivation and harvest
by integrating and visualizing strawberry cultivation environment monitoring data and
strawberry ripeness classification. The monitoring and analysis results are visualized in
an integrated interface, which provides a variety of basic data (e.g., production volume,
harvest time and pest diagnosis) for strawberry cultivation. The proposed monitoring
system can easily and stably construct big data of strawberry cultivation environments.
In other words, it can monitor the 13 types of collected environmental data in real time to
understand the growth environment. Additionally, the optimal growth environment can
be analyzed by utilizing big data that have been accumulated from the growth cycle. The
remainder of this paper is organized as follows: Section 2 describes the related studies on
methods of growing environment monitoring and maturity classification; Section 3 explains
the hydroponic strawberry monitoring and harvest decision system; Section 4 describes the
operational use cases of the hydroponic strawberry monitoring and the tests of the harvest
decision system. Finally, Section 5 discusses the conclusions and future direction.
2. Related Works
This section reviews the related studies on fruit and vegetable cultivation monitoring
and maturity classification. Bharti et al. [5] proposed a hydroponic tomato monitoring
system that uses a microprocessor to transmit temperature and plant size data to the
cloud using the message queuing telemetry transport (MQTT). It can also check the saved
data via an Android application. Joshitha et al. [6] used Raspberry Pi and sensors to
store data on temperature, humidity, water level, soil level, etc., in the Ubidots cloud
database from a hydroponic cultivation system. Herman and Surantha [7] proposed an
intelligent monitoring and control system for hydroponic precision farming. The system
was used to monitor the water and nutrition needs of plants, while fuzzy logic was designed
to precisely control the supply of water and nutrients. Fakhrurroja et al. [8] proposed
an automatic pH (potential of hydrogen) and humidity control system for hydroponic
cultivation using a pH sensor, humidity sensor, Arduino, Raspberry Pi, and fuzzy logic.
As a result of the fuzzy model, the pH of the water is controlled using a nutrient pump
and a weak acid pump. Verma et al. [9] proposed a framework to predict the absolute
crop growth rate using a machine learning method for the tomato crop in a hydroponic
system. Their method helps to understand the impact of important variables in the correct
nutrient supply. Pawar et al. [10] designed an IoT-enabled Automated Hydroponics
system using NodeMCU and Blynk. Their method consists of a monitoring stage and
automation. In the monitoring stage, temperature, humidity and pH are monitored.
During automation, the levels of pH, water, temperature and humidity are adjusted.
Issarny et al. [11] introduced the LATTICE framework for the optimization of IoT system
configuration at the edge, given an ontological description of the target IoT system.
The framework showed an example applied to a hydroponic room of vegetables for
monitoring and controlling several physical variables (e.g., temperature, humidity, CO2 ,
air flow, lighting and fertilizer concentration, balance and pH). Samijayani et al. [12]
implemented wireless sensor networks with Zigbee and Wi-Fi for hydroponics plants. The
networks are used by the Zigbee-based transceiver in the sensor node and the Wi-Fi-based
gateway in the coordinator node. Adidrana and Surantha [13] proposed a monitoring
system to measure pH, TDS (total dissolved solids) and nutrient temperature values in the
nutrient film technique using a couple of sensors. The system used lettuce as the object of
experiments and applied the k-nearest neighbor algorithm to predict the classification of
nutrient conditions.
Ge et al. [14] presented a machine vision system in a strawberry-harvesting robot
for the localization of strawberries and environment perception in tabletop strawberry
production. The system utilized a deep learning network for instance segmentation to
detect the target strawberries. An environment perception algorithm was proposed to
identify a safe manipulation region and the strawberries within this region. A safe region
classification method was proposed to identify the pickable strawberries. Yu et al. [15]
proposed a harvesting robot for ridge-planted strawberries and a fruit pose estimator.
The proposed harvesting robot was designed on the servo control system of a strawberry-
harvesting robot suitable for the narrow ridge-planting mode. The fruit pose estimator,
based on the rotated YOLO (R-YOLO), was suitable for strawberry fruit in the narrow
spaces of the ridge-planting mode. Feng et al. [16] designed a harvesting robot for tabletop-
cultivated strawberry. The robot system consists of an information acquisition part, a
harvesting execution part, a controller and other auxiliaries. The information acquisition
part includes a distant- and close-view camera, an artificial light source and obstacle
detection sensors. The distant-view camera is used to dynamically identify and locate the
mature fruit in the robot’s view field. The close-range camera is used to obtain close-view
images of the fruit. The artificial light source can compensate for the variable sunlight
conditions under agricultural environments. Huang et al. [17] proposed a fuzzy Mask
R-CNN (regions with convolutional neural network) model to automatically identify the
ripeness levels of cherry tomatoes. The proposed method used a fuzzy c-means model to
maintain the spatial information of various foreground and background elements of the
image. It also used Mask R-CNN to precisely identify each tomato. The method used a
hue saturation value color model and fuzzy inference rules to predict the ripeness of the
tomatoes. Altaheri et al. [18] proposed a machine vision framework for date fruit harvesting,
which uses three classification models to classify date fruit images in real time according
to their type, maturity and harvesting decision. Zhang et al. [19] proposed a CNN-based
classification method for a tomato-harvesting robot to improve the accuracy and scalability
of tomato ripeness with a small amount of training data. The authors of [20] proposed
a scheme using machine-vision-based techniques for automated grading of mangoes
according to their maturity level in terms of actual days to rot and quality attributes such
as size and shape. Saputro et al. [21] introduced a banana maturity prediction system using
visible near-infrared imaging based on the chlorophyll characteristic to estimate maturity
and chlorophyll content non-destructively. Kuang et al. [22] proposed a kiwifruit classifier
using a multivariate alternating decision tree and deep learning.
The proposed system collects 13 types of strawberry growth environment data,
whereas the environmental information monitored in the previous related works [5–13]
consisted of two to eight types. By collecting more environmental information compared
to the related works, it is possible to access more diverse methods when analyzing the growing environment. As the related works [5–13] involve simply collecting environmental data or simply storing the collected data in the cloud, the addition of functions to the related works is limited. By contrast, the proposed method is designed to facilitate function expansion and analysis, as it consists of an IoT-Edge module and a nano-sized private AI-Cloud module. The proposed method determines the strawberry harvest time by using a deep-learning-based method similar to related studies [14,15,17,19]. However, the difference from the related studies is that it is designed based on a virtualized container to increase the scalability of the function.
3. Hydroponic Strawberry Monitoring and Harvest Decision System
The following sections describe the components of the device hardware and module architecture for the hydroponic strawberry monitoring and harvest decision system.
3.1. System Overview and Device Components
A hydroponic strawberry monitoring and harvest decision system prototype was designed for collecting growth environment data, analyzing optimal growing environments and identifying mature strawberries. The proposed system consists of a hydroponic strawberry monitoring IoT-Edge device and a GPU workstation device, as shown in Figure 1.
Figure 1. The hydroponic strawberry monitoring and harvest decision system.
3.1.1. Hydroponics System
In this paper, a home hydroponic cultivation system was used to monitor strawberry growth information and verify harvest decision information, as shown in Figure 1(6). The hydroponic cultivation system consists of a hydroponic shelf, a water tank and a water pump. The hydroponic shelf consists of a total of three floors with two pipes per floor and 32 plants. Seolhyang strawberry [23] was grown in the hydroponic cultivation system.
3.1.2. Hydroponic Strawberry Monitoring IoT-Edge Device
The IoT-Edge device collects growth environment data for optimal cultivation environment analysis when growing strawberries hydroponically. In addition, strawberry images are taken for identification of mature strawberries and calculation of the normalized difference vegetation index (NDVI). The IoT-Edge device is composed of Raspberry Pi, Arduino, sensors, Raspberry Pi cameras and a power supply, as shown in Figure 1b. Table 1 shows the component hardware specifications of the IoT-Edge device. In this study, environmental data and strawberry photos were collected in a fixed environment for verification of the proposed system. For this reason, one strawberry plant was selected and the distance between the camera and the strawberry plant was fixed. The Raspberry Pi Camera v2.1 module has a fixed-focus lens, so we manually turned the focus ring to adjust the focus. The camera module in Figure 1(4) is connected to the Arducam multi-camera module that is connected to the Raspberry Pi module in Figure 1(1).
Raspberry Pi (Figure 1(1))
• Raspberry Pi 3B+
  - CPU: ARM Cortex-A53 1.4 GHz
  - RAM: 1 GB SDRAM
  - Wi-Fi: 2.4 GHz and 5 GHz
  - Ethernet: 300 Mbps
• microSD 256 GB
• Arducam Multi Camera Adapter Module V2.1
  - Works with 5 MP or 8 MP cameras
  - Accommodates 4 Raspberry Pi cameras on a single RPi board
  - 3 GPIOs required for multiplexing
  - Cameras work sequentially, not simultaneously
Figure 2. The system module architecture.
The Raspberry Pi module does not support ADCs (analog-to-digital converters). Additionally, the number of sensors that can be attached is limited because some of its GPIO pins are reserved for special purposes. The Arduino module has 16 analog input pins and 54 digital input/output pins, so sensors can easily be added. The
Arducam Multi Camera Adapter Module uses a single CSI camera port on the Raspberry
Pi module to connect two cameras. It also uses three GPIOs on the Raspberry Pi module
for multiplexing support. The Sensor Data Collector of Figure 2(1) collects 13 types of
environmental data from eight sensors. Table 3 shows the sensors related to the data being
collected. The Data Collector is programmed in C language using Sketch, an Arduino
integrated development environment (IDE), to send the 13 types of sensing data to the
IoT-Edge device every 0.5 s. The monitoring function of the IoT-Edge module is verified by
collecting and transmitting data that are repeated every 0.5 s.
The Sensors Data Sender in Figure 2(4) stores the data collected through USB serial
communication in the database of the GPU workstation. The Data Sender is programmed
to receive sensing data from /dev/ttyUSB0 of the Raspberry Pi serial communication
port using the Python language and store them in the sensors table of the MariaDB database (i.e.,
Figure 2(7) Sensors Data DB LXD Container) of the workstation. The Multi Camera Image
Collector in Figure 2(2) captures and stores strawberry images from IR cut (infrared cut-
off filter) and non-IR cut cameras. The Python example program code of the Arducam
Multi Camera Adapter (i.e., Multi_Camera_Adapter_V2.2_python [24]) was modified to
capture strawberry images every 2 h with IR cut and non-IR cut cameras and store them
in the IoT-Edge device. The Multi Camera Image Sender in Figure 2(3) saves the images
stored on the IoT-Edge device to the workstation’s image storage. The Image Sender
was programmed in Bash shell script using the Linux command line utility SCP (Secure
Copy) so that strawberry images stored on the Raspberry Pi module can be saved to the
workstation’s image storage every day.
Figure 3. Nano-sized private AI-Cloud for the functions of analysis station module.
Label 0: Immature (Harvest after 1 Month), 1005 images
Label 1: 30% Mature (Harvest after 3 Weeks), 1012 images
Label 2: 50% Mature (Harvest after 2 Weeks), 1017 images
Label 3: 60% Mature (Harvest after 1 Week), 1043 images
Label 4: 80% Mature (Harvest after 3 Days), 1038 images
Label 5: Mature (Harvest after 1 Day), 1042 images
Sum: 6156 images
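The label scheme above ties each maturity class to an expected harvest window. A small helper along these lines (the mapping follows the table; the function names and the day counts for "weeks"/"month" are illustrative, not code from the paper) turns a predicted label into harvest-timing information:

```python
# Maturity labels from the table above mapped to approximate
# days-until-harvest (1 month ~ 30 days, 3 weeks = 21, etc.).
HARVEST_DAYS = {
    0: 30,  # Immature    -> harvest after ~1 month
    1: 21,  # 30% Mature  -> harvest after 3 weeks
    2: 14,  # 50% Mature  -> harvest after 2 weeks
    3: 7,   # 60% Mature  -> harvest after 1 week
    4: 3,   # 80% Mature  -> harvest after 3 days
    5: 1,   # Mature      -> harvest after 1 day
}

def days_until_harvest(label: int) -> int:
    """Expected days until harvest for one classified strawberry object."""
    if label not in HARVEST_DAYS:
        raise ValueError(f"unknown maturity label: {label}")
    return HARVEST_DAYS[label]

def ready_within(labels, days):
    """Labels of detected objects that are harvestable within `days` days."""
    return [lb for lb in labels if days_until_harvest(lb) <= days]
```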
[Database record excerpt: rows 93263–93282 of the sensors table, each holding a timestamp (2020-08-26, 13:55–14:17) and the 13 collected sensor values; unrecorded values are stored as "NR".]
4.3. Analysis Server for Strawberry Harvest Determination and NDVI Calculation

In this subsection, the strawberry classification function and NDVI calculation tests are described. The classification accuracy was used to test the strawberry classification function. The accuracy rate was calculated by comparing the category classified by the strawberry classification model and the category classified by the expert. The classification model was created using the strawberry training data in Table 5 and the YOLO algorithm. The training of the classification model (i.e., YOLO V3) was repeated 50,020 times in 12 h and 5 min. Four NVIDIA GTX 1080 Ti graphics cards were used for training. Figure 6 shows an average loss rate of the training data of 0.0328 for 50,019 iterations in the YOLO V3 model.

Figure 6. Average loss rate of training data for the strawberry classification model.
The training data in Table 4 were labeled to be optimized for the YOLO algorithm. For this reason, comparison with other methods is meaningless; therefore, in this paper, the YOLO V2, YOLO V3, TinyYOLO V2 and TinyYOLO V3 models are compared. YOLO V1 was excluded from the evaluation due to its low accuracy. YOLO models were created by transfer learning with the training data in Table 5 on the basic models of YOLO V2, YOLO V3, TinyYOLO V2 and TinyYOLO V3. The 4327 evaluation objects in 1575 strawberry photos consist of photos taken directly from strawberry farms and photos retrieved from a Google Images search. Figure 7 shows the comparison results of the average accuracy rates of the YOLO models for the 4327 evaluation objects. In Figure 7, the average accuracy of YOLO V3 is approximately 3.667% higher than that of YOLO V2, 9.477% higher than that of TinyYOLO V3 and 16.247% higher than that of TinyYOLO V2. Due to the well-labeled training data and well-generated models, the accuracy rates of YOLO V3 and YOLO V2 are considered to be high. As the weight of the TinyYOLO model is smaller than that of YOLO, the training data are not well reflected in the generated model. As a result of analyzing 75 misclassified strawberry objects with YOLO V3, it was found that the objects overlap one another or the pictures are out of focus.

Figure 8 shows the process of calculating the NDVI value from a strawberry picture. As shown in Figure 8a, after selecting a strawberry object from a strawberry photo using a strawberry classification model, the coordinates of the object are extracted. As shown in Figure 8b, the NDVI values are calculated by using the coordinates of each object and Equation (1). Figure 8(1,2) were classified as label 1, which is 30% mature, by the strawberry classification model. Figure 8(3) was classified as label 4, which is 80% mature, by the strawberry classification model. In general, the NDVI value approaches −1 as the strawberry matures, and approaches 1 the more immature the strawberry is. The NDVI values of the strawberry objects in Figure 8(1,2) are very different. It is analyzed that the NDVI values differ because the intensity of the light source of the strawberry photos is different. In order to calculate an accurate NDVI value, an environment with a light source of constant intensity is required, such as a smart farm factory using LEDs.
Figure 7. Result of comparison of the YOLO models for 4327 evaluation objects.
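The accuracy comparison reported above reduces to matching the model's category for each evaluation object against the expert's category. A minimal sketch of that computation (illustrative naming; the authors' evaluation script is not published here):

```python
# Accuracy rate as used in the text: percentage of evaluation objects
# whose model-assigned category matches the expert-assigned category.
def accuracy_rate(predicted, expert):
    """Return the agreement between two equal-length label lists, in percent."""
    if len(predicted) != len(expert) or not predicted:
        raise ValueError("need two equal-length, non-empty label lists")
    hits = sum(p == e for p, e in zip(predicted, expert))
    return 100.0 * hits / len(predicted)
```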
Figure 8. The process of calculating the NDVI value of the strawberry object.
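The per-object NDVI computation can be sketched as follows, assuming Equation (1) is the standard NDVI definition, NDVI = (NIR − RED)/(NIR + RED). How the NIR and RED bands are extracted from the non-IR-cut and IR-cut photos, and the band/box names, are assumptions of this sketch:

```python
import numpy as np

def ndvi_of_box(nir_band, red_band, box):
    """Mean NDVI over a detected strawberry bounding box.

    NDVI = (NIR - RED) / (NIR + RED). `nir_band` and `red_band` are 2-D
    arrays of the same shape; `box` is (x1, y1, x2, y2) in pixel
    coordinates, as extracted by the classifier.
    """
    x1, y1, x2, y2 = box
    nir = np.asarray(nir_band, dtype=float)[y1:y2, x1:x2]
    red = np.asarray(red_band, dtype=float)[y1:y2, x1:x2]
    denom = nir + red
    safe = np.where(denom > 0, denom, 1.0)       # avoid division by zero
    ndvi = np.where(denom > 0, (nir - red) / safe, 0.0)
    return float(ndvi.mean())
```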
4.4. Visualization Server for Strawberry Monitoring and Harvest Decision
On the visualization server, the monitoring and analysis results can be visualized with an integrated interface to use various basic data for growing strawberries. Figure 9 shows the visualization of the monitored strawberry environment data and the strawberry classification results with relation to harvest determination. Figure 9a shows the result of classifying strawberry photos by using the strawberry classification function of the analysis server and the IoT-Edge's IR cut camera. There are three strawberry objects in the photo. One object was classified as 30% mature, but the other two objects were not classified because they overlapped each other. Figure 9b shows the visualization of environmental data, such as humidity, temperature, water temperature, light, ultraviolet, CO2, altitude, pressure, pH and dissolved oxygen, stored in the database server. Visualization of the NDVI values was excluded from the visualization server because the intensity of light constantly changed in the natural light environment.
Figure 9. Visualization of the strawberry environment monitoring and the harvest decision.
5. Conclusions
In this study, we designed and implemented a system that monitors the strawberry
hydroponic cultivation environment and determines the harvest time of strawberries. The
proposed system uses an IoT-Edge module to collect strawberry hydroponic environment
data and strawberry photos. The collected environmental data and strawberry photos are transferred to a nano-sized private AI-Cloud-based analysis station module, where they are visualized and used to determine the harvest time. The monitoring and analysis results, visualized in an integrated interface, provide a variety of basic data for strawberry cultivation, such as varying yields, harvest times and pest diagnosis. The proposed system was designed with the concept of an AI-Cloud, so server containers can be flexibly and easily added when the system needs to be expanded. While growing Seolhyang strawberries in a home hydroponic cultivation system, the proposed monitoring system was tested by monitoring 1,316,848 environmental data records covering 13 data types over a period of 4 months. The proposed harvest decision system predicted the harvest time using 1575 strawberry pictures acquired from the Smart Berry Farm and a Google Images search, and it showed a high accuracy rate of 98.267%. As future research, we plan to study methods for analyzing the monitored strawberry growing environment data. In addition, we plan to study how the analysis results affect strawberry maturity.
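The harvest decision step maps per-object maturity classifications onto a pick/no-pick outcome. A minimal sketch of that mapping is shown below; the detector output format and the 80% harvest threshold are illustrative assumptions, not the paper's exact values:

```python
# Sketch of a harvest decision over per-object maturity classes, such as
# those produced by the deep learning classifier. The tuple format and the
# 80% threshold are illustrative assumptions, not the paper's exact design.

def harvest_ready(detections, threshold=80):
    """Return IDs of strawberries whose classified maturity meets the threshold.

    detections: list of (object_id, maturity_percent) pairs; objects the
    classifier could not resolve (e.g., overlapping fruit) carry None.
    """
    return [obj_id for obj_id, maturity in detections
            if maturity is not None and maturity >= threshold]

# Example mirroring Figure 9a: one object classified at 30% maturity,
# two overlapping objects left unclassified.
dets = [("berry-1", 30), ("berry-2", None), ("berry-3", None)]
print(harvest_ready(dets))                                 # no fruit ready
print(harvest_ready([("berry-1", 85), ("berry-2", 60)]))   # one fruit ready
```

Keeping unresolved detections as None rather than dropping them lets the interface flag fruit that needs a manual check instead of silently ignoring it.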
Author Contributions: Conceptualization, S.P. and J.K.; methodology, S.P.; software, S.P.; validation,
S.P. and J.K.; formal analysis, S.P.; investigation, S.P.; resources, S.P.; data curation, S.P.; writing-
original draft preparation, S.P.; writing-review and editing, J.K.; visualization, S.P. All authors have
read and agreed to the published version of the manuscript.
Funding: This research was funded by the Artificial Intelligence Graduate School Program (GIST).
Data Availability Statement: The data used to support the findings of this study are included within
the article.
Acknowledgments: This work was supported by the Institute of Information and Communications Technology Planning and Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2019-0-01842, Artificial Intelligence Graduate School Program (GIST)).
Conflicts of Interest: The authors declare no conflict of interest.