Article

ROS-Based Unmanned Mobile Robot Platform for Agriculture

1 Department of AI & Big Data, Honam University, Gwangju 62399, Korea
2 Automotive Materials & Components R&D Group, Korea Institute of Industrial Technology, Gwangju 61012, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(9), 4335; https://doi.org/10.3390/app12094335
Submission received: 1 April 2022 / Revised: 18 April 2022 / Accepted: 24 April 2022 / Published: 25 April 2022
(This article belongs to the Special Issue Advances in Intelligent Systems)

Abstract

While the demand for new high-tech technologies is rapidly increasing, rural areas face difficulties such as population aging and decline. Against this background, autonomous mobile robots have been emerging in the agricultural field. Worldwide, huge investments are being made in the development of unmanned agricultural mobile robots, and modern farms expect such robots to increase productivity. However, in the agricultural work environment it is difficult to solve these problems with existing mobile robot designs: typical problems are the space constraints of the agricultural work environment, the high computational complexity of the algorithms, and changes in the environment. To solve these problems, this paper proposes the design and operation of a mobile robot platform that can be used in a greenhouse. The robot has two drive wheels and four casters, and can operate both on paths and on rails. In addition, we propose a multi-AI deep learning system for operating the robot, an algorithm for controlling it, and a VPN-based communication system for networking and security. The proposed method is expected to increase productivity and reduce labor costs in the agricultural work environment.

1. Introduction

In rural areas, the proportion of elderly people keeps increasing, while the youth labor force participation rate is decreasing. As a result, rural areas with an insufficient workforce have no choice but to attract manpower from outside, yet in reality it is difficult to recruit new workers. In addition, although the mechanization and automation of agricultural work have recently been progressing, efficient technology that can replace the labor force in the actual agricultural environment has been lacking.
Globally, climate change is evident, and temperatures are rising. Agricultural production in particular is very vulnerable to climate change: for every 1 degree Celsius increase in global average temperature, global production of wheat, rice, corn, and soybean is predicted to decrease on average by 6.0%, 3.2%, 7.4%, and 3.1%, respectively [1]. Climate change has become an even greater obstacle for farmers already struggling with a shortage of labor. Therefore, the development of advanced agricultural work robots is considered an urgent task in rural areas.
To solve various problems [2,3,4,5], mobile robots nowadays work in both industrial and non-industrial environments, and advances in mobile robots allow for a variety of uses. Warehouse managers use autonomous mobile robots to manage and move inventory [6]. Household vacuum cleaner robots are now a common consumer product [7]. Robots that work in home environments to help older people [8] are another area of current research. A mobile robot can be particularly efficient and effective when applied to the smart farm environment, and in recent years many robots have been developed to manage and take care of crops [9]. They are being developed to work autonomously, with minimal or no human intervention. For a robot to move autonomously, environment-sensing devices, such as laser scanners and depth cameras, are indispensable. Using the Simultaneous Localization and Mapping (SLAM) algorithm for map building and location recognition together with a route planner, the robot can drive autonomously [10].
Various roles are demanded of robots in agricultural fields, and to meet these demands, various types of robots have been developed. Agricultural robot technology has been widely studied and applied in planting, aquaculture, livestock, and poultry farming. Agricultural robots are also used in crop agriculture for phenotyping, monitoring, mapping, crop management, environmental control, and so on.
Several agricultural harvesting robots have been developed. To harvest fruit, robots perform visual perception, position detection, segmentation, and 3D reconstruction, and compute the spatial coordinates of objects [11,12,13,14,15]. Wibowo et al. developed an end-to-end autonomous coconut harvesting robot that can automatically climb coconut trees and detect fruit through a vision system [11]. Qingchun et al. developed a strawberry harvesting robot with a picking speed of 7.5 s per strawberry [12]. Zhao et al. used a CCD sensor to localize apples in the image plane [13]. Barth et al. developed a selective sweet-pepper harvesting robot [14], and Hemming et al. developed another sweet-pepper harvesting robot [15]. Si et al. developed a stereoscopic vision method for locating apples in trees [16].
In agricultural production, monitoring is just as important as harvesting, and various monitoring robots are also being developed. The European Union VineRobot project developed an autonomous robot that measures grape growth, grape yield, grape composition, and soil moisture; the platform is a 4WD electric monster truck equipped with a navigation sensor, an infrared camera, and an RGB camera [17]. Rizk and Habib developed a robotized early plant health monitoring system on a five-wheel, six-chassis platform [18]. Rey et al. developed a four-wheeled field robot for crop disease detection [19].
Robot technology is also a powerful tool for realizing targeted actions in agriculture. Santhi et al. developed a seed sowing robot based on a four-wheeled platform that contains vision sensors, a lead screw, and a rotating disc [20]. Van Henten et al. developed a de-leafing robot based on a railed vehicle [21,22]. Strisciuglio et al. developed a trimming robot based on a lawnmower platform [23]. In addition, research on developing robots using a digital twin method is in progress [24].
However, there are still many difficulties in using these agricultural robots. In indoor environments in particular, several problems arise, such as steering issues, network disconnection, and robots that only solve specific problems. One challenge of indoor environments is that the corridors in a typical greenhouse are narrow: to grow more crops, the joints and corners of the greenhouse are designed to be very tight. Because a vehicle with conventional car-like steering cannot turn within such a tight, narrow space, an agricultural mobile robot should not move like a four-wheeled car. Another problem is the possibility of communication cut-off caused by the steel frame, vinyl, and glass in the smart farm; if communication is suddenly cut off, there is a high risk that the robot will malfunction. Yet another problem is that it is difficult to operate in real-time due to the high computational complexity of deep learning and robotic algorithms.
To solve the problems that occur with unmanned agricultural work robots in smart farms, we build a multi-AI on-board system that computes deep learning algorithms in real-time, enabling stable use of the autonomous system. In addition, to overcome the steering issue, we design a new autonomous mobile robot with a 2WD chassis. Section 2 explains the autonomous mobile robot platform. Section 3 introduces the ROS-based mobile robot system. Section 4 shows the experimental results. Finally, Section 5 concludes the paper.

2. Autonomous Mobile Robot Platform

2.1. Analysis of the Greenhouse Environment

The proposed robot is intended for cutting-edge smart farms. For the conceptual design of the robot, we analyzed a paprika farm: the research target is a paprika smart farm in the form of an interlocking glass greenhouse in South Korea. The horticultural environment is standardized as shown in Figure 1, and the internal structure of the facility consists of concrete passages, hot water pipes for heating, and beds for growing crops. Table 1 gives the specifications of the smart farm. Although the corridor is wide enough for farmers to work in, it is too narrow for a conventional robot to maneuver, even though the smart farms we investigated are large by South Korean standards. Therefore, it is necessary to design a robot that can load and transport a sufficient quantity of crops, and that can move freely even in a tight space.

2.2. 2WD Mobile Robot Chassis

We established a design strategy based on optimizing robot mobility, transportation, and agricultural work. First, design specifications, such as load and working area, are reviewed according to the purposes and requirements. Second, we select the hardware specifications according to the conceptual design, design the basic structure of the robot, and select the main hardware parts accordingly. Third, in the detailed design, we evaluate the capacity and characteristics of major parts by 3D modeling and manufacture the detailed structure of the robot. The proposed robot can work while moving along each ridge and corridor in facility cultivation areas, such as glass greenhouses and smart farms. In addition, since it runs on the hot water heating pipes installed along each ridge, the maximum load of the robot should be kept within 300 kg, considering the allowable load of the heating pipes.
To move freely in a narrow space, we designed a prototype 2WD mobile robot for agricultural work. The advantage of the 2WD design is that the robot can turn around within narrow passages. The mobile robot chassis serves as the main body of the robot, as shown in Figure 2.
The chassis is designed to support all components of the mobile robot and to withstand a load of 350 kg. In addition, it is designed to operate on both the smart farm rails and the passages, as shown in Figure 3. The mobile robot normally travels through the corridor, and moves on the pipe rails in the facility when performing tasks such as transplanting, planting, and leaf-cutting. To perform these tasks, other components, such as a vacuum system, a lift table system, or a manipulator, may be added to the robot chassis. For driving on rails, the rail wheels are driven by a chain connected to the motor, as shown in Figure 4.

2.3. Sensors and System Setup

We configure a multi-on-board system so that the robot can operate in real-time. As shown in Figure 5, Jetson1 is responsible for sensing, data analysis, and robot control, while Jetson2 controls additional equipment such as robotic arms and cleaning machines. For autonomous movement, sensors must be installed to detect events and to quantitatively map changes in the surrounding environment. Therefore, we install a Lidar, a camera, and encoders to obtain data. The camera mainly detects objects and the surrounding environment, the Lidar is used to map the smart farm terrain, and the encoders are used to estimate the position of the robot. An LTE router is also installed for communication between the server and the robot.

3. Mobile Robot System

3.1. ROS Based Mobile Robot System

ROS (Robot Operating System) is middleware for controlling robot components from a PC. It provides a message-passing framework and a large number of tools and libraries for robot development [25,26]. ROS is not a complete operating system, but rather an abstraction layer, or meta-operating system, that runs on top of Ubuntu [27]. A ROS system consists of several independent nodes, each of which communicates with the other nodes using a publish/subscribe messaging model. We employ ROS to facilitate communication between processes and to allow easy integration of a wide range of tools and algorithms. Our robot control system uses NVIDIA Jetson Nano embedded boards as the computation module, with the Ubuntu 18.04 operating system and ROS Melodic. In addition, we run ROS through a VPN. A virtual private network (VPN) is an encrypted connection from a device to a network over the Internet. The encrypted connection helps to securely transmit confidential data, prevents unauthorized persons from eavesdropping on the user's traffic, and allows users to work quickly from remote locations [28]. Figure 6 shows the proposed mobile robot system using ROS through a VPN. The Jetson Nano board is connected via a USB-to-serial interface to the FIM2360 motor controller, which can control up to two DC motors. The FIM2360 reports current values and status information, such as temperature and battery level, back to the Jetson Nano board. By distributing processing across multiple Jetson Nano boards, autonomous driving and task control can be processed in parallel. In this study, two Jetson Nano boards are used; in the future, if other devices such as cleaning robots or manipulators are added, additional boards may be installed.
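As a minimal sketch of this publish/subscribe pattern (the topic names and message contents here are illustrative assumptions, not the exact interfaces of our nodes):

```python
#!/usr/bin/env python3
# Minimal ROS publish/subscribe sketch (rospy, ROS Melodic).
# The topics "/motor_status" and "/heartbeat" are hypothetical.
import rospy
from std_msgs.msg import String

def on_status(msg):
    # Runs whenever another node publishes to /motor_status.
    rospy.loginfo("motor status: %s", msg.data)

if __name__ == "__main__":
    rospy.init_node("status_monitor")
    rospy.Subscriber("/motor_status", String, on_status)
    pub = rospy.Publisher("/heartbeat", String, queue_size=10)
    rate = rospy.Rate(1)  # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data="alive"))
        rate.sleep()
```

Each node registers with the ROS master and then exchanges messages directly with its peer nodes over the chosen topics.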
Table 2 presents detailed information about our mobile robot platform. The proposed system runs the Ubuntu 18.04 operating system with ROS Melodic, and the software is written in Python 3.

3.2. Mobile Robot Control

Our mobile robot uses a differential drive mechanism: the robot has two drive wheels mounted on a common axle, and each wheel can be independently driven forward or in reverse. To control the DC motors in the robot operating system (ROS), we create a custom motor drive node that subscribes to the cmd_vel topic and publishes the base_odom topic. The motor drive node receives the cmd_vel topic from other processes. The cmd_vel topic is of type geometry_msgs/Twist; a Twist is composed of a 3D linear vector (x, y, z) and a 3D angular vector (x, y, z). RPM values are transmitted to the FIM2360 motor controller to operate the motors, so we must convert the Twist into RPM values with the following equation [29]:
$$RPM_l = s \left( x_{linear} - z_{angular} \times \frac{W_{robot}}{2} \right), \qquad RPM_r = s \left( x_{linear} + z_{angular} \times \frac{W_{robot}}{2} \right) \quad (1)$$
where $RPM_l$ and $RPM_r$ are the RPM values of the left and right motors, respectively; $s$ is a scale factor; $x_{linear}$ is the x component of the linear vector; $z_{angular}$ is the z component of the angular vector; and $W_{robot}$ is the distance between the left and right wheels. Equation (1) makes the robot roll about the ICC (Instantaneous Center of Curvature).
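A minimal sketch of this conversion, assuming a scale factor that maps m/s to RPM (the wheel diameter, wheel separation, and node name are placeholder assumptions):

```python
#!/usr/bin/env python3
# Sketch of the motor drive node: subscribes to cmd_vel and converts
# geometry_msgs/Twist into left/right RPM values per Equation (1).
import math
import rospy
from geometry_msgs.msg import Twist

WHEEL_DIAMETER = 0.2                      # placeholder [m]
S = 60.0 / (math.pi * WHEEL_DIAMETER)     # scale factor: m/s -> RPM
W_ROBOT = 0.5                             # placeholder wheel separation [m]

def on_cmd_vel(msg):
    rpm_l = S * (msg.linear.x - msg.angular.z * W_ROBOT / 2.0)
    rpm_r = S * (msg.linear.x + msg.angular.z * W_ROBOT / 2.0)
    # The real node would now send rpm_l / rpm_r to the FIM2360 over serial.
    rospy.loginfo("RPM left=%.1f right=%.1f", rpm_l, rpm_r)

if __name__ == "__main__":
    rospy.init_node("motor_drive_sketch")
    rospy.Subscriber("cmd_vel", Twist, on_cmd_vel)
    rospy.spin()
```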

3.3. Cam–Lidar Calibration

To use heterogeneous sensors in one system, we extract the extrinsic parameters between the camera and the Lidar sensor [30]. This allows object information predicted by the camera to be used with the Lidar sensor. Camera parameters include intrinsic parameters, extrinsic parameters, and distortion coefficients. Before estimating the extrinsic parameters between the camera and the Lidar sensor, we need to estimate the intrinsic parameters of the camera [31]. The intrinsic parameter matrix K is defined as:
$$K = \begin{bmatrix} f_x & 0 & 0 \\ s & f_y & 0 \\ c_x & c_y & 1 \end{bmatrix} \quad (2)$$
where $c_x$ and $c_y$ are the optical center in pixels, $f_x$ and $f_y$ are the focal lengths in pixels, and $s$ is a skew coefficient, which is non-zero if the image axes are not perpendicular. The calibration computes the camera parameters from feature points extracted from several images of a checkerboard. Figure 7 shows checkerboard images captured from different viewing angles to obtain the camera parameters. We compute the 3D positions of corresponding feature points from the captured checkerboard images, and then find the camera parameters that minimize the projection error when the 3D points are projected onto the checkerboard images.
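A minimal sketch of this intrinsic calibration using OpenCV [33]; the checkerboard dimensions, square size, and image folder are assumptions for illustration:

```python
# Intrinsic calibration sketch with OpenCV; board size and paths are assumed.
import glob
import cv2
import numpy as np

BOARD = (9, 6)       # inner corner count of the checkerboard (assumed)
SQUARE = 0.025       # checkerboard square size [m] (assumed)

# 3D corner coordinates in the checkerboard's own frame (z = 0 plane).
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points = [], []
for path in glob.glob("checkerboard/*.png"):      # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# K is the 3x3 intrinsic matrix; dist holds the distortion coefficients.
# Note: OpenCV returns K in the transposed (column-vector) convention
# relative to Equation (2).
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection RMS [px]:", rms)
```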
To estimate the extrinsic matrix, we solve a perspective-n-point (PnP) problem. PnP [32] is the problem of estimating the pose of a calibrated camera given a set of n 3D world points and their corresponding 2D image projections. PnP is error-prone when there are outliers in the set of point correspondences, so RANSAC can be used to make the final camera pose estimate more robust to outliers [33]. We employ PnP with RANSAC to estimate the rotation and translation transforms between the camera and the LiDAR. We pick the corresponding points by selecting the four corner points of the checkerboard in both the camera and LiDAR frames, as shown in Figure 8. The calibrated extrinsic parameters are as follows:
$$[R \mid t] = \begin{bmatrix} r_{1,1} & r_{1,2} & r_{1,3} & t_1 \\ r_{2,1} & r_{2,2} & r_{2,3} & t_2 \\ r_{3,1} & r_{3,2} & r_{3,3} & t_3 \end{bmatrix} \quad (3)$$
where R is the rotation matrix and t is the translation vector.
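A minimal sketch of the PnP + RANSAC step with OpenCV's solvePnPRansac; the point correspondences below are synthetic stand-ins for the manually picked checkerboard corners of Figure 8, and the intrinsics are placeholders:

```python
# Extrinsic (camera-LiDAR) estimation sketch using PnP + RANSAC.
import cv2
import numpy as np

K = np.array([[600.0, 0.0, 320.0],      # placeholder intrinsics from the
              [0.0, 600.0, 240.0],      # calibration step above
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Synthetic stand-ins for picked corner points: 3D points in the LiDAR
# frame, projected with a known pose to produce matching 2D pixels.
lidar_pts = np.array([[0.5, 0.1, 2.0], [0.9, 0.1, 2.1], [0.9, 0.5, 2.2],
                      [0.5, 0.5, 2.1], [0.7, 0.3, 2.4], [0.6, 0.2, 1.8]],
                     dtype=np.float32)
rvec_true = np.array([0.01, 0.02, 0.0])
tvec_true = np.array([0.05, -0.10, 0.0])
image_pts, _ = cv2.projectPoints(lidar_pts, rvec_true, tvec_true, K, dist)

ok, rvec, tvec, inliers = cv2.solvePnPRansac(lidar_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)                  # rotation vector -> 3x3 matrix
Rt = np.hstack([R, tvec.reshape(3, 1)])     # [R | t] as in Equation (3)
print("estimated [R|t]:\n", Rt)
```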

3.4. Virtual Private Network

OpenVPN is software that can be used to create a virtual private network (VPN) over the Internet [34]. In our setup, an OpenVPN access server runs on the lab computer, allowing the robot to connect to the lab network through a gateway, as shown in Figure 9. This is a so-called site-to-site configuration, which allows the robot network and the lab network to act virtually as a single network. The basic data flow in this site-to-site VPN is as follows. A computer (Jetson board) on the robot network asks to connect to a server on the lab network. The request is routed through the robot's LTE network to the gateway, and the gateway sends it to the lab's access server through an encrypted VPN tunnel. The access server decrypts the message and sends it to the target computer (or server) over the virtual network. When the target computer responds, the response goes through the virtual network's routing table to the access server. Conversely, the access server acts as a gateway to the lab's local area network, and sends the response through the encrypted VPN tunnel to the robot's gateway computer. Finally, the gateway forwards the response to the originating computer.
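With the tunnel up, robot-side ROS nodes simply point at the lab-side ROS master through its VPN address. A minimal sketch (the 10.8.0.x addresses are hypothetical OpenVPN defaults, not our actual configuration):

```python
#!/usr/bin/env python3
# Point a robot-side ROS process at the lab-side ROS master over the VPN.
# Both addresses are hypothetical tunnel-assigned IPs.
import os
import rospy

os.environ["ROS_MASTER_URI"] = "http://10.8.0.1:11311"  # lab access server
os.environ["ROS_IP"] = "10.8.0.6"                       # robot's VPN address

rospy.init_node("vpn_check")
rospy.loginfo("using master %s", os.environ["ROS_MASTER_URI"])
```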

4. Results

This section presents the specifications of the mobile robot and the experimental results. Figure 10 shows the hardware prototype built to implement the system of Figure 5. Two 12 V batteries are wired in series to provide 24 V, and a switch-mode power supply converts the 24 V down to 5 V, 9 V, and 12 V to power the various boards and sensors. The Jetson boards are connected to the FIM2360, the router, and the sensors to control the robot. For agricultural work, the robot must withstand a sufficient load, so we tested how much weight the robot can bear when the load is applied at different positions, as shown in Figure 11. In the case of a centralized load, Table 3 shows that the casters and robot show no abnormality even at 450 kg. However, in the case of a load concentrated at the front, Table 4 shows that above 400 kg the caster collapsed. With our robot platform, it is therefore safe to keep the load below 370 kg when it is concentrated on one side.
To test node behavior, we use a mobile-based controller to publish speed commands (cmd_vel) to the mobile robot, and we confirmed that the robot operates accurately on both the ground and the rails. To qualitatively evaluate the calibration results, the Lidar point cloud was projected onto the image: the data recorded in a rosbag are played back, and the point cloud is projected. Figure 12 shows the projection result in the ROS environment. Although a small error occurs at close range, the results suggest that autonomous driving performance can be improved by recognizing objects with the heterogeneous sensors.
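A minimal sketch of this projection step, reusing the intrinsics and PnP extrinsics of Section 3.3 (all values below are placeholders):

```python
# Project LiDAR points into the camera image with calibrated parameters.
import cv2
import numpy as np

K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])            # placeholder intrinsics
dist = np.zeros(5)
rvec = np.zeros(3)                         # placeholder extrinsics from PnP
tvec = np.zeros(3)

# Placeholder point cloud: points in front of the camera (z > 0).
points = np.random.uniform([-1.0, -1.0, 1.0], [1.0, 1.0, 5.0],
                           (500, 3)).astype(np.float32)

pixels, _ = cv2.projectPoints(points, rvec, tvec, K, dist)
canvas = np.zeros((480, 640, 3), np.uint8)  # stand-in for the camera frame
for u, v in pixels.reshape(-1, 2).astype(int):
    if 0 <= u < 640 and 0 <= v < 480:
        cv2.circle(canvas, (u, v), 1, (0, 255, 0), -1)
cv2.imwrite("projection.png", canvas)
```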
In general, agricultural machines are often more than 2 m in length, so 4WD-based agricultural machines cannot move easily in narrow corridors: when moving from the corridor to the next rail, a person must manually rotate them on their casters. In contrast, the proposed robot platform can rotate in place, so it can move freely between corridors and rails without human assistance. Figure 13 shows the robot driving forward, driving backward, and rotating either by an angle θ or in place.

5. Conclusions

This study proposed a ROS-based unmanned mobile robot platform for agricultural work. Smart farms present many problems, such as steering issues, network disconnection, and robots that only solve specific problems. To address them, we constructed a multi-AI on-board system based on ROS to compute deep learning methods in real-time, and designed a new 2WD mobile robot with four casters that can move freely in a narrow space. In addition, we installed a VPN to protect information and communicate quickly, and implemented calibration between the heterogeneous camera and LiDAR sensors. As a result, the robot can move freely in a narrow space, perform agricultural work, and fuse data between the heterogeneous sensors. Consequently, the proposed method is expected to be used for various agricultural tasks, overcoming the difficulties of the agricultural environment. In future work, other components may be added to the robot, such as a vacuum system to clean the greenhouse, a lift table to raise or lower goods and/or persons, or a manipulator for transplanting and leaf-cutting.

Author Contributions

Conceptualization, E.-T.B. and D.-Y.I.; Data curation, E.-T.B.; Formal analysis, E.-T.B.; Funding acquisition, D.-Y.I.; Investigation, E.-T.B.; Methodology, E.-T.B.; Project administration, D.-Y.I.; Resources, E.-T.B.; Software, E.-T.B.; Validation, E.-T.B.; Writing—original draft, E.-T.B.; Writing—review and editing, E.-T.B. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture and Forestry (IPET) and the Korea Smart Farm R&D Foundation (KosFarm), through the Smart Farm Innovation Technology Development Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA) and the Ministry of Science and ICT (MSIT), Rural Development Administration (RDA) (421032-04-2-SB010).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

We declare that we have no financial or personal relationships with other people or organizations that could have inappropriately influenced our work, and no professional or other personal interest of any nature in any product, service, and/or company that could be construed as influencing the position presented in, or the review of, the manuscript entitled "ROS-Based Unmanned Mobile Robot Platform for Agriculture".

References

1. Zhao, C.; Liu, B.; Piao, S.; Wang, X.; Lobell, D.B.; Huang, Y.; Huang, M.; Yao, Y.; Bassu, S.; Ciais, P.; et al. Temperature increase reduces global yields of major crops in four independent estimates. Proc. Natl. Acad. Sci. USA 2017, 114, 9326–9331.
2. Štefek, A.; Pham, V.T.; Krivanek, V.; Pham, K.L. Optimization of Fuzzy Logic Controller Used for a Differential Drive Wheeled Mobile Robot. Appl. Sci. 2021, 11, 6023.
3. Rubio, F.; Valero, F.; Llopis-Albert, C. A review of mobile robots: Concepts, methods, theoretical framework, and applications. Int. J. Adv. Robot. Syst. 2019, 16, 1729881419839596.
4. Zhang, P.; Gao, L.; Zhu, Y. Study on control schemes of flexible steering system of a multi-axle all-wheel-steering robot. Adv. Mech. Eng. 2016, 8, 1687814016651556.
5. Shamshiri, R.R.; Weltzien, C.; Hameed, I.A.; Yule, J.I.; Grift, E.T.; Balasundram, S.K.; Pitonakova, L.; Ahmad, D.; Chowdhary, G. Research and development in agricultural robotics: A perspective of digital farming. Int. J. Agric. Biol. Eng. 2018, 11, 1–14.
6. Bogue, R. Growth in e-commerce boosts innovation in the warehouse robot market. Ind. Robot Int. J. 2016, 43, 583–587.
7. Asafa, T.B.; Afonja, T.M.; Olaniyan, E.A.; Alade, H.O. Development of a vacuum cleaner robot. Alex. Eng. J. 2018, 57, 2911–2920.
8. Peleka, G.; Kargakos, A.; Skartados, E.; Kostavelis, I.; Giakoumis, D.; Sarantopoulos, I.; Doulgeri, Z.; Foukarakis, M.; Antona, M.; Hirche, S.; et al. Ramcip: A service robot for MCI patients at home. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1–9.
9. Klancar, G.; Zdesar, A.; Blazic, S.; Skrjanc, I. Wheeled Mobile Robotics: From Fundamentals towards Autonomous Systems; Butterworth Heinemann: Oxford, UK, 2017.
10. Bergerman, M.; Billingsley, J.; Reid, J.; van Henten, E. Robotics in agriculture and forestry. In Springer Handbook of Robotics; Siciliano, B., Khatib, O., Eds.; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1463–1492.
11. Wibowo, T.S.; Sulistijono, I.A.; Risnumawan, A. End-to-end coconut harvesting robot. In Proceedings of the 18th IEEE International Electronics Symposium (IES), Denpasar, Indonesia, 29–30 September 2016; pp. 444–449.
12. Feng, Q.; Wang, X.; Zheng, W.; Qiu, Q.; Jiang, K. New strawberry harvesting robot for elevated-trough culture. Int. J. Agric. Biol. Eng. 2012, 5, 1–8.
13. Zhao, D.; Lv, J.; Ji, W.; Zhang, Y. Design and control of an apple harvesting robot. Biosyst. Eng. 2011, 110, 112–122.
14. Barth, R.; Hemming, J.; Van Henten, E.J. Angle estimation between plant parts for grasp optimisation in harvest robots. Biosyst. Eng. 2019, 183, 26–46.
15. Hemming, J.; Ruizendaal, J.; Hofstee, J.W.; van Henten, E.J. Fruit detectability analysis for different camera positions in sweet-pepper. Sensors 2014, 14, 6032–6044.
16. Si, Y.; Liu, G.; Feng, J. Location of apples in trees using stereoscopic vision. Comput. Electron. Agric. 2015, 112, 68–74.
17. Dos Santos, F.N.; Sobreira, H.M.P.; Campos, D.F.B.; Morais, R.; Moreira, A.P.G.M.; Contente, O.M.S. Towards a reliable monitoring robot for mountain vineyards. In Proceedings of the 2015 IEEE International Conference on Autonomous Robot Systems and Competitions, Vila Real, Portugal, 8–10 April 2015; pp. 37–43.
18. Rizk, H.; Habib, M.K. Robotized early plant health monitoring system. In Proceedings of the IECON 2018, 44th Annual Conference of the IEEE Industrial Electronics Society, Washington, DC, USA, 21–23 October 2018.
19. Rey, B.; Aleixos, N.; Cubero, S.; Blasco, J. A field robot to detect olive trees infected by Xylella fastidiosa using proximal sensing. Remote Sens. 2019, 11, 221.
20. Santhi, P.V.; Kapileswar, N.; Chenchela, V.K.; Prasad, C.V.S. Sensor and vision based autonomous AGRIBOT for sowing seeds. In Proceedings of the 2017 International Conference on Energy, Communication, Data Analytics & Soft Computing (ICECDS), Chennai, India, 1–2 August 2017; pp. 242–245.
21. Van Henten, E.J.; Van Tuijl, B.A.J.; Hoogakker, G.J.; Van Der Weerd, M.J.; Hemming, J.; Kornet, J.G.; Bontsema, J. An autonomous robot for de-leafing cucumber plants grown in a high-wire cultivation system. Biosyst. Eng. 2006, 94, 317–323.
22. Ota, T.; Bontsema, J.; Hayashi, S.; Kubota, K.; Van Henten, E.J.; Van Os, E.A.; Ajiki, K. Development of a cucumber leaf picking device for greenhouse production. Biosyst. Eng. 2007, 98, 381–390.
23. Strisciuglio, N.; Tylecek, R.; Blaich, M.; Petkov, N.; Biber, P.; Hemming, J.; van Henten, E.; Sattler, T.; Pollefeys, M.; Gevers, T.; et al. Trimbot2020: An outdoor robot for automatic gardening. In Proceedings of the 50th International Symposium on Robotics, Munich, Germany, 20–21 June 2018; pp. 1–6.
24. Stączek, P.; Pizoń, J.; Danilczuk, W.; Gola, A. A digital twin approach for the improvement of an autonomous mobile robots (AMR’s) operating environment: A case study. Sensors 2021, 21, 7830.
25. Lim, J.Z.; Ng, D.W.K. Cloud based implementation of ROS through VPN. In Proceedings of the 2019 7th International Conference on Smart Computing & Communications (ICSCC), Sarawak, Malaysia, 28–30 June 2019.
26. Koubaa, A. Robot Operating System (ROS): The Complete Reference (Volume 1); Springer: Berlin/Heidelberg, Germany, 2017.
27. Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Wheeler, R.; Ng, A.Y. ROS: An open-source robot operating system. ICRA Workshop Open Source Softw. 2009, 3, 5.
28. Mason, A.G. (Ed.) Cisco Secure Virtual Private Networks; Cisco Press: Indianapolis, IN, USA, 2001.
29. McKerrow, P.J. Introduction to Robotics; Addison-Wesley: Boston, MA, USA, 1991.
30. Kim, E.S.; Park, S.Y. Extrinsic calibration between camera and LiDAR sensors by matching multiple 3D planes. Sensors 2019, 20, 52.
31. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
32. Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395.
33. Bradski, G. The OpenCV Library. Dr. Dobb’s J. Softw. Tools 2000, 25, 120–123.
34. Crist, E.F.; Keijser, J.J. Mastering OpenVPN; Packt Publishing: Birmingham, UK, 2015.
Figure 1. The greenhouse environment. (a) corridor; (b) corridor with machines; (c) rails; (d) top view of the smart farm.
Figure 2. Chassis of the 2WD mobile robot model. (1) frame, (2) cover, (3) cover, (4) cover panel, (5) cover panel, (6) motor, (7) bumper, (8) slide block, (10) sensor cover, (11) battery.
Figure 3. A mobile robot that moves along smart farm rails and passageways.
Figure 4. Rail wheel drive using a chain belt.
Figure 5. System structure.
Figure 6. Proposed mobile robot system.
Figure 7. Checkerboard images for camera calibration.
Figure 8. Corresponding points by selecting the four corner points of the checkerboard.
Figure 9. Virtual private network.
Figure 10. Agriculture mobile robot.
Figure 11. Load testing in a real environment.
Figure 12. Camera-LiDAR projection.
Figure 13. The proposed robot movement. (a) forward move; (b) backward move; (c) rotation in place; (d) rotation by the angle θ.
Table 1. Specification of the greenhouse.

Parameter                Value
Cultivation area         24,300 m²
Length of rail           100 m
Number of rails          150 ea.
Width of the corridor    3 m
Table 2. Hardware and software specification of the mobile robot platform.

Parameter              Configuration
Chassis                2WD mobile robot
Motor controller       FIM2360
Battery                12 V × 2
Motor                  AC induction motor (24 V) × 2
Board                  Jetson Nano board × 2
Router                 CNR-L500 (LTE)
OS                     Ubuntu 18.04
Camera                 YOITCH webcam
Lidar                  Velodyne VLP-16
ROS                    Melodic
Programming language   Python 3
Table 3. The case of a centralized load.

Weight [kg]    Gap between the Ground and the Robot [mm]
0              35
50             35
100            35
150            35
200            35
250            35
300            35
350            35
400            35
450            35
Table 4. The case of a front concentrated load.

Weight [kg]    Gap between the Ground and the Robot [mm]
0              35
50             35
100            35
150            35
200            35
250            35
300            35
350            35
400            34.8
450            x
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
