Article

Development of an Automatic Harvester for Wine Grapes by Using Three-Axis Linear Motion Mechanism Robot

1
Laboratory of Bio-Mechatronics, Faculty of Engineering, Kitami Institute of Technology, Koentyo 165, Kitami-shi 090-8507, Hokkaido, Japan
2
Yasukawa Electric Corporation, 2-1 Kurosakishiroishi, Yahatanishi-ku, Kitakyushu 806-0004, Fukuoka, Japan
*
Author to whom correspondence should be addressed.
AgriEngineering 2024, 6(4), 4203-4219; https://doi.org/10.3390/agriengineering6040236
Submission received: 25 September 2024 / Revised: 14 October 2024 / Accepted: 22 October 2024 / Published: 7 November 2024
Figure 1: Grape training systems: (a) Japanese traditional table cultivation method; (b) VSP (vertical shoot position).
Figure 2: Examples of grape harvesters: (a) grape harvester made by NEW HOLLAND; (b) grape harvesting robot under development in the Laboratory of Bio-Mechatronics.
Figure 3: The robot harvester using the three-axis linear robot construction.
Figure 4: Movement mechanism in the x-axis (left and right).
Figure 5: Movement mechanism in the y-axis: (a) lower state; (b) upper state.
Figure 6: Slide mechanism using bearings.
Figure 7: Movement mechanism in the z-axis: (a) backward state; (b) forward state; (c) front of retention mechanism; and (d) isometric view of retention mechanism.
Figure 8: Three-axis linear motion mechanism robot in the outdoor wine grape field.
Figure 9: Motor feedback control system.
Figure 10: Flowchart of robot control.
Figure 11: Robot travel route: (a) travel route when two motors have the same speed, and (b) travel route when two motors have different speeds.
Figure 12: Definition of the acceleration period, constant period, and deceleration period.
Figure 13: A detailed definition of the control period of Figure 12.
Figure 14: Movement measurement jig (a) in the x-axis; (b) in the y-axis; and (c) in the z-axis.
Figure 15: Example when the cut point is within the blade width indoors.

Abstract

In Japan, the aging and shrinking agricultural workforce is a significant problem. Wine grape harvesting, especially over large farming areas, places a heavy physical strain on farmers. To address this problem, this study focuses on developing an automated harvesting robot for wine grapes. The harvesting robot requires high resistance to dust, water, and mud because grapevines are grown in harsh conditions. Therefore, a three-axis linear robot using a rack and pinion mechanism was developed in this study, which can be used outdoors at low cost. Three brushless DC motors drive the three axes and are controlled over a Controller Area Network (CAN) bus to simplify the hardware system. The positioning accuracy of the robot was evaluated under automated harvesting conditions. The experimental results show that the accuracy is approximately 5 mm, 9 mm, and 9 mm in the x-axis (horizontal), y-axis (vertical), and z-axis (depth), respectively. To improve the accuracy, we constructed an error model of the robot and calibrated it; the accuracy improved to around 2 mm on all three axes after calibration. The experimental results show that the accuracy of the robot is high enough for automated harvesting of wine grapes.

1. Introduction

In Japan, the declining birthrate and aging population are becoming increasingly serious. Almost 70% of key agricultural workers are aged 65 and over [1], and the total number of farmers is around one million, less than 1% of the Japanese population. This indicates that the burden on each agricultural worker is increasing. One solution to this problem is the mechanization and automation of as many viticultural operations as possible.
In recent years, Japanese wines have been attracting increasing attention. Japanese wine refers to wine made in Japan, from grape cultivation through winemaking, using varieties such as Koshu, the most widely produced wine grape in Japan. Japanese wine is produced all over the country, and Hokkaido ranks third in both the number of wineries and the volume of Japanese wine produced [2]. There are two types of grape cultivation methods, as shown in Figure 1. One is a traditional Japanese training system, the table cultivation method, which uses pergolas, as shown in Figure 1a; the other is the worldwide-adopted VSP (vertical shoot position) method, shown in Figure 1b. The table cultivation method makes harvesting clusters easier because the clusters are easily seen. On the other hand, hedge cultivation is usually adopted in large fields because it allows large vehicles and equipment to pass through the vineyard. Many vineyards in Honshu, Japan, still use a training system appropriate for table grapes because vineyards in Honshu are small, whereas the hedge training system is adopted in Hokkaido, where vineyards are larger. As vineyard sizes in Honshu have expanded in recent years, some vineyards there have also begun to adopt the hedge training system. Therefore, this research develops a harvesting robot for hedge-cultivated grapes.
Grape harvesting machines are being researched and developed around the world, and some have already been commercialized. There are mainly two types of machines. One is the grape harvester, similar to a berry harvester, shown in Figure 2a. It is taller than the grapevine trunk and weighs over 2 tons. One problem with this type of machine is soil compaction: andosol, which makes up most of Japan's soils, is derived from volcanic ash, has high water retention and permeability, and is soft due to its low density, so it is easily compacted by such a heavy machine. Grape roots are negatively impacted when growing in compacted soil. The machine is also expensive, at approximately JPY 36 million, so the installation cost is high. For these reasons, this kind of harvester has not been introduced into Japan and is not used in practice. The other type is a robot-type harvester [3], which is being developed by the team of the Laboratory of Bio-Mechatronics, as shown in Figure 2b [4]. However, because it uses a robot arm with six degrees of freedom (DOF), immediate action cannot be taken in the event of a breakdown or other trouble.
To harvest grape clusters, our previous research has focused on recognition models for wine grapes. In automatic harvesting, constructing an object detection model is essential and the most important element. In recent years, many object detection methods have been based on the YOLO series [5,6,7,8,9]. For automatic harvesting, which also uses this recognition model, accurate recognition of the shape of the cob axis and information on its depth are necessary; a 3D camera is used to obtain this information. Various recognition methods have been proposed for automatic grape harvesting [10,11,12,13,14], but this recognition system employs a two-stage recognition approach.
Automatic harvesting robots use a variety of harvesting methods, such as suction [15], grasping [16,17] and pulling, shaking, and cutting. Wine grapes, the target of this harvester, should be harvested as intact bunches to preserve wine quality. Grapes are softer than apples, tomatoes, and other fruits, so they can be damaged if grabbed or pulled. Shaking is also unsuitable because it harvests the grapes by dropping them with vibration. Therefore, in this study, a hand that can harvest individual bunches by cutting the branch above each bunch was attached to the harvester. Because the penetration area in automated grape harvesting is small, Tomoki Noguchi of our laboratory developed a single-cutter harvesting hand similar to the cutting scissors used in low-crop operations. The new hand was developed by modifying an electric cutter for cutting thick tree branches. The blade is integrated with a plastic gripping section that can hold the cluster intact after cutting at the cut point. This eliminates the need for an actuator for the gripper and achieves both cutting and gripping with a simple structure.
The physical strain of harvesting work is an issue worldwide, and not only for grapes; to solve this problem, some research teams besides our laboratory are developing harvesters that use 6 DOF robot arms [18]. Recent research has addressed harvesting machines for various crops, such as tomatoes [19,20,21], strawberries [22,23,24], kiwifruits [25,26,27], and grapes [28,29,30,31]. For tomato harvesting, a 2 DOF robot was used to pick the fruit in Shogo Isa's study. Hidekazu Araki used a four-axis Selective Compliance Assembly Robot Arm for harvesting tomatoes, with depth information trained by deep learning used to obtain the tomato positions. JoonYoung Kim utilized a 6 DOF robot to pick tomatoes, with a 3D camera for detection. For strawberry harvesting, Shigehiko Hayashi et al. used the stereo vision method to detect strawberries and a 3 DOF robot to pick them. Kil-Su Han et al.'s manipulator had an additional revolute axis on top of a Cartesian-type link structure for picking, with stereo vision used for detection. Ya Xiong et al. used a 5 DOF serial arm with a unique end effector to pick strawberries. For kiwifruit harvesting, Henry A.M. Williams et al. used stereo vision to detect the kiwifruit, and a 3 DOF robotic arm designed specifically for kiwifruit harvesting was used for picking. Josh Barnett et al. used a 3 DOF robot with prismatic joints for harvesting kiwifruits, and depth information trained with YOLO v4 was used to detect kiwifruits in Li Ma et al.'s study. F. Pezzi et al. used vibration at beating frequencies to harvest grapes. Chris Lytridis used a 6 DOF robotic arm for harvesting, with YOLO v7 trained to detect grapes and stems, and Gabriel Coll-Ribes used stereo vision to detect grapes.
In Yuji Hiramatsu’s study, a 6 DOF robotic arm and end effector for cutting and grasping were used to pick up grapes. In this study, the stereo vision method was utilized. In addition to developing harvesting hardware and estimating fruit location, research is also being conducted to recognize fruit diseases. By identifying diseased fruit, better quality fruit can be selected for harvesting [32].
It can be seen from these previous studies that most harvesting robots use a 6 DOF robotic arm, a four-axis Selective Compliance Assembly Robot Arm, or a 3 DOF robot with prismatic joints. However, these robot arms present problems when used outdoors. The first problems revolve around environmental performance, such as dust resistance, water resistance, and durability against vibration. The second problem is the cost of developing and manufacturing the robotic arm.
Therefore, in this study, we try to solve these problems by developing original robot hardware that can be used outdoors at low cost. The objective of this study is a simple automatic grape harvesting robot that is lightweight, low cost, and highly accurate.

2. Materials and Methods

2.1. Robot Harvester Construction

Figure 3 shows the construction of the robot harvester using the three-axis linear robot. The automatic grape harvesting robot developed in this study consists of a golf cart, a three-axis linear motion mechanism, a Time-of-Flight (ToF) camera, and a cutter-hand. The golf cart pulls the harvesting robot, which recognizes the position of the grapes using the color and depth information acquired by the ToF camera and passes the coordinate information to the three-axis linear motion mechanism, which then harvests the grapes with the harvesting hand. When all the grapes recognized by the ToF camera have been harvested, the golf cart moves to the next point to resume automatic harvesting. This process is repeated for automatic harvesting. The details of the three-axis linear motion mechanism are described next.

2.2. Development of the Three-Axis Linear Motion Mechanism Robot

The software "Fusion 2.0.20460 x86_64" (Autodesk Inc., San Francisco, CA, USA) was used for mechanical design. In proceeding with the design based on the above concept, we used a rack and pinion mechanism, which combines two gears, for the movement mechanism; it converts the rotational motion of the motor into linear motion. Because the mechanism is simple, it is almost unaffected even if mud or other debris adheres to it when used outdoors, and in the unlikely event that it is affected, it can be easily cleaned and maintained. In addition, the robot costs approximately JPY one million to build, far less than the robot arms used in previous research.
In addition to the rack and pinion mechanism, there is the "ball screw" mechanism for converting rotational motion into linear motion. Ball screws have little backlash and allow highly accurate positioning, so they are used, for example, for moving nozzles in 3D printers. However, since this machine is designed for outdoor use, a ball screw is not recommended because dust could get in and cause malfunction. Also, being a high-precision component, it is expensive. Based on the above, we used a rack and pinion for the movement mechanism. The following summarizes the movement mechanisms in the x-axis (left and right), y-axis (up and down), and z-axis (front and back).

2.2.1. Movement Mechanism in x-Axis (Left and Right)

As shown in Figure 4, power is transmitted to the motor and pinion via bevel gears for the x-axis. The aluminum frame is sandwiched between metal bearings to facilitate left-right movement and also serves as a guide. The motor is installed in the lower part of the body to lower the center of gravity and improve stability.

2.2.2. Movement Mechanism in y-Axis (Up and Down)

As shown in Figure 5, a coupling is used to transmit power from the motor to the pinion. As shown in Figure 6, the main frame is sandwiched between metal bearings for smooth vertical movement.

2.2.3. Movement Mechanism in z-Axis (Backward and Forward)

As shown in Figure 7, power is transmitted directly from the motor to the pinion using a coupling. As shown in Figure 7a,b, the back-and-forth frame is held in place by metal bearings with V-shaped grooves at the four corners for smooth movement; the retention mechanism is shown in Figure 7c,d.
Figure 8 shows the three-axis linear motion mechanism robot as used in the outdoor wine grape vineyard for the experiment.

2.2.4. Motor Control of the Three-Axis Linear Motion Mechanism Robot

Table 1 shows the parameters of the motor, the gear head, and the motor controller. Each axis uses a brushless DC (BLDC) motor (IDX56L-24V, Maxon Inc., Sachseln, Switzerland). The motor is powered by 24 V DC. The maximum torque and speed of the motor are about 1.6 Nm and 6000 rpm, respectively. The reduction ratio of the gear head is 1:16. The protection class of the motor is IP65: it is dustproof to prevent any dust from entering and waterproof against normal rain. The operating temperature range is from −30 to 85 °C. The controller is the EPOS4 series from Maxon Inc., and the communication follows the CANopen protocol, which is widely used in industrial control. The resolution of the encoder is 4096 increments per rotation (inc/r).
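As a rough illustration of how a linear target translates into encoder counts with these parameters, the sketch below uses the 4096 inc/r encoder resolution and 1:16 gear head from Table 1; the pinion pitch diameter (20 mm) is a hypothetical value, since it is not stated here.

```cpp
#include <cassert>
#include <cmath>

// Convert a target linear travel (mm) into motor encoder increments.
// Encoder resolution (4096 inc/r) and gear reduction (1:16) are from Table 1;
// the pinion pitch diameter is an ASSUMED value for illustration only.
constexpr double kEncoderIncPerRev = 4096.0;    // encoder increments per motor revolution
constexpr double kGearReduction = 16.0;         // motor turns per pinion turn (1:16 gear head)
constexpr double kPinionPitchDiameterMm = 20.0; // assumption, not from the paper

long mmToIncrements(double distance_mm) {
    const double mm_per_pinion_rev = M_PI * kPinionPitchDiameterMm; // rack travel per pinion turn
    const double pinion_revs = distance_mm / mm_per_pinion_rev;
    return std::lround(pinion_revs * kGearReduction * kEncoderIncPerRev);
}
```

With these assumed values, one full pinion revolution (π × 20 mm of rack travel) corresponds to 16 × 4096 = 65,536 encoder increments.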
The motors used in this research are controlled via a CAN (Controller Area Network) bus, a communication standard used in a variety of machines, from medical equipment to automobiles. As functionality increases, the control content becomes more complex and the inputs, outputs, and wiring become congested; using CAN communication allows simple wiring. The control program was designed with a multithread structure, with one thread per motor: threads 1, 2, and 3 control the x-, y-, and z-axes separately. A sub-thread inside each of threads 1, 2, and 3 checks whether the robot has moved to the target position, and the motor enters hold mode once the target position is reached. The target control of each axis motor uses the same feedback control method, shown in Figure 9. First, the target position is sent to the motor driver, and the driver output power is sent to the motor to start rotation. The encoder installed inside the motor measures the current position of the motor's rotation axis. The motor reduces speed as the current position nears the target position and stops when they coincide.
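The per-axis thread structure described above can be sketched as follows. This is a simplified simulation rather than the actual control code: the CANopen motor I/O is replaced by a plain atomic counter, and `driveToTarget` (a name of ours) stands in for one axis thread that loops until the encoder reading matches the target and then enters hold mode.

```cpp
#include <atomic>
#include <cassert>
#include <thread>

// One record per axis; the real robot reads the position from the motor
// encoder over the CAN bus, simulated here as an atomic counter.
struct Axis {
    std::atomic<long> position{0};
    std::atomic<bool> holding{false};
};

// Body of one axis thread: step toward the target, then switch to hold mode.
void driveToTarget(Axis& axis, long target) {
    while (axis.position.load() != target) {
        long pos = axis.position.load();
        axis.position.store(pos + (target > pos ? 1 : -1)); // stand-in for one motor step
    }
    axis.holding.store(true); // hold mode once the target is reached
}
```

In the real system, each thread would issue CANopen position commands instead of incrementing a counter, and a sub-thread would poll the encoder to decide when to switch to hold mode.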
Figure 10 shows the control flowchart of the three-axis linear robot. The control program was developed with the Visual Studio Integrated Development Environment (IDE) (Microsoft Corporation, Redmond, WA, USA) in the C++ language, and the robot is controlled by sending commands to the motors through a USB-to-CAN cable. The motors are connected to each other with CAN cables, and data are exchanged bidirectionally.

2.2.5. Robot Travel Route Design

In order to automatically harvest the grapes, smooth harvesting operations with no wasted time are required. To achieve smooth harvesting work, we propose an efficient method for setting robot travel routes. If the robot moves in the x-, y-, and z-axes at the same time, the automatic harvesting cutter hits the branches. Therefore, the robot first moves in the x- and y-axes simultaneously, and then the automatic harvesting cutter moves only in the z-axis.
As shown in Figure 11, the robot moves from the start point (0, 0) to the goal point (p2, q1). When the x-axis motor and the y-axis motor move at the same speed, the movement path follows the green dotted line in Figure 11a: the robot moves diagonally up and to the right at a 45-degree angle. In this example, the amount of movement in the y-axis is half that in the x-axis, so the y-axis reaches its target and stops when the robot has advanced to p1; the robot then moves from p1 to p2 along the x-axis alone to complete the movement. The shortest movement path is the orange dotted line shown in Figure 11b, the straight line from the start point to the goal point, which can be said to be the most efficient travel route. To follow this ideal path, the speeds of the x- and y-axis motors must be adjusted according to the amount of travel on each axis.
The adjustment of speed to travel the shortest distance is derived as follows. As shown in Figure 11b, the slope of the direction from the start to the goal point is defined by the angle θ, and the amounts of travel in the X and Y directions are related to θ by the trigonometric ratio:

tan θ = l_y / l_x  (1)

where l_x is the amount of movement in the X direction, l_y is the amount of movement in the Y direction, and θ is the angle between the x-axis and the ideal travel path.
The same relationship applies to speed, as seen in Equation (2):

tan θ = v_y / v_x  (2)

where v_x is the speed in the X direction, and v_y is the speed in the Y direction.
We can relate the speeds of the x-axis (v_x) and the y-axis (v_y) using Equation (3) when l_x and l_y are not equal to zero:

l_y / l_x = v_y / v_x  (3)

Solving Equation (3) for the velocity in the x-axis, we obtain

v_x = (l_x / l_y) v_y  (4)

Similarly, solving for the velocity in the y-axis, we obtain

v_y = (l_y / l_x) v_x  (5)
Once the amounts of movement in the x- and y-axes are determined, their relative magnitudes are also determined. The maximum allowable rotation speed of the motor used is 6000 rpm, as shown in Table 1, so measures are necessary to ensure that the speeds in the x- and y-axes do not exceed it. From Equation (3), it can be seen that the ratio of the amounts of movement equals the ratio of the speeds.
When l_y > l_x,

v_y > v_x  (6)

When l_y < l_x,

v_y < v_x  (7)

From Equations (6) and (7), if the speed of the axis with the larger travel amount is set to 6000 rpm, the speed of the other axis will never exceed 6000 rpm. Adjusting the speeds using Equations (4)–(7) gives the following:
When l_y > l_x,

v_x = (l_x / l_y) v_y,  v_y = v_set  (8)

When l_y < l_x,

v_y = (l_y / l_x) v_x,  v_x = v_set  (9)

where v_set is the set speed of the motor, which is less than the maximum speed of the motor.
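The speed adjustment of Equations (8) and (9) can be written directly in code: the axis with the larger travel runs at the set speed, and the other axis is slowed by the travel ratio so that both reach their targets at the same time. A minimal sketch (the function and struct names are ours, not from the paper):

```cpp
#include <cassert>
#include <cmath>

struct AxisSpeeds { double vx; double vy; }; // speeds in rpm

// Speed adjustment per Equations (8) and (9): the axis with the larger
// travel amount runs at vset; the other is scaled by the travel ratio.
AxisSpeeds adjustSpeeds(double lx, double ly, double vset) {
    if (ly > lx) return { (lx / ly) * vset, vset }; // Equation (8)
    if (lx > ly) return { vset, (ly / lx) * vset }; // Equation (9)
    return { vset, vset };                          // equal travel: 45-degree path
}
```

For example, with l_x = 300 mm, l_y = 100 mm, and v_set = 6000 rpm, the x-axis runs at 6000 rpm and the y-axis at 2000 rpm, so both arrive together and neither exceeds the motor's maximum speed.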
By using Equations (8) and (9), the speed can be adjusted to travel the shortest distance while ensuring that neither axis exceeds the maximum speed. Next, acceleration and deceleration are adjusted. The movement of an object is usually defined by the steps shown in Figure 12: the initial speed is 0 rpm; the target speed is reached after an acceleration period; a constant-velocity period follows, in which the robot moves at that speed; and finally the robot slows through a deceleration period, reaching 0 rpm at the target point. Figure 13 shows a detailed definition of Figure 12.
As a precondition, quadrilateral ABCD is a rectangle, line segments EI and FJ intersect line segment AB, and line segments GI and HJ intersect line segment AD perpendicularly.
Since line segments EI, FJ, and BC are parallel,

AE:EF:FB = AI:IJ:JC  (10)

Similarly, line segments GI and HJ are parallel, and

AG:GH:HD = AI:IJ:JC  (11)

From Equations (10) and (11),

AE:EF:FB = AG:GH:HD  (12)
From Equation (12), it follows that the ratios of the acceleration period, constant-velocity period, and deceleration period in the x- and y-axes are equal. The acceleration was adjusted based on this relationship, and the calculation formula was incorporated into the program. Operation on the actual machine confirmed that the two axes arrive at the target position at the same time. Once the target position is entered, the system determines each parameter based on these calculations and performs the move.
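One way to realize the equal phase ratios implied by Equation (12) is to scale the follower axis's acceleration by the same factor as its speed, so that the acceleration, constant-velocity, and deceleration phases of the two axes begin and end together. The sketch below follows that interpretation; the struct and function names are ours, for illustration.

```cpp
#include <cassert>
#include <cmath>

struct MotionParams { double speed_rpm; double accel_rpm_per_s; };

// Scale the follower's speed and acceleration by the same ratio
// (speed_ratio = v_follower / v_leader, from Equations (8)-(9)), so that
// the per-phase durations of the two axes coincide.
MotionParams syncFollower(const MotionParams& leader, double speed_ratio) {
    return { leader.speed_rpm * speed_ratio,
             leader.accel_rpm_per_s * speed_ratio };
}
```

With this scaling, the time to reach full speed (v divided by a) is identical on both axes, so both axes finish accelerating, cruising, and decelerating at the same instants, which is what Figure 13 depicts.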

2.2.6. Evaluation of Movement Accuracy

Accurate positioning is necessary to realize automatic harvesting of wine grapes. Therefore, we evaluated whether the developed three-axis Cartesian coordinate robot has sufficient movement accuracy for automatic harvesting. Although the motor used in this machine has a built-in encoder, discrepancies between theoretical and actual measured values are expected. The evaluation method is to move each axis by 100 mm increments and measure the amount of movement to calculate the error trend. A jig designed for this experiment was used to measure the amount of movement (Figure 14). The jig consists of a JIS Class 1 scale (Shinwa Seisakusho Co., Ltd., Komaki, Japan) and originally designed mounting parts printed with a 3D printer.

3. Experimental Results and Discussion

In order to measure the accuracy of the developed three-axis linear motion mechanism robot, an experiment was conducted in the indoor laboratory using the equipment shown in Figure 14. The robot moved along the x-axis over distances from 100 mm to 500 mm in steps of 100 mm, and along the y- and z-axes over distances from 100 mm to 400 mm in steps of 100 mm; the movable region of the x-axis is longer than those of the y- and z-axes. The error measurement results are shown in Table 2, Table 3, and Table 4 for the x-, y-, and z-axes, respectively, and show that the error is around 2 mm for each axis. The average error across the x-, y-, and z-axes was 2.6 mm when the movement distance was 100 mm.
It can be seen from Table 2, Table 3 and Table 4 that the error increases in proportion to the distance. In this case, the following equation expresses the relationship between the target value and the actual value:

l = a l′  (13)

where l is the average of the actual measured values, l′ is the target distance value, and a is the gain.
Using Equation (13), it can be seen that an error of 1.1 mm per 100 mm occurs on the x-axis, 2.3 mm on the y-axis, and 2.4 mm on the z-axis. Table 5, Table 6 and Table 7 show the results of sending the corrected target position to the motor based on these calculations.
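The calibration of Equation (13) amounts to fitting the gain a from (target, measured) pairs and then commanding l′ = d / a for a desired travel d. A sketch with synthetic data (the numbers in the example are illustrative, not the paper's measurements):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Least-squares fit through the origin for the gain a in l = a * l'
// (Equation (13)), where l' is the commanded target and l the measured travel.
double fitGain(const std::vector<double>& target, const std::vector<double>& actual) {
    double num = 0.0, den = 0.0;
    for (std::size_t i = 0; i < target.size(); ++i) {
        num += target[i] * actual[i];
        den += target[i] * target[i];
    }
    return num / den;
}

// Corrected command: to actually travel `desired` mm, send desired / a.
double correctedTarget(double desired, double a) { return desired / a; }
```

For a systematic error of 1.1 mm per 100 mm, the fitted gain is a ≈ 1.011, and the corrected command for a desired 100 mm travel is 100 / 1.011 ≈ 98.9 mm.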
The maximum errors of the x-, y-, and z-axes were 2.08 mm, 0.06 mm, and 0.14 mm, respectively. The error was significantly reduced compared to that before the calibration. The larger error in the x-axis compared to the other axes is due to the large clearance between the rack and pinion, which could be suppressed by using high-precision parts. In this study, the accuracy is sufficient for the harvesting task; therefore, we did not choose high-precision parts, in order to reduce cost and suit outdoor use.
In addition, an automatic harvesting cutter was attached to the end of the three-axis linear motion mechanism, and the coordinates of the previously measured cut point (the point where the harvester inserts the cutter) were passed to the three-axis linear motion mechanism. The success rate of the cut point fitting within the blade width of the cutter was confirmed by experiment. The results of the experiment are shown below.
The experiment was conducted to test the accuracy of the robot positioning for automatic harvesting in the indoor laboratory and the outdoor vineyard. First, the automatic harvesting cutter was moved to the target position of the cut point; the blade width of the cutter is approximately 35 mm. The coordinates were recorded. Next, the cutter returned to the origin and moved again to the recorded coordinates, and it was checked whether the cut point fit within the cutter blade. Figure 15 shows a successful trial indoors.
In the indoor laboratory, the experiment was conducted 30 times, and the success rate was 100%. The results show that the errors in the x-, y-, and z-axes do not affect automatic harvesting, and the three-axis linear motion mechanism robot provides sufficient accuracy for automatically harvesting grapes. The purpose of this experiment was to evaluate the accuracy of the robot's movement. To actually perform the harvest, the cutter angle must be adjusted parallel to the branch bearing the clusters; this allows the cutter to penetrate even when the cob axis is short.

4. Conclusions

The aging and decreasing number of agricultural workers is a significant problem for farming. In this study, a three-axis linear motion mechanism robot was developed to be used for wine grape harvesting, especially for large farming areas.
The three-axis linear motion mechanism robot was developed based on the concept of a "lightweight", "inexpensive", and "easy-to-maintain" machine. In addition, its simple structure makes the machine less susceptible to harsh conditions, making it suitable for outdoor use.
The experimental results show that the robot achieves better than 10 mm accuracy when moving within a 500 mm region, with the error increasing in proportion to the movement distance. A calibration model was proposed to reduce the error, and the accuracy improved to around 2 mm on the x-, y-, and z-axes after applying it. The accuracy was higher than our expectation and high enough for the requirements of automatic grape harvesting. In addition, the shortest travel route was realized for smooth harvesting.
In the future, we will construct a recognition system that can cooperate with the three-axis linear motion mechanism developed in this study to achieve an automatic harvesting job.

Author Contributions

Conceptualization, L.Y.; methodology, L.Y., Y.H. and S.S.; software, L.Y., S.S. and T.N.; validation, S.S. and L.Y.; formal analysis, S.S.; investigation, L.Y.; resources, L.Y.; data curation, L.Y.; writing—original draft preparation, S.S.; writing—review and editing, L.Y.; visualization, L.Y. and S.S.; supervision, L.Y.; project administration, L.Y.; funding acquisition, L.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the research program on development of innovative technology grants (03020C1) from the Project of the Bio-oriented Technology Research Advancement Institution (BRAIN).

Data Availability Statement

The materials and data are available upon request from interested researchers.

Acknowledgments

We would like to thank Tsurunuma Winery (Hokkaido Wine Co., Ltd.) for providing the experimental site for this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Statistics on Agricultural Labor Force: MAFF. Available online: https://www.maff.go.jp/j/tokei/sihyo/data/08.html (accessed on 12 June 2024).
Figure 1. Grape training systems: (a) traditional Japanese table cultivation method; (b) VSP (vertical shoot positioning).
Figure 2. Examples of grape harvesters: (a) grape harvester made by NEW HOLLAND; (b) grape harvesting robot under development in the Laboratory of Bio-Mechatronics.
Figure 3. The robot harvester using the three-axis linear robot construction.
Figure 4. Movement mechanism in the x-axis (left and right).
Figure 5. Movement mechanism in the y-axis: (a) lower state; (b) upper state.
Figure 6. Slide mechanism using bearings.
Figure 7. Movement mechanism in the z-axis: (a) backward state; (b) forward state; (c) front of retention mechanism; and (d) isometric view of retention mechanism.
Figure 8. Three-axis linear motion mechanism robot in the outdoor wine grape field.
Figure 9. Motor feedback control system.
Figure 10. Flowchart of robot control.
Figure 11. Robot travel route: (a) travel route when the two motors run at the same speed; (b) travel route when the two motors run at different speeds.
Figure 12. Definition of the acceleration period, constant period, and deceleration period.
Figure 13. A detailed definition of the control period of Figure 12.
Figure 14. Movement measurement jig: (a) in the x-axis; (b) in the y-axis; and (c) in the z-axis.
Figure 15. Example when the cut point is within the blade width indoors.
Table 1. Motor and motor controller specifications.

Parameter                                 Value
Model number                              IDX56L-24V
Nominal voltage [VDC]                     24
Maximum torque [Nm]                       1.589
Maximum allowable rotation speed [rpm]    6000
Reduction ratio                           1:16
Protection class                          IP65
Operating temperature [°C]                −30…+85
Controller                                EPOS4
Communication method                      CANopen
Encoder resolution [inc/r]                4096
Table 2. Movement distance and error of the x-axis.

No.        Target distance of x-axis [mm]
           100.0     200.0     300.0     400.0     500.0
1          101.0     202.3     302.6     403.2     505.5
2          101.1     202.3     302.6     403.1     505.3
3          100.5     201.9     302.4     402.5     504.7
4          101.2     202.5     302.7     403.3     505.5
5          101.3     202.5     302.8     403.3     505.6
Average    101.02    202.30    302.62    403.08    505.32
Error      1.02      2.30      2.62      3.08      5.32
Table 3. Movement distance and error of the y-axis.

No.        Target distance of y-axis [mm]
           100.0     200.0     300.0     400.0
1          102.1     204.5     306.7     409.3
2          102.1     204.6     306.6     409.1
3          102.2     204.7     306.8     409.3
4          102.2     204.6     306.6     409.4
5          102.0     204.4     306.5     408.8
Average    102.12    204.56    306.64    409.18
Error      2.12      4.56      6.64      9.18
Table 4. Movement distance and error of the z-axis.

No.        Target distance of z-axis [mm]
           100.0     200.0     300.0     400.0
1          102.5     204.6     307.0     409.4
2          102.1     204.5     306.7     409.1
3          102.6     204.9     307.1     409.5
4          102.5     205.0     307.0     409.5
5          102.5     204.8     307.2     409.6
Average    102.44    204.76    307.00    409.42
Error      2.44      4.76      7.00      9.42
Table 5. Movement distance and error after calibration of the x-axis.

No.        Target distance of x-axis [mm]
           100.0     200.0     300.0     400.0     500.0
1          99.8      199.7     298.8     398.3     499.3
2          99.2      199.1     298.4     397.5     498.7
3          98.9      198.8     298.2     397.3     498.4
4          100.0     199.9     298.8     398.6     499.6
5          99.6      199.5     298.4     397.9     498.9
Average    99.50     199.40    298.52    397.92    498.98
Error      −0.50     −0.60     −1.48     −2.08     −1.02
Table 6. Movement distance and error after calibration of the y-axis.

No.        Target distance of y-axis [mm]
           100.0     200.0     300.0     400.0
1          100.0     200.0     300.0     400.0
2          100.0     200.0     300.0     400.0
3          100.0     200.0     300.0     400.0
4          99.8      200.0     299.9     400.0
5          100.0     200.0     299.8     400.0
Average    99.96     200.00    299.94    400.00
Error      −0.04     0.00      −0.06     0.00
Table 7. Movement distance and error after calibration of the z-axis.

No.        Target distance of z-axis [mm]
           100.0     200.0     300.0     400.0
1          100.2     200.2     300.0     399.9
2          100.1     200.2     300.0     400.1
3          100.1     200.0     300.0     400.0
4          100.0     200.1     299.9     400.0
5          100.0     200.2     300.0     400.0
Average    100.08    200.14    299.98    400.00
Error      0.08      0.14      −0.02     0.00
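Tables 2–7 report, for each commanded target distance, the mean of five measured distances (the Average row) and the deviation of that mean from the target (the Error row). The authors' actual calibration procedure is not described in these tables; purely as an illustrative sketch, the snippet below reproduces the Average and Error rows from Table 2 and derives a single least-squares scale factor of the kind that could compensate the systematic pre-calibration overshoot:

```python
# Illustrative only: reproduce Table 2's Average/Error rows and fit a
# scale factor k such that measured ≈ k * target. The robot's real
# calibration method may differ; this is a hypothetical reconstruction.

targets = [100.0, 200.0, 300.0, 400.0, 500.0]
trials = [
    [101.0, 202.3, 302.6, 403.2, 505.5],
    [101.1, 202.3, 302.6, 403.1, 505.3],
    [100.5, 201.9, 302.4, 402.5, 504.7],
    [101.2, 202.5, 302.7, 403.3, 505.5],
    [101.3, 202.5, 302.8, 403.3, 505.6],
]

# Column-wise mean over the five trials (the "Average" row of Table 2).
averages = [sum(col) / len(col) for col in zip(*trials)]
# Deviation from the commanded target (the "Error" row of Table 2).
errors = [avg - t for avg, t in zip(averages, targets)]

# Least-squares scale factor through the origin: measured ≈ k * target.
k = sum(a * t for a, t in zip(averages, targets)) / sum(t * t for t in targets)

# Commanding target / k instead of target would cancel a purely
# proportional overshoot, which is roughly what Table 5 shows.
corrected_commands = [t / k for t in targets]

print([round(a, 2) for a in averages])  # [101.02, 202.3, 302.62, 403.08, 505.32]
print([round(e, 2) for e in errors])    # [1.02, 2.3, 2.62, 3.08, 5.32]
```

A single proportional correction is the simplest model; if the residual errors were not proportional to distance, a per-axis offset term (measured ≈ k * target + b) would be the next refinement.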
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Sasaya, S.; Yang, L.; Hoshino, Y.; Noguchi, T. Development of an Automatic Harvester for Wine Grapes by Using Three-Axis Linear Motion Mechanism Robot. AgriEngineering 2024, 6, 4203–4219. https://doi.org/10.3390/agriengineering6040236
