
CN118444687A - Self-learning obstacle avoidance control method of mowing robot - Google Patents

Self-learning obstacle avoidance control method of mowing robot

Info

Publication number
CN118444687A
CN118444687A (application CN202410926670.XA)
Authority
CN
China
Prior art keywords
mowing
area
obstacle
robot
path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202410926670.XA
Other languages
Chinese (zh)
Other versions
CN118444687B (en)
Inventor
应逸恒
孙亮亮
应阔
崔东
郭建兵
庄林强
黄会飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZHEJIANG SAFUN INDUSTRIAL CO LTD
Zhejiang Kuochuang Technology Co ltd
Original Assignee
ZHEJIANG SAFUN INDUSTRIAL CO LTD
Zhejiang Kuochuang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZHEJIANG SAFUN INDUSTRIAL CO LTD and Zhejiang Kuochuang Technology Co ltd
Priority to CN202410926670.XA
Publication of CN118444687A
Application granted
Publication of CN118444687B
Legal status: Active
Anticipated expiration

Links

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40 Control within particular dimensions
    • G05D1/43 Control of position or course in two dimensions
    • G05D1/20 Control system inputs
    • G05D1/24 Arrangements for determining position or orientation
    • G05D1/242 Means based on the reflection of waves generated by the vehicle
    • G05D1/243 Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
    • G05D1/246 Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
    • G05D1/247 Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
    • G05D1/248 Arrangements using signals generated by satellites, e.g. GPS
    • G05D1/60 Intended control result
    • G05D1/644 Optimisation of travel parameters, e.g. of energy consumption, journey time or distance
    • G05D1/648 Performing a task within a working area or space, e.g. cleaning
    • G05D1/656 Interaction with payloads or external entities
    • G05D1/661 Docking at a base station

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Harvester Elements (AREA)

Abstract

The invention discloses a self-learning obstacle avoidance control method for a mowing robot, which relates to the technical field of mowing robots and comprises the following steps: S1, global path planning, in which a preset processor and positioning module acquire the position data of the current lawn, an online map is downloaded from a wide area network according to the position data acquired by the processor, and an optimal mowing path is planned through an algorithm; S2, adaptive obstacle avoidance, in which the shape features of obstacles are judged according to the motion parameters of the mowing robot while it executes the task, and the mowing task is carried out along the edge of each obstacle; S3, real-time mowing path updating, in which the positions of obstacles encountered during the mowing task are marked; S4, supplementary mowing path planning, in which areas left unmowed during the mowing task are set as target areas and their positions are recorded. Through steps S1 to S4, the self-learning obstacle avoidance control method of the mowing robot reduces the mowing repetition rate, covers the mowing area uniformly and improves the mowing effect.

Description

Self-learning obstacle avoidance control method of mowing robot
Technical Field
The invention relates to the technical field of mowing robots, in particular to a self-learning obstacle avoidance control method of a mowing robot.
Background
At present, mature intelligent mobile mowing robots on the market generally establish an electronic fence by burying an energized boundary wire and then identify the boundary through electromagnetic induction. Within the wire boundary, most mowing robots move along random paths and use distance sensors to detour around obstacles on the path at a suitable distance, and only a few advanced models adopt full-coverage boustrophedon (ox-plough, back-and-forth) path planning. When the mowing robot completes the mowing task, it typically returns to the charging station along the boundary.
However, the existing mowing robot still has the following problems:
(1) In the existing movement mode of the mowing robot, a random-path trajectory is not purposeful; after the robot changes direction on encountering an obstacle, the working time is prolonged, the mowing direction changes and the mowing repetition rate increases, which easily leads to uneven coverage of the mowing area and falls short of complete mowing;
(2) For obstacles already present on the lawn, forbidden zones are delimited by laying additional boundary wires, which increases the complexity of the map as well as the amount and cost of wire-burying work; moreover, the added energized wire paths may cut blank regions out of the map, so that the grass in those regions is ignored by the mowing robot.
Therefore, we propose a self-learning obstacle avoidance control method of the mowing robot.
Disclosure of Invention
The main object of the invention is to provide a self-learning obstacle avoidance control method for a mowing robot. It addresses the problem that, in the movement mode of existing mowing robots, a random-path trajectory is not purposeful, and that changing direction after encountering an obstacle prolongs the working time, changes the mowing direction and increases the mowing repetition rate, so that the mowing area is likely to be covered unevenly and complete mowing is not achieved. Through the supplementary mowing path planning provided here, it also addresses the problem that forbidden zones around obstacles on the lawn would otherwise have to be delimited with additional boundary wires, which increases the complexity of the map and the amount and cost of wire burying, and whose added energized wire paths may cut blank regions out of the map so that the grass in those regions is ignored by the mowing robot.
In order to achieve the above purpose, the present invention provides the following technical solutions:
the self-learning obstacle avoidance control method of the mowing robot comprises the following steps:
S1, global path planning: a preset processor and positioning module acquire the position data of the current lawn, a preset mobile terminal establishes a communication connection with the processor through a local area network communication module, an online map is downloaded from a wide area network according to the position data acquired by the processor, an operator transmits the boundary information of the mowing area to the processor through the mobile terminal, and an optimal mowing path is planned through an algorithm;
S2, after the mowing robot receives the instruction from the mobile terminal, the mowing task is performed along the optimal mowing path, the shape features of obstacles are judged according to the motion parameters of the mowing robot during task execution, and the mowing task is performed along the edge of each obstacle;
S3, real-time mowing path updating: the positions of obstacles encountered in the mowing task are marked and the mowing path is updated in real time;
S4, supplementary mowing path planning: an area left unmowed during the mowing task is set as a target area and its position is recorded; while performing the mowing task, the mowing robot judges the distance between the target area position and the updated path and further optimizes the mowing path.
Preferably, acquiring the mowing area boundary information in step S1 includes the following steps:
S11, presetting an unmanned aerial vehicle and traversing the lawn boundary with the down-looking binocular passive vision system carried by the unmanned aerial vehicle to generate a plot boundary coordinate graph;
S12, identifying obstacles within the plot boundary based on the plot boundary coordinate graph, and correcting the plot boundary graph;
S13, segmenting the corrected plot boundary with visual recognition and segmentation techniques, determining the drivable area and the obstacle boundary information, and planning the optimal path information for the drivable area.
Preferably, the algorithm in step S1 is a genetic algorithm.
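The description names a genetic algorithm but does not fix its encoding, so the following is only a minimal sketch, assuming the planned mowing path is reduced to an ordering of waypoint coordinates whose total travel distance the genetic algorithm minimizes; the function and parameter names (plan_mowing_order, pop_size, mutation) are hypothetical.

# Minimal sketch of a genetic algorithm that orders mowing waypoints so the
# total travel distance is short. Illustrative only; the real patent method
# is not specified at this level of detail.
import math
import random

def path_length(order, points):
    """Total Euclidean length of visiting the points in the given order."""
    return sum(math.dist(points[a], points[b]) for a, b in zip(order, order[1:]))

def crossover(p1, p2):
    """Ordered crossover: copy a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    rest = [g for g in p2 if g not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def plan_mowing_order(points, generations=200, pop_size=60, mutation=0.1):
    pop = [random.sample(range(len(points)), len(points)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda o: path_length(o, points))
        survivors = pop[: pop_size // 2]                 # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            child = crossover(*random.sample(survivors, 2))
            if random.random() < mutation:               # swap mutation
                i, j = random.sample(range(len(child)), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda o: path_length(o, points))
    return [points[i] for i in best]

if __name__ == "__main__":
    waypoints = [(random.uniform(0, 30), random.uniform(0, 20)) for _ in range(12)]
    print(plan_mowing_order(waypoints))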
Preferably, judging the shape features of the obstacle in step S2 includes the following steps (a code sketch follows these steps):
S21, a preset CCD camera acquires omnidirectional image features of the obstacle, a preset processing module divides the obstacle image equally into n regions, the number of color-detected pixels in each small region i is recorded as C_i, the number of edge pixels obtained after edge detection is recorded as E_i, and the processing module combines the color pixels and edge pixels into a feature set;
S22, a further judgment is made from the ratio of the edge pixel count to the color-detected pixel count; a region is judged to be an obstacle if the ratio lies within the set threshold range, that is
α ≤ E_i / C_i ≤ β, with C_i ≥ T,
wherein α and β are the lower and upper limits of the ratio and T is the region color-count threshold.
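A minimal sketch of the S21/S22 test under the notation above, assuming OpenCV for color masking and Canny edge detection and an n x n grid split; the color range, Canny thresholds and the values of α, β and T are placeholders, not the patent's parameters.

# Illustrative region test: edge/color pixel ratio within [alpha, beta] and
# at least T color-detected pixels. Grid size and thresholds are assumptions.
import cv2
import numpy as np

def region_is_obstacle(region_bgr, lower, upper, alpha, beta, T):
    """Return True if the edge/color pixel ratio of one region lies in [alpha, beta]."""
    color_mask = cv2.inRange(region_bgr, lower, upper)          # color-detected pixels (C_i)
    C_i = int(np.count_nonzero(color_mask))
    edges = cv2.Canny(cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY), 100, 200)
    E_i = int(np.count_nonzero(edges))                          # edge pixels (E_i)
    if C_i < T:                                                 # too few color pixels in the region
        return False
    return alpha <= E_i / C_i <= beta

def detect_obstacle_regions(image_bgr, n=4, **kw):
    """Split the image into an n x n grid and test each cell."""
    h, w = image_bgr.shape[:2]
    hits = []
    for r in range(n):
        for c in range(n):
            cell = image_bgr[r * h // n:(r + 1) * h // n, c * w // n:(c + 1) * w // n]
            if region_is_obstacle(cell, **kw):
                hits.append((r, c))
    return hits

if __name__ == "__main__":
    img = np.zeros((200, 200, 3), dtype=np.uint8)               # placeholder image for a quick run
    print(detect_obstacle_regions(img, n=4, lower=(0, 0, 0), upper=(80, 80, 80),
                                  alpha=0.01, beta=0.6, T=50))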
Preferably, the mowing task along the edge of the obstacle in the step S2 includes the following steps:
S201, detecting obstacles in a lawn through a laser radar and a CCD camera preset in the mowing robot, calling corresponding mowing strategies preset in a mowing robot database through a preset control unit, and converting the mowing strategies into instructions through a preset transmission unit to be transmitted to a receiving unit of the mowing robot;
S202, the mowing strategy controls the robot to detour around the obstacle, avoiding it while mowing.
Preferably, the mowing strategy in step S201 photographs and records the obstacle in real time through a laser radar and a CCD camera of the mowing robot, reconstructs a model of the obstacle through a preset measurement method, and calculates an optimal mowing path of the mowing robot through a genetic algorithm.
Preferably, marking the positions of the obstacles encountered in the mowing task in step S3 includes the following steps (a minimal sketch follows these steps):
S2021, a path diagram is generated from the trajectory of the mowing robot around the obstacle, a corresponding obstacle tag is generated, and the tag is stored in a preset memory;
S2022, the control unit retrieves the obstacle tag from the memory, builds the connection between the obstacle tag and the mowing strategy according to the tag attributes, and establishes the association relation.
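A minimal sketch of the tag storage and tag-to-strategy association described in S2021/S2022, assuming a tag is keyed by an integer id and its shape attribute selects the strategy; the class and strategy names are hypothetical.

# Sketch: store obstacle tags and associate each tag attribute with a mowing strategy.
from dataclasses import dataclass, field

@dataclass
class ObstacleTag:
    tag_id: int
    shape: str                                     # e.g. "round", "elongated"
    outline: list = field(default_factory=list)    # recorded detour trajectory points

class StrategyMemory:
    def __init__(self):
        self._tags = {}         # tag_id -> ObstacleTag
        self._strategy = {}     # tag attribute (shape) -> strategy name

    def store_tag(self, tag: ObstacleTag):
        self._tags[tag.tag_id] = tag

    def bind_strategy(self, shape: str, strategy: str):
        self._strategy[shape] = strategy

    def strategy_for(self, tag_id: int) -> str:
        tag = self._tags[tag_id]
        return self._strategy.get(tag.shape, "default_edge_following")

memory = StrategyMemory()
memory.store_tag(ObstacleTag(1, "round", [(2.0, 3.0), (2.4, 3.1)]))
memory.bind_strategy("round", "circle_contour_mow")
print(memory.strategy_for(1))   # -> circle_contour_mow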
Preferably, the target area setting in step S4 includes the following steps (see the sketch after these steps):
S41, while the mowing robot performs the mowing task along the mowing path, the CCD camera scans the lawn condition around the path in real time, unmowed areas are judged according to the grass height, the coordinates and shape of each such area are recorded, and a main area node is generated;
S42, the search radiates outward from the main area node, unmowed patches adjacent to the main area node are scanned, and their coordinates and shapes are recorded to generate sub-area nodes;
S43, relations between the main area node and the sub-area nodes are established, a target area grid graph is generated according to the distribution of the main and sub-area nodes, and the optimal supplementary mowing path connecting the main area node and the sub-area nodes is calculated through a genetic algorithm.
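The sketch below illustrates one way to collect the main area node and its sub-area nodes and to order a supplementary pass over them. The patent specifies a genetic algorithm for this ordering (as sketched earlier for step S1); a greedy nearest-neighbor order is used here only to keep the example short, and all names are hypothetical.

# Sketch: main/sub area nodes for unmowed patches and a simple visiting order.
import math

class AreaNode:
    def __init__(self, centre, shape_points, is_main=False):
        self.centre = centre                # (x, y) centre of the unmowed patch
        self.shape_points = shape_points    # recorded outline of the patch
        self.is_main = is_main
        self.children = []                  # sub-area nodes radiating outward

def build_target_graph(main_patch, neighbour_patches):
    main = AreaNode(*main_patch, is_main=True)
    for centre, outline in neighbour_patches:
        main.children.append(AreaNode(centre, outline))
    return main

def supplementary_order(main: AreaNode):
    """Greedy order: start at the main node, always visit the nearest unvisited sub-node."""
    order, pos = [main.centre], main.centre
    remaining = list(main.children)
    while remaining:
        nxt = min(remaining, key=lambda n: math.dist(pos, n.centre))
        remaining.remove(nxt)
        order.append(nxt.centre)
        pos = nxt.centre
    return order

main = build_target_graph(((5.0, 5.0), []), [((7.0, 5.5), []), ((4.0, 8.0), [])])
print(supplementary_order(main))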
Preferably, judging the distance between the target area position and the updated path in step S4 includes the following steps (a sketch of the area expansion follows these steps):
S401, a first target area to be processed near the mowing path of the mowing robot is acquired: the mowing robot is controlled to move along the mowing path according to a movement instruction set by the user, the positioning information of the mowing robot during the movement is acquired in real time, and the first target area to be processed is determined from that positioning information;
S402, the first target area to be processed is expanded outward by a preset distance to form a second target area to be processed, and the mowing robot is controlled to move along the second target area to be processed;
S403, environmental information of the mower within the second target area to be processed is acquired by the sensors of the mowing robot, the boundary of the non-working area in the environmental information is determined, the mower is controlled to move along that boundary so as to obtain a third target area to be processed while avoiding the non-working area, and the boundary of the working area of the mowing robot is constructed from the second target area to be processed, the third target area to be processed and subsequent target areas to be processed.
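A minimal sketch of step S402's outward expansion follows, treating the first target area as an axis-aligned bounding box grown by a preset margin on every side; the Area type and the 1 m margin are assumptions introduced only for illustration.

# Sketch: expand a target area outward by a preset distance (S402).
from dataclasses import dataclass

@dataclass
class Area:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def expand(area: Area, margin: float) -> Area:
    """Grow the area by `margin` metres on every side."""
    return Area(area.x_min - margin, area.y_min - margin,
                area.x_max + margin, area.y_max + margin)

first_target = Area(2.0, 3.0, 6.0, 5.0)
second_target = expand(first_target, margin=1.0)   # preset distance of 1 m (assumed)
print(second_target)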
Compared with the prior art, the invention has the following beneficial effects:
1. Through real-time path planning, adaptive obstacle avoidance and real-time mowing path updating, the invention first plans the optimal mowing path over the lawn, then re-plans the mowing path according to the shape features of obstacles and formulates a mowing strategy that controls the robot to avoid the obstacles while mowing; the connection between obstacle tags and mowing strategies makes it easy to adjust the corresponding strategy, which saves adjustment time, reduces the mowing repetition rate, covers the mowing area uniformly and improves the mowing effect.
2. Through the supplementary mowing path planning, areas left unmowed during the mowing task are set as target areas and their positions are recorded; while performing the mowing task along the mowing path, the mowing robot judges the distance between each target area position and the updated path and further optimizes the path, so that target areas near obstacles are mowed in a supplementary pass without burying wires to partition the area, further improving mowing efficiency and effect.
Drawings
FIG. 1 is a flow chart of a self-learning obstacle avoidance control method of a mowing robot of the invention;
FIG. 2 is a flow chart of obtaining boundary information of mowing area according to the present invention;
FIG. 3 is a flow chart of the obstacle position marking encountered in the mowing task of the present invention;
FIG. 4 is a flow chart of the target area setting of the present invention;
FIG. 5 is a flowchart for determining the distance between the target area and the updated path according to the present invention.
Detailed Description
The invention is further described below in conjunction with specific embodiments, so that the technical means, creative features, objectives and effects of the invention are easy to understand.
Embodiment: as shown in FIGS. 1 to 5, the self-learning obstacle avoidance control method of the mowing robot comprises the following steps:
S1, global path planning: a preset processor and positioning module acquire the position data of the current lawn, a preset mobile terminal establishes a communication connection with the processor through a local area network communication module, an online map is downloaded from a wide area network according to the position data acquired by the processor, an operator transmits the boundary information of the mowing area to the processor through the mobile terminal, and an optimal mowing path is planned through an algorithm;
S2, after the mowing robot receives the instruction from the mobile terminal, the mowing task is performed along the optimal mowing path, the shape features of obstacles are judged according to the motion parameters of the mowing robot during task execution, and the mowing task is performed along the edge of each obstacle;
S3, real-time mowing path updating: the positions of obstacles encountered in the mowing task are marked and the mowing path is updated in real time;
S4, supplementary mowing path planning: an area left unmowed during the mowing task is set as a target area and its position is recorded; while performing the mowing task along the mowing path, the mowing robot judges the distance between the target area position and the updated path and further optimizes the mowing path.
The mowing area boundary information acquisition in the step S1 includes the following steps:
S11, presetting an unmanned aerial vehicle and traversing the lawn boundary with the down-looking binocular passive vision system carried by the unmanned aerial vehicle to generate a plot boundary coordinate graph;
S12, identifying obstacles within the plot boundary based on the plot boundary coordinate graph, and correcting the plot boundary graph;
S13, segmenting the corrected plot boundary with visual recognition and segmentation techniques, determining the drivable area and the obstacle boundary information, and planning the optimal path information for the drivable area.
The obstacle information comprises the number, category and GPS coordinates of the obstacles;
Traversing the plot boundary with the down-looking binocular passive vision system carried by the unmanned aerial vehicle to generate a plot boundary coordinate graph specifically comprises the following steps (a sketch of the data structure follows):
(1) Selecting a plot boundary, acquiring rough GPS coordinates of the plot boundary and the connection relations between the coordinate points, storing them as a doubly linked list data structure, and generating a rough plot boundary coordinate graph; the rough plot boundary coordinate graph comprises the rough GPS coordinates, the connection relations between coordinate points and the doubly linked list data structure;
(2) Inputting the plot boundary coordinate graph into the unmanned aerial vehicle, starting from the head node and taking the direction of the next node as the basis of the plot boundary route, and establishing a flight task for traversing the plot boundary;
(3) During the flight task, continuously acquiring ground images with the down-looking binocular passive vision system on board the unmanned aerial vehicle and determining the visual boundary of the plot by means of a deep learning network; the visual boundary of the plot comprises ridges, ravines, paved roadsides and water bodies;
(4) Combining the boundary points of the visual boundary of the plot with the relative position of the unmanned aerial vehicle, and assigning GPS coordinates to those boundary points based on the GPS coordinates of the unmanned aerial vehicle;
(5) Updating the doubly linked list data structure with the new visual boundary points, their GPS coordinates and the connection relations between coordinate points, and generating an accurate coordinate graph of the plot boundary.
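A minimal sketch of the doubly linked list described in steps (1) to (5) follows, assuming each node stores one rough GPS coordinate that can later be refined or supplemented from the UAV's visual boundary detection; the class and method names (BoundaryList, refine, insert_after) are hypothetical.

# Sketch: doubly linked list of plot boundary coordinates with refinement helpers.
class BoundaryNode:
    def __init__(self, lat, lon):
        self.lat, self.lon = lat, lon
        self.prev = None
        self.next = None

class BoundaryList:
    def __init__(self, coords):
        self.head = None
        prev = None
        for lat, lon in coords:
            node = BoundaryNode(lat, lon)
            if prev is None:
                self.head = node
            else:
                prev.next, node.prev = node, prev
            prev = node

    def refine(self, node: BoundaryNode, lat, lon):
        """Overwrite a rough coordinate with the visually refined GPS fix."""
        node.lat, node.lon = lat, lon

    def insert_after(self, node: BoundaryNode, lat, lon):
        """Insert a newly detected boundary point between node and node.next."""
        new = BoundaryNode(lat, lon)
        new.prev, new.next = node, node.next
        if node.next:
            node.next.prev = new
        node.next = new

rough = BoundaryList([(30.001, 120.001), (30.002, 120.003), (30.000, 120.004)])
rough.refine(rough.head, 30.0012, 120.0009)          # corrected by the vision system
rough.insert_after(rough.head, 30.0015, 120.0018)    # new boundary point found in flight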
Wherein, the algorithm in the step S1 is a genetic algorithm.
Wherein, judging the shape features of the obstacle in step S2 includes the following steps:
S21, a preset CCD camera acquires omnidirectional image features of the obstacle, a preset processing module divides the obstacle image equally into n regions, the number of color-detected pixels in each small region i is recorded as C_i, the number of edge pixels obtained after edge detection is recorded as E_i, and the processing module combines the color pixels and edge pixels into a feature set;
S22, a further judgment is made from the ratio of the edge pixel count to the color-detected pixel count; a region is judged to be an obstacle if the ratio lies within the set threshold range, that is α ≤ E_i / C_i ≤ β, with C_i ≥ T, wherein α and β are the lower and upper limits of the ratio and T is the region color-count threshold.
The obstacle image is segmented with the HSI color model and its edges are detected with the Canny edge detection operator, which achieves fast and efficient recognition of the obstacle edges; to reduce computation time, the pictures acquired by the CCD camera are compressed with a Gaussian algorithm (an image pipeline sketch follows).
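A minimal OpenCV sketch of such a pipeline follows. OpenCV exposes HSV rather than HSI, so HSV is used as a stand-in here, and the green-grass color range, the Canny thresholds and the use of Gaussian blur plus pyrDown for "compression" are assumptions, not the patent's actual parameters.

# Sketch: color-space segmentation, Canny edge detection and Gaussian down-sampling.
import cv2
import numpy as np

def preprocess(frame_bgr):
    small = cv2.pyrDown(cv2.GaussianBlur(frame_bgr, (5, 5), 0))   # Gaussian compression
    hsv = cv2.cvtColor(small, cv2.COLOR_BGR2HSV)
    non_grass = cv2.bitwise_not(cv2.inRange(hsv, (35, 40, 40), (85, 255, 255)))  # mask out green grass
    edges = cv2.Canny(cv2.cvtColor(small, cv2.COLOR_BGR2GRAY), 100, 200)
    return non_grass, edges

if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)   # placeholder frame for a quick run
    mask, edges = preprocess(frame)
    print(mask.shape, edges.shape)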
The mowing task along the edge of the obstacle in the step S2 comprises the following steps:
S201, detecting obstacles in a lawn through a laser radar and a CCD camera preset in the mowing robot, calling corresponding mowing strategies preset in a mowing robot database through a preset control unit, and converting the mowing strategies into instructions through a preset transmission unit to be transmitted to a receiving unit of the mowing robot;
S202, the mowing strategy controls the robot to detour around the obstacle, avoiding it while mowing.
The mowing strategy in step S201 shoots and records the obstacle in real time through a laser radar and a CCD camera of the mowing robot, reconstructs a model of the obstacle through a preset measurement method, and calculates an optimal mowing path of the mowing robot through a genetic algorithm.
Wherein, marking the obstacle position encountered in the mowing task in step S3 includes the following steps:
S2021, a path diagram is generated from the trajectory of the mowing robot around the obstacle, a corresponding obstacle tag is generated, and the tag is stored in a preset memory;
S2022, the control unit retrieves the obstacle tag from the memory, builds the connection between the obstacle tag and the mowing strategy according to the tag attributes, and establishes the association relation.
The target area setting in step S4 includes the following steps:
S41, while the mowing robot performs the mowing task along the mowing path, the CCD camera scans the lawn condition around the path in real time, unmowed areas are judged according to the grass height, the coordinates and shape of each such area are recorded, and a main area node is generated;
S42, the search radiates outward from the main area node, unmowed patches adjacent to the main area node are scanned, and their coordinates and shapes are recorded to generate sub-area nodes;
S43, relations between the main area node and the sub-area nodes are established, a target area grid graph is generated according to the distribution of the main and sub-area nodes, and the optimal supplementary mowing path connecting the main area node and the sub-area nodes is calculated through a genetic algorithm.
The mowing planning device can establish a mowing path for a sub-area according to the starting and ending coordinates of that sub-area, so that the path completely covers the unmowed area while remaining the shortest path within the sub-area; the mowing robot is then controlled to mow the target lawn in the sub-area along this path, which ensures the mowing effect within the sub-area and improves mowing efficiency (a coverage-path sketch follows).
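As a concrete illustration of a sub-area path that covers the unmowed region from its start and end coordinates, the sketch below generates a simple boustrophedon (back-and-forth) sweep over a rectangular sub-area; the rectangle assumption and the cut_width parameter are illustrative, not taken from the patent.

# Sketch: boustrophedon coverage path over a rectangular sub-area.
def coverage_path(start, end, cut_width):
    """Yield waypoints sweeping the rectangle [start, end] in stripes of cut_width."""
    x0, y0 = start
    x1, y1 = end
    xs = sorted((x0, x1))
    y = min(y0, y1)
    y_top = max(y0, y1)
    left_to_right = True
    path = []
    while y <= y_top:
        row = [(xs[0], y), (xs[1], y)] if left_to_right else [(xs[1], y), (xs[0], y)]
        path.extend(row)
        left_to_right = not left_to_right
        y += cut_width
    return path

print(coverage_path(start=(0.0, 0.0), end=(4.0, 2.0), cut_width=0.5))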
The step S4 of determining the distance between the target area position and the updated path includes the following steps:
S401, a first target area to be processed near the mowing path of the mowing robot is acquired: the mowing robot is controlled to move along the mowing path according to a movement instruction set by the user, the positioning information of the mowing robot during the movement is acquired in real time, and the first target area to be processed is determined from that positioning information;
S402, the first target area to be processed is expanded outward by a preset distance to form a second target area to be processed, and the mowing robot is controlled to move along the second target area to be processed;
S403, environmental information of the mower within the second target area to be processed is acquired by the sensors of the mowing robot, the boundary of the non-working area in the environmental information is determined, the mower is controlled to move along that boundary so as to obtain a third target area to be processed while avoiding the non-working area, and the boundary of the working area of the mowing robot is constructed from the second target area to be processed, the third target area to be processed and subsequent target areas to be processed.

Claims (9)

1. The self-learning obstacle avoidance control method of the mowing robot is characterized by comprising the following steps of:
S1, global path planning: a preset processor and positioning module acquire the position data of the current lawn, a preset mobile terminal establishes a communication connection with the processor through a local area network communication module, an online map is downloaded from a wide area network according to the position data acquired by the processor, an operator transmits the boundary information of the mowing area to the processor through the mobile terminal, and an optimal mowing path is planned through an algorithm;
S2, after the mowing robot receives the instruction from the mobile terminal, the mowing task is performed along the optimal mowing path, the shape features of obstacles are judged according to the motion parameters of the mowing robot during task execution, and the mowing task is performed along the edge of each obstacle;
S3, real-time mowing path updating: the positions of obstacles encountered in the mowing task are marked and the mowing path is updated in real time;
S4, supplementary mowing path planning: an area left unmowed during the mowing task is set as a target area and its position is recorded; while performing the mowing task, the mowing robot judges the distance between the target area position and the updated path and further optimizes the mowing path.
2. The self-learning obstacle avoidance control method of a lawnmower robot of claim 1, wherein: the mowing area boundary information acquisition in the step S1 includes the following steps:
S11, presetting an unmanned aerial vehicle and traversing the lawn boundary with the down-looking binocular passive vision system carried by the unmanned aerial vehicle to generate a plot boundary coordinate graph;
S12, identifying obstacles within the plot boundary based on the plot boundary coordinate graph, and correcting the plot boundary graph;
S13, segmenting the corrected plot boundary with visual recognition and segmentation techniques, determining the drivable area and the obstacle boundary information, and planning the optimal path information for the drivable area.
3. The self-learning obstacle avoidance control method of a lawnmower robot of claim 2, wherein: the algorithm in the step S1 is a genetic algorithm.
4. The self-learning obstacle avoidance control method of a lawnmower robot of claim 3, wherein: the step S2 of determining the shape feature of the obstacle includes the steps of:
S21, a preset CCD camera acquires omnidirectional image features of the obstacle, a preset processing module divides the obstacle image equally into n regions, the number of color-detected pixels in each small region i is recorded as C_i, the number of edge pixels obtained after edge detection is recorded as E_i, and the processing module combines the color pixels and edge pixels into a feature set;
S22, a further judgment is made from the ratio of the edge pixel count to the color-detected pixel count; a region is judged to be an obstacle if the ratio lies within the set threshold range, that is α ≤ E_i / C_i ≤ β, with C_i ≥ T, wherein α and β are the lower and upper limits of the ratio and T is the region color-count threshold.
5. The self-learning obstacle avoidance control method of a lawnmower robot of claim 4, wherein: the mowing task along the edge of the obstacle in the step S2 comprises the following steps:
S201, detecting obstacles in a lawn through a laser radar and a CCD camera preset in the mowing robot, calling corresponding mowing strategies preset in a mowing robot database through a preset control unit, and converting the mowing strategies into instructions through a preset transmission unit to be transmitted to a receiving unit of the mowing robot;
S202, the mowing strategy controls the robot to detour around the obstacle, avoiding it while mowing.
6. The self-learning obstacle avoidance control method of a lawnmower robot of claim 5, wherein: the mowing strategy in step S201 shoots and records the obstacle in real time through the laser radar and the CCD camera of the mowing robot, reconstructs the model of the obstacle through a preset measurement method, and calculates the optimal mowing path of the mowing robot through a genetic algorithm.
7. The self-learning obstacle avoidance control method of a lawnmower robot of claim 6, wherein: the marking of the obstacle position encountered in the mowing task in the step S3 includes the following steps:
S2021, a path diagram is generated from the trajectory of the mowing robot around the obstacle, a corresponding obstacle tag is generated, and the tag is stored in a preset memory;
S2022, the control unit retrieves the obstacle tag from the memory, builds the connection between the obstacle tag and the mowing strategy according to the tag attributes, and establishes the association relation.
8. The self-learning obstacle avoidance control method of a lawnmower robot of claim 7, wherein: the target area setting in step S4 includes the steps of:
S41, while the mowing robot performs the mowing task along the mowing path, the CCD camera scans the lawn condition around the path in real time, unmowed areas are judged according to the grass height, the coordinates and shape of each such area are recorded, and a main area node is generated;
S42, the search radiates outward from the main area node, unmowed patches adjacent to the main area node are scanned, and their coordinates and shapes are recorded to generate sub-area nodes;
S43, relations between the main area node and the sub-area nodes are established, a target area grid graph is generated according to the distribution of the main and sub-area nodes, and the optimal supplementary mowing path connecting the main area node and the sub-area nodes is calculated through a genetic algorithm.
9. The self-learning obstacle avoidance control method of a lawnmower robot of claim 8, wherein: the step S4 of determining the distance between the target area position and the updated path includes the following steps:
S401, a first target area to be processed near the mowing path of the mowing robot is acquired: the mowing robot is controlled to move along the mowing path according to a movement instruction set by the user, the positioning information of the mowing robot during the movement is acquired in real time, and the first target area to be processed is determined from that positioning information;
S402, the first target area to be processed is expanded outward by a preset distance to form a second target area to be processed, and the mowing robot is controlled to move along the second target area to be processed;
S403, environmental information of the mower within the second target area to be processed is acquired by the sensors of the mowing robot, the boundary of the non-working area in the environmental information is determined, the mower is controlled to move along that boundary so as to obtain a third target area to be processed while avoiding the non-working area, and the boundary of the working area of the mowing robot is constructed from the second target area to be processed, the third target area to be processed and subsequent target areas to be processed.
CN202410926670.XA 2024-07-11 2024-07-11 Self-learning obstacle avoidance control method of mowing robot Active CN118444687B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410926670.XA CN118444687B (en) 2024-07-11 2024-07-11 Self-learning obstacle avoidance control method of mowing robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410926670.XA CN118444687B (en) 2024-07-11 2024-07-11 Self-learning obstacle avoidance control method of mowing robot

Publications (2)

Publication Number Publication Date
CN118444687A 2024-08-06
CN118444687B 2024-10-25

Family

ID=92309297

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410926670.XA Active CN118444687B (en) 2024-07-11 2024-07-11 Self-learning obstacle avoidance control method of mowing robot

Country Status (1)

Country Link
CN (1) CN118444687B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200170181A1 (en) * 2019-03-01 2020-06-04 Chongqing Rato Intelligent Equipment Co., LTD. Method and system for path planning after removing or adding obstacle from/to lawn to be mowed
WO2020147159A1 (en) * 2019-01-14 2020-07-23 傲基科技股份有限公司 Mowing robot and mowing region division method thereof, control device, and storage medium
CN112612280A (en) * 2020-12-24 2021-04-06 格力博(江苏)股份有限公司 Mower and control method thereof
CN115454054A (en) * 2022-08-22 2022-12-09 深圳拓邦股份有限公司 Mowing control method and system of mower and readable storage medium
CN117148845A (en) * 2023-09-25 2023-12-01 优思美地(上海)机器人科技有限公司 Mowing planning method of mowing robot, electronic equipment and storage medium
CN117274519A (en) * 2023-10-16 2023-12-22 奥比中光科技集团股份有限公司 Map construction method and device and mowing robot
WO2024001880A1 (en) * 2022-06-29 2024-01-04 松灵机器人(深圳)有限公司 Intelligent obstacle avoidance method and device, mowing robot, and storage medium
WO2024012192A1 (en) * 2022-07-11 2024-01-18 松灵机器人(深圳)有限公司 Intelligent obstacle avoidance method, and mowing robot and storage medium
CN118068840A (en) * 2024-03-27 2024-05-24 江苏汉地曼智能科技有限公司 Mowing robot path planning method, mowing robot path planning equipment, mowing robot path planning medium and mowing robot path planning product


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
宝秋利; 梁显丽: "Simulation of trajectory planning for a garden mowing robot based on big data", Computer Simulation, no. 08, 15 August 2020 (2020-08-15) *
张东涛; 周莹莹; 吴东林: "Research on the motion control system of an intelligent mowing robot based on the Internet of Things", Journal of Agricultural Mechanization Research, no. 10, 31 March 2020 (2020-03-31) *
訾涛: "Research on path planning for mowing robots based on machine vision and the artificial potential field algorithm", Journal of Agricultural Mechanization Research, no. 07, 28 October 2020 (2020-10-28) *

Also Published As

Publication number Publication date
CN118444687B (en) 2024-10-25

Similar Documents

Publication Publication Date Title
US11789459B2 (en) Vehicle controllers for agricultural and industrial applications
US9274524B2 (en) Method for machine coordination which maintains line-of-site contact
US20210000006A1 (en) Agricultural Lane Following
US9603300B2 (en) Autonomous gardening vehicle with camera
CN106325271A (en) Intelligent mowing device and intelligent mowing device positioning method
Moorehead et al. Automating orchards: A system of autonomous tractors for orchard maintenance
EP2354878B1 (en) Method for regenerating a boundary containing a mobile robot
US8195358B2 (en) Multi-vehicle high integrity perception
US8527197B2 (en) Control device for one or more self-propelled mobile apparatus
US11882787B1 (en) Automatic sensitivity adjustment for an autonomous mower
KR102683902B1 (en) Grain handling autopilot system, autopilot method and automatic identification method
CN114937258A (en) Control method for mowing robot, and computer storage medium
EP3695701B1 (en) Robotic vehicle for boundaries determination
CN118170145B (en) Self-adaptive harvesting method and device for unmanned harvester based on vision
CN117148845A (en) Mowing planning method of mowing robot, electronic equipment and storage medium
CN118444687B (en) Self-learning obstacle avoidance control method of mowing robot
US20230320262A1 (en) Computer vision and deep learning robotic lawn edger and mower
CN112180945B (en) Method for automatically generating obstacle boundary and automatic walking equipment
CN111744690B (en) Spraying operation control method, device, carrier and storage medium
CN113268058A (en) Collision avoidance method and collision avoidance device for unmanned vehicle, and unmanned vehicle
CN211698708U (en) Automatic working system
CN115097833A (en) Automatic obstacle avoidance method and system for pesticide application robot and storage medium
CN116430838A (en) Self-mobile device and control method thereof
CN113821021B (en) Automatic walking equipment region boundary generation method and system
CN113359700A (en) Intelligent operation system of unmanned tractor based on 5G

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant