CN111399502A - Mobile robot and drawing establishing method and device thereof - Google Patents
- Publication number
- CN111399502A CN111399502A CN202010157240.8A CN202010157240A CN111399502A CN 111399502 A CN111399502 A CN 111399502A CN 202010157240 A CN202010157240 A CN 202010157240A CN 111399502 A CN111399502 A CN 111399502A
- Authority
- CN
- China
- Prior art keywords
- boundary
- mobile robot
- environment image
- texture features
- working area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0242—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0253—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0278—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
Abstract
The invention provides a mobile robot and a mapping method and device thereof. The method comprises the following steps: identifying a mapping starting point, wherein the mapping starting point is on the boundary of the working area; identifying a natural boundary within the working area; identifying visual markers within the working area and determining a virtual boundary from the visual markers; controlling the mobile robot to start from the mapping starting point, walk along the natural boundary and the virtual boundary, and recording the walking track of the mobile robot through a positioning device; and when the mobile robot returns to the mapping starting point, taking the walking track of the mobile robot as the map information of the working area. A natural boundary is identified through a vision sensor, a virtual boundary is identified through visual markers, and a boundary map of the working area is established. No boundary wire needs to be pre-buried, so manual wiring and boundary-wire maintenance are eliminated and the use cost of the mobile robot is reduced.
Description
Technical Field
The invention belongs to the technical field of mobile robots, and particularly relates to a mobile robot and a map building method and device thereof.
Background
With the development of robot technology, robots are being applied ever more widely. Nowadays, more and more owners of large lawns adopt mowing robots to trim their lawns automatically, reducing the workload of maintenance personnel.
For a mowing robot to trim a lawn automatically, a working-boundary map must be established before the mower operates, so that the boundary of the mowing area is determined; during mowing, the robot is kept inside the mowing area based on the positional relation between the robot and the mowing boundary recorded in the map. Establishing the working-boundary map is especially important when the mower trims a local patch within a large lawn, where the mowing area is not clearly demarcated from the surrounding non-mowing area.
At present, the boundary of a mowing area is typically provided to the mowing robot by pre-burying a boundary line: a sensor matched to the buried line is installed on the robot, the line is detected by the sensor, and the working-boundary map is established from it.
However, a pre-buried boundary line not only requires manual wiring but also needs maintenance work such as continuous energization and breakage prevention, which increases the economic and labor cost of using the mowing robot.
Disclosure of Invention
The embodiments of the invention provide a mapping method for a mobile robot, aiming to reduce the economic and labor cost of using a mowing robot.
The embodiment of the invention is realized by a mapping method of a mobile robot, wherein the mobile robot is provided with a vision sensor, and the method comprises the following steps:
identifying a mapping starting point, wherein the mapping starting point is on the boundary of the working area;
identifying a natural boundary within the work area;
identifying visual markers within the working area and determining a virtual boundary from the visual markers;
controlling the mobile robot to start from the mapping starting point, walk along the natural boundary and the virtual boundary, and recording the walking track of the mobile robot through a positioning device;
and when the mobile robot returns to the mapping starting point, taking the walking track of the mobile robot as the map information of the working area.
Further, the step of identifying a natural boundary within the work area comprises:
acquiring an environment image in a working area acquired by the vision sensor in real time;
acquiring all colors contained in the environment image;
judging whether the environment image simultaneously contains a pre-stored color and colors other than the pre-stored color;
and if the environment image simultaneously contains a pre-stored color and colors other than the pre-stored color, determining the position corresponding to the environment image as the natural boundary.
Further, after the step of determining whether the environment image simultaneously includes a pre-stored color and a color other than the pre-stored color, the method further includes:
if the environment image does not simultaneously contain a pre-stored color and colors other than the pre-stored color, obtaining all texture features contained in the environment image;
judging whether the texture features simultaneously contain pre-stored texture features and texture features except the pre-stored texture features;
and if the texture features simultaneously comprise pre-stored texture features and texture features except the pre-stored texture features, determining the position corresponding to the environment image as the natural boundary.
Further, the step of acquiring all colors included in the environment image includes:
preprocessing the environment image;
and acquiring all colors contained in the environment image according to the preprocessed environment image.
Further, the step of identifying a natural boundary within the work area comprises:
acquiring an environment image in a working area acquired by the vision sensor in real time;
acquiring all texture features contained in the environment image;
judging whether the texture features simultaneously comprise pre-stored texture features and texture features except the pre-stored texture features;
and if so, determining the position corresponding to the environment image as the natural boundary.
Furthermore, if the texture features simultaneously include pre-stored texture features and texture features other than the pre-stored texture features, then when an environment image of the working area is next acquired through the vision sensor, all texture features contained in that image are obtained and the same judgment is repeated: if they again simultaneously include pre-stored texture features and texture features other than the pre-stored ones, the position corresponding to the environment image is determined to be the natural boundary.
Further, the step of using the walking track of the robot as the map information of the working area comprises:
marking the walking track corresponding to the natural boundary in the walking track as an entity boundary;
marking the walking track corresponding to the virtual boundary in the walking track as a virtual boundary;
and forming the map information of the working area by the walking tracks respectively corresponding to the entity boundary and the virtual boundary.
Further, after the step of using the walking track of the robot as the map information of the working area, the method further includes:
after receiving a working instruction, controlling the mobile robot to move in the working area according to the map information, wherein when the mobile robot moves to a physical boundary, the vision sensor is used to keep the robot within the physical boundary; when the mobile robot moves to the virtual boundary, the positioning information of the mobile robot is acquired in real time to keep the mobile robot within the virtual boundary.
The invention also provides a map building device of the mobile robot, which comprises:
the map building starting point identification unit is used for identifying a map building starting point, wherein the map building starting point is positioned on the boundary of the working area;
a natural boundary identifying unit for identifying a natural boundary within the working area;
the virtual boundary determining unit is used for identifying the visual markers in the working area and determining a virtual boundary according to the visual markers;
the walking track recording unit is used for controlling the mobile robot to start from the drawing starting point, walk along the natural boundary and the virtual boundary, and record the walking track of the mobile robot through a positioning device;
and the map information determining unit is used for taking the walking track of the robot as the map information of the working area when the mobile robot returns to the mapping starting point.
Further, the natural boundary identifying unit includes:
the first environment image acquisition module is used for acquiring an environment image in a working area acquired by the vision sensor in real time;
a color obtaining module for obtaining all colors contained in the environment image;
The judging module is used for judging whether the environment image simultaneously contains a pre-stored color and colors other than the pre-stored color;
and the first determining module is used for determining the position corresponding to the environment image as the natural boundary if the environment image simultaneously contains a pre-stored color and colors other than the pre-stored color.
Further, the color acquisition module comprises:
the preprocessing submodule is used for preprocessing the environment image;
and the color obtaining sub-module is used for obtaining all colors contained in the environment image according to the preprocessed environment image.
Further, the natural boundary identifying unit may further include:
the second environment image acquisition module is used for acquiring an environment image in a working area acquired by the vision sensor in real time;
the texture feature acquisition module is used for acquiring all texture features contained in the environment image;
the second judging module is used for judging whether the texture features simultaneously comprise pre-stored texture features and texture features except the pre-stored texture features;
and the second determining module is used for determining the position corresponding to the environment image as the natural boundary if the texture features simultaneously comprise pre-stored texture features and texture features other than the pre-stored texture features.
Further, the map information determination unit includes:
the entity boundary marking module is used for marking the walking track corresponding to the natural boundary in the walking track as an entity boundary;
the virtual boundary marking module is used for marking the walking track corresponding to the virtual boundary in the walking track as a virtual boundary;
and the map information determining module is used for forming the map information of the working area by the walking tracks respectively corresponding to the entity boundary and the virtual boundary.
Still further, the apparatus further comprises:
the control unit is used for controlling the mobile robot to move in the working area according to the map information after receiving a working instruction, wherein when the mobile robot moves to an entity boundary, the vision sensor is used to keep the mobile robot within the entity boundary; when the mobile robot moves to the virtual boundary, the positioning information of the mobile robot is acquired in real time to keep the mobile robot within the virtual boundary.
The invention also provides a mobile robot, which is provided with a vision sensor and the above mapping device of the mobile robot.
According to the mapping method of the mobile robot, a vision sensor arranged in the mobile robot acquires images of the working environment; from these images the controller identifies the mapping starting point on the boundary of the working area, as well as the natural boundary and the virtual boundary within it. The mobile robot is then controlled to start from the mapping starting point and walk along the detected natural and virtual boundaries, acquiring environment images as it goes to determine the boundary for the next leg of the walk. A positioning device records the walking track; when the mobile robot returns to the mapping starting point, it has traveled a full circuit of the working area, so its walking track can serve as the map information of the working area, completing the boundary map. A natural boundary is identified with the vision sensor, and visual markers placed at the virtual boundary allow the mobile robot to identify the virtual boundary, so that the boundary map of the working area is established from both. The working-boundary map can thus be established without pre-burying a boundary wire, eliminating manual wiring and boundary-line maintenance and reducing the use cost of the mobile robot.
Drawings
Fig. 1 is a schematic flowchart of a mapping method of a mobile robot according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a mapping method of a mobile robot according to a second embodiment of the present invention;
fig. 3 is a schematic flowchart of a mapping method of a mobile robot according to a third embodiment of the present invention;
fig. 4 is a schematic block diagram of a mapping apparatus of a mobile robot according to a fourth embodiment of the present invention;
fig. 5 is a schematic block diagram of a mapping apparatus of a mobile robot according to a fifth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example one
Referring to fig. 1, a flowchart of a mapping method for a mobile robot according to a first embodiment of the present invention is shown, where the mapping method for a mobile robot includes:
and step S10, identifying a mapping starting point, wherein the mapping starting point is on the boundary of the working area.
The mobile robot can be a mowing robot, a sweeper, a floor washing machine, a mobile trolley and other devices with mobile functions. The mobile robot is provided with a visual sensor connected with a controller, environmental image information is collected in real time or at regular time through the visual sensor, and the collected environmental image information is sent to the controller so that the controller can analyze the environmental image information and identify environmental information such as position identification, boundary identification, color information, texture feature information and the like in the environment.
The mapping method of the mobile robot establishes a boundary map of the robot's working area. A mapping starting point must first be defined on the boundary of the working area, so that the mobile robot can walk along the boundary from that starting position and trace out the boundary map. To this end, a marker such as a two-dimensional code, a bar code or a special graphic is placed on the boundary of the working area, or a base station already on the boundary is used directly as the marker. The vision sensor in the mobile robot acquires an environment image containing the marker and sends it to the controller, which identifies the marker's position from the image and uses that position point as the mapping starting point of the mobile robot.
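The marker-recognition step above can be sketched as follows. This is a hypothetical simplification in Python: the camera frame is reduced to a grid of semantic labels, and `find_start_marker` (an assumed name) merely scans for the marker label; a real system would run two-dimensional-code or bar-code detection on the raw image.

```python
def find_start_marker(frame, marker_label="start_marker"):
    """Return the (row, col) of the first cell carrying the marker label,
    or None when the marker is not visible in this frame."""
    for r, row in enumerate(frame):
        for c, label in enumerate(row):
            if label == marker_label:
                return (r, c)
    return None
```

The returned pixel position would then be projected into world coordinates and stored as the mapping starting point.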
Step S20, identifying a natural boundary within the work area.
The vision sensor collects an environment image in a working area in real time and sends the environment image to the controller, and the controller analyzes whether the position corresponding to the environment image is a natural boundary or not according to the environment image.
Specifically, there are two methods for determining a natural boundary:
First, whether the position is a natural boundary is determined from the colors in the environment image. Based on the color characteristics of the mobile robot's working area, the colors of the working area are pre-stored in the program executed by the controller; after an environment image is received, all colors present in it are acquired. When the environment image simultaneously contains the pre-stored colors and colors other than the pre-stored colors, the position corresponding to the image is a natural boundary; when the image contains only pre-stored colors, the corresponding position is inside the boundary; when it contains only colors other than the pre-stored colors, the corresponding position is outside the boundary. The precise location of the natural boundary is the position corresponding to the color dividing line between the pre-stored colors and the other colors in the environment image.
For example, if the mobile robot is a mowing robot and the working area is a lawn, the color (e.g., green) of the lawn is set as a preset color, and if there are colors other than green and green in the captured environment image, the environment image corresponds to a natural boundary whose position is the working area.
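The three-way decision just described, including the lawn example, can be sketched in a few lines (function and color names are illustrative assumptions, not part of the patent):

```python
def classify_position(image_colors, prestored_colors):
    """Classify the position a frame was taken from, per the color rule above:
    'boundary' -> pre-stored and other colors both present (natural boundary)
    'inside'   -> only pre-stored colors present
    'outside'  -> no pre-stored color present at all
    """
    seen = set(image_colors)
    pre = set(prestored_colors)
    has_pre = bool(seen & pre)
    has_other = bool(seen - pre)
    if has_pre and has_other:
        return "boundary"
    return "inside" if has_pre else "outside"
```

For the mowing example, `classify_position(["green", "grey"], ["green"])` yields `"boundary"`.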
It is easy to understand that colors other than the pre-stored colors may appear inside the working area by accident (for example, yellow leaves falling onto the lawn), or a corner of the environment image may capture ground outside the natural boundary even though the position corresponding to the image is not on the boundary; either case can disturb the natural-boundary judgment. Another embodiment therefore provides a scheme in which a color is considered present in the environment image only when the area it occupies exceeds a certain proportion of the total image area; in other words, when the proportion of a color's area falls below that threshold, the color is no longer counted as present. For example, if the environment image actually contains green and yellow but yellow occupies only 0.1% of the total area, the image is judged to contain only green.
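The area-ratio filter can be sketched as follows, assuming the frame has already been reduced to a flat list of per-pixel color labels; the 1% threshold is an illustrative value, not one specified by the patent:

```python
from collections import Counter

def significant_colors(pixels, min_ratio=0.01):
    """Keep only colors whose pixel share is at least min_ratio of the frame,
    so small patches (e.g. a stray fallen leaf) are ignored."""
    counts = Counter(pixels)
    total = len(pixels)
    return {color for color, n in counts.items() if n / total >= min_ratio}
```

The boundary check would then operate on `significant_colors(pixels)` instead of the raw color set.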
Secondly, whether the position corresponding to the environment image is a natural boundary is determined from the texture features contained in the image. Environments such as lawns, soil, walls and paths have distinct texture features, so the texture features of the working-area environment are collected in advance as pre-stored texture features; when both pre-stored texture features and texture features other than the pre-stored ones are found in an environment image, the position corresponding to that image is judged to be a natural boundary.
Further, in another embodiment, in the process of determining whether the environment image corresponds to a natural boundary according to its colors, after the controller acquires the environment image collected by the vision sensor, it first preprocesses the image, then extracts the colors present in the preprocessed image and judges whether the image simultaneously contains a pre-stored color and colors other than the pre-stored color. The preprocessing includes image-processing operations such as dilation and erosion, which remove color noise from the image. Extracting colors after preprocessing avoids false natural-boundary judgments caused by debris such as fallen leaves or winged insects in the working area, which would otherwise increase the number of color types contained in the environment image.
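Dilation and erosion are standard morphological operations; a minimal pure-Python version on a binary mask illustrates how an "opening" (erosion followed by dilation) removes isolated noise pixels such as a stray leaf, while larger regions survive. In practice a library such as OpenCV would be used rather than this sketch.

```python
def erode(mask):
    """3x3 binary erosion: a cell survives only if its whole 3x3
    neighborhood (including itself) is set."""
    h, w = len(mask), len(mask[0])
    return [[int(all(0 <= r + dr < h and 0 <= c + dc < w and mask[r + dr][c + dc]
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)))
             for c in range(w)] for r in range(h)]

def dilate(mask):
    """3x3 binary dilation: a cell is set if any neighbor is set."""
    h, w = len(mask), len(mask[0])
    return [[int(any(0 <= r + dr < h and 0 <= c + dc < w and mask[r + dr][c + dc]
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)))
             for c in range(w)] for r in range(h)]

def open_mask(mask):
    """Opening = erosion then dilation: removes isolated noise pixels."""
    return dilate(erode(mask))
```

A single stray pixel is erased by the opening, whereas a solid 3x3 block is preserved.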
And step S30, recognizing the visual markers in the working area through a visual sensor, and determining a virtual boundary according to the visual markers.
When the working area is a small patch within a large lawn, the junction between that patch and the rest of the lawn has no natural boundary; it forms a virtual boundary with no visible demarcation. Moreover, the color and texture features on the two sides of the virtual boundary do not differ, so the boundary can hardly be determined from the colors and textures contained in the environment image.
In order to map areas without obvious boundaries, visual markers are arranged at the virtual boundary before the robot is controlled to map; when the vision sensor acquires an environment image containing a visual marker, the position corresponding to that image is judged to be the virtual boundary. The visual marker may be a color band (whose color should differ clearly from the colors of objects in the working area), or color marker posts arranged along the virtual boundary at a preset spacing, and the like; no specific limitation is made here.
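How a sparse row of marker posts can define a continuous virtual boundary can be sketched by linear interpolation between consecutive post positions. All names and the 0.5 m spacing are assumptions for illustration; the patent only requires that markers be placed at a preset distance along the virtual boundary.

```python
def virtual_boundary_points(markers, step=0.5):
    """Interpolate boundary points between consecutive marker-post
    positions (x, y), roughly `step` apart, to form a dense boundary."""
    pts = []
    for (x0, y0), (x1, y1) in zip(markers, markers[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        n = max(1, int(dist / step))
        for i in range(n):
            t = i / n
            pts.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    pts.append(markers[-1])  # include the final post itself
    return pts
```

The resulting point list could be stored alongside the recorded track as the virtual part of the boundary map.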
And step S40, controlling the mobile robot to start from the mapping starting point, walk along the natural boundary and the virtual boundary, and recording the walking track of the mobile robot through a positioning device.
And step S50, when the mobile robot returns to the mapping start point, using the walking track of the mobile robot as the map information of the work area.
After the mapping starting point is identified, environment images are acquired in real time, and the natural boundary and virtual boundary within the working area are identified from them. The mobile robot is then controlled to walk along the detected natural and virtual boundaries from the mapping starting point, acquiring environment images while walking to determine the boundary for the next leg of the walk. The positioning device records the walking track; once the mobile robot returns to the mapping starting point, it has traveled a full circuit of the working area, so the walking track can be used as the map information of the working area, completing the boundary map.
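The record-until-loop-closure logic of steps S40 and S50 can be sketched as follows. All names and the closure tolerance are assumptions; in practice the positions would be GPS/odometry fixes streamed from the positioning device.

```python
def record_track(positions, start, close_tol=0.3):
    """Accumulate position fixes until the robot returns to `start`.
    Returns the closed track (the boundary map) once the latest fix is
    within `close_tol` of the start, or None if it never closes."""
    track = []
    for p in positions:
        track.append(p)
        dist = ((p[0] - start[0]) ** 2 + (p[1] - start[1]) ** 2) ** 0.5
        if len(track) > 1 and dist <= close_tol:
            return track  # loop closed: track is the work-area boundary
    return None
```

The `len(track) > 1` guard keeps the very first fix at the start point from being mistaken for a completed circuit.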
In this embodiment, the vision sensor provided in the mobile robot acquires images of the working environment, and the controller identifies the mapping starting point on the working-area boundary, the natural boundary, and the virtual boundary from those images. The mobile robot is then controlled to travel along the detected natural and virtual boundaries from the mapping starting point, acquiring environment images while traveling to determine the boundary for the next leg of the walk. The positioning device records the walking track; after the mobile robot returns to the mapping starting point, it has circled the working area, so the walking track serves as the map information of the working area, completing the boundary map. A natural boundary is identified with the vision sensor, and visual markers placed at the virtual boundary let the mobile robot identify the virtual boundary, so the boundary map of the working area is established from both. No boundary wire needs to be pre-buried, manual wiring and boundary-line maintenance are eliminated, and the use cost of the mobile robot is reduced.
Example two
Referring to fig. 2, a second embodiment of the mapping method of a mobile robot according to the present invention is shown; on the basis of the first embodiment, step S20 includes:
step S21, acquiring an environment image in a working area acquired by the vision sensor in real time;
step S22, acquiring all colors included in the environment image;
step S23, judging whether the environment image contains pre-stored color and color other than the pre-stored color;
step S24, if the environment image includes pre-stored color and color other than the pre-stored color, determining the position corresponding to the environment image as the natural boundary;
step S25, if the environment image does not contain the pre-stored color and the colors other than the pre-stored color at the same time, obtaining all the texture features contained in the environment image;
step S26, judging whether the texture features simultaneously contain pre-stored texture features and texture features other than the pre-stored texture features;
step S27, if the texture features include both pre-stored texture features and texture features other than the pre-stored texture features, determining the position corresponding to the environment image as the natural boundary.
To improve the accuracy of natural-boundary determination, this embodiment combines color and texture features. Specifically, after the vision sensor acquires an environment image of the working area, the colors contained in the image are extracted, and it is first judged whether the image simultaneously contains a pre-stored color and a color other than the pre-stored color. If it does, the position corresponding to the image can be judged directly to be a natural boundary. If it does not, the texture features of the image are extracted, and it is judged whether the image simultaneously contains a pre-stored texture feature and a texture feature other than the pre-stored ones. If so, the position corresponding to the image is judged to be a natural boundary; otherwise, it is judged not to be a natural boundary.
Further, when the color test judges a position not to be a natural boundary but the texture test judges it to be one, this indicates that at this natural boundary the colors inside and outside the boundary are not clearly distinguishable while the texture features differ, i.e. color is a less reliable discriminator here than texture. In that case, subsequent environment images collected by the vision sensor are judged with texture features alone, without extracting their colors again.
In this embodiment, whether the position corresponding to the environment image is a natural boundary is comprehensively determined according to the color and texture features of the environment image acquired by the vision sensor, so that the accuracy of natural boundary determination is improved.
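The two-stage judgment of this embodiment — colors first, texture features as the fallback — can be sketched as follows. The feature representation (plain sets of labels) and all names are illustrative assumptions, not the disclosed implementation.

```python
def is_natural_boundary(image_colors, image_textures,
                        stored_colors, stored_textures):
    """An image corresponds to a natural boundary when it simultaneously
    contains a pre-stored feature (inside the working area) and a feature
    other than the pre-stored ones (outside the working area)."""
    colors = set(image_colors)
    # Stage 1: the color test decides on its own when it is conclusive.
    if colors & set(stored_colors) and colors - set(stored_colors):
        return True
    # Stage 2: otherwise fall back to the texture features.
    textures = set(image_textures)
    return bool(textures & set(stored_textures)
                and textures - set(stored_textures))
```

For example, an image containing both the pre-stored lawn color and an unknown color is judged a natural boundary at stage 1; an image that is uniformly lawn-colored but mixes lawn texture with an unknown texture is caught at stage 2.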
EXAMPLE III
Referring to fig. 3, a third embodiment of the mapping method of a mobile robot according to the present invention is based on the first or second embodiment, where step S50 includes:
step S51, marking the walking track corresponding to the natural boundary in the walking track as an entity boundary;
step S52, marking the walking track corresponding to the virtual boundary in the walking track as a virtual boundary;
and step S53, forming the map information of the working area by the walking tracks respectively corresponding to the entity boundary and the virtual boundary.
When the mobile robot returns to the mapping start point, its walking track is taken as the map information of the working area. The track divides into segments walked along natural boundaries and segments walked along virtual boundaries: the former are marked as entity boundaries and the latter as virtual boundaries, and the marked track forms the boundary diagram (map information) of the working area. Once this marked map information is stored and the robot moves within the working area according to it, the control mode switches by boundary type. At an entity boundary, the vision sensor is used to keep the robot inside the boundary; at a virtual boundary, the vision sensor can no longer detect the boundary (the visual markers placed there are removed after mapping), so the positioning device acquires the robot's position in real time to keep it inside the boundary. In this way, the robot uses the visual positioning of the vision sensor where the boundary is physically recognizable, and other positioning modes at virtual boundaries the vision sensor cannot identify, making up for the limits of visual positioning.
The positioning device may be a GPS receiver, an odometer, a radar positioning device, an infrared positioning device, or a Bluetooth positioning device, and is not limited herein.
In this embodiment, by marking an entity boundary and a virtual boundary in map information (boundary block diagram), a basis is provided for the mobile robot to work by using the map information and freely switch a positioning mode.
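The boundary marking and the positioning-mode switch described in this embodiment can be sketched as follows; the labels and function names are illustrative assumptions.

```python
def label_track(track):
    """Mark each recorded track point: segments walked along a natural
    boundary become entity boundaries, the rest stay virtual boundaries."""
    return [(pose, 'entity' if kind == 'natural' else 'virtual')
            for pose, kind in track]

def positioning_mode(boundary_label):
    """At an entity boundary the vision sensor keeps the robot inside;
    at a virtual boundary (markers removed after mapping) the robot
    falls back to its positioning device (GPS, odometer, ...)."""
    return 'vision' if boundary_label == 'entity' else 'positioning_device'
```

The labeled track is the map information with boundary marks; during later work, the mark at the robot's current boundary selects which localization source constrains it.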
Example four
Referring to fig. 4, a mapping apparatus of a mobile robot according to a fourth embodiment of the present invention includes:
a mapping starting point identification unit 10, configured to identify a mapping starting point, where the mapping starting point is on a boundary of a working area;
a natural boundary identifying unit 20 for identifying a natural boundary within the work area;
a virtual boundary determining unit 30, configured to identify a visual marker in the working area, and determine a virtual boundary according to the visual marker;
a walking track recording unit 40, configured to control the mobile robot to start from the mapping starting point, walk along the natural boundary and the virtual boundary, and record a walking track of the mobile robot through a positioning device;
and the map information determining unit 50 is used for taking the walking track of the mobile robot as the map information of the working area when the mobile robot returns to the mapping starting point.
Further, the natural boundary identifying unit 20 includes:
the first environment image acquisition module is used for acquiring an environment image in a working area acquired by the vision sensor in real time;
a color obtaining module, for obtaining all colors contained in the environment image;
The judging module is used for judging whether the environment image simultaneously contains a pre-stored color and colors other than the pre-stored color;
and the first judging module, for determining the position corresponding to the environment image as the natural boundary if the environment image simultaneously contains a pre-stored color and a color other than the pre-stored color.
Further, the color acquisition module includes:
the preprocessing submodule is used for preprocessing the environment image;
and the color obtaining sub-module is used for obtaining all colors contained in the environment image according to the preprocessed environment image.
Further, the natural boundary identifying unit 20 further includes:
the second environment image acquisition module is used for acquiring an environment image in a working area acquired by the vision sensor in real time;
the texture feature acquisition module is used for acquiring all texture features contained in the environment image;
the second judging module is used for judging whether the texture features simultaneously comprise pre-stored texture features and texture features except the pre-stored texture features;
and the second judging module, for determining the position corresponding to the environment image as the natural boundary if the texture features simultaneously comprise pre-stored texture features and texture features other than the pre-stored texture features.
The mapping apparatus of the mobile robot according to the fourth embodiment has the same implementation principle and technical effects as the first and second embodiments of the mapping method of the mobile robot; for brevity, where this fourth embodiment is silent, reference may be made to the corresponding content of those method embodiments.
EXAMPLE five
Referring to fig. 5, in a fifth embodiment of the mapping apparatus of a mobile robot according to the present invention, the map information determining unit 50 includes:
an entity boundary marking module 51, configured to mark a walking trajectory corresponding to the natural boundary in the walking trajectory as an entity boundary;
a virtual boundary marking module 52, configured to mark a walking track corresponding to the virtual boundary in the walking track as a virtual boundary;
and the map information determining module 53 is configured to combine the walking tracks corresponding to the entity boundary and the virtual boundary into the map information of the working area.
Further, the apparatus further comprises:
the control unit is used for controlling the mobile robot to move in the working area according to the map information after receiving a working instruction, wherein when the mobile robot moves to an entity boundary, a visual sensor is adopted to control the mobile robot to be in the entity boundary; when the mobile robot moves to the virtual boundary, the positioning information of the mobile robot is acquired in real time so as to control the mobile robot to be in the virtual boundary.
The fifth embodiment of the mapping apparatus of the mobile robot has the same implementation principle and technical effects as the third embodiment of the mapping method of the mobile robot; for brevity, reference may be made to the corresponding content of that method embodiment.
In addition, an embodiment of the invention further provides a mobile robot, which is provided with a vision sensor and the mapping apparatus of the mobile robot described in the fourth and fifth embodiments.
Specifically, the mobile robot may be any device having a moving function, such as a mowing robot, a sweeping robot, a floor-washing robot, or a mobile cart.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.
Claims (15)
1. A method of mapping a mobile robot, the mobile robot having a vision sensor, the method comprising:
identifying a mapping starting point, wherein the mapping starting point is on the boundary of the working area;
identifying a natural boundary within the work area;
identifying visual markers within the working area and determining a virtual boundary from the visual markers;
controlling the mobile robot to start from the image building starting point, walk along the natural boundary and the virtual boundary, and recording the walking track of the mobile robot through a positioning device;
and when the mobile robot returns to the map building starting point, taking the walking track of the mobile robot as the map information of the working area.
2. The mapping method of a mobile robot according to claim 1, wherein the step of identifying natural boundaries within the work area comprises:
acquiring an environment image in a working area acquired by the vision sensor in real time;
acquiring all colors contained in the environment image;
judging whether the environment image simultaneously contains a pre-stored color and colors other than the pre-stored color;
and if the environment image simultaneously contains a pre-stored color and colors other than the pre-stored color, determining the position corresponding to the environment image as the natural boundary.
3. The mapping method of a mobile robot according to claim 2, wherein after the step of determining whether the environment image includes both a pre-stored color and a color other than the pre-stored color, the method further comprises:
if the environment image does not simultaneously contain a pre-stored color and colors other than the pre-stored color, obtaining all texture features contained in the environment image;
judging whether the texture features simultaneously contain pre-stored texture features and texture features except the pre-stored texture features;
and if the texture features simultaneously comprise pre-stored texture features and texture features except the pre-stored texture features, determining the position corresponding to the environment image as the natural boundary.
4. The mapping method of a mobile robot according to claim 2, wherein the step of acquiring all colors included in the environment image comprises:
preprocessing the environment image;
and acquiring all colors contained in the environment image according to the preprocessed environment image.
5. The mapping method of a mobile robot according to claim 1, wherein the step of identifying natural boundaries within the work area comprises:
acquiring an environment image in a working area acquired by the vision sensor in real time;
acquiring all texture features contained in the environment image;
judging whether the texture features simultaneously comprise pre-stored texture features and texture features except the pre-stored texture features;
and if so, determining the position corresponding to the environment image as the natural boundary.
6. The mapping method of a mobile robot according to claim 5, wherein
if the texture features simultaneously comprise pre-stored texture features and texture features other than the pre-stored texture features, acquiring all the texture features contained in the environment image when the environment image in the working area is acquired again through the visual sensor; judging whether the texture features simultaneously contain pre-stored texture features and texture features except the pre-stored texture features; and if the texture features simultaneously comprise pre-stored texture features and texture features except the pre-stored texture features, determining the position corresponding to the environment image as the natural boundary.
7. The mapping method of a mobile robot according to any one of claims 1 to 6, wherein the step of using the travel path of the robot as the map information of the work area comprises:
marking the walking track corresponding to the natural boundary in the walking track as an entity boundary;
marking the walking track corresponding to the virtual boundary in the walking track as a virtual boundary;
and forming the map information of the working area by the walking tracks respectively corresponding to the entity boundary and the virtual boundary.
8. The mapping method of a mobile robot according to claim 7, wherein after the step of using the travel path of the robot as the map information of the work area, the method further comprises:
after receiving a working instruction, controlling the mobile robot to move in the working area according to the map information, wherein when the mobile robot moves to a physical boundary, a vision sensor is adopted to control the robot to be in the physical boundary; when the mobile robot moves to the virtual boundary, the positioning information of the mobile robot is acquired in real time so as to control the mobile robot to be in the virtual boundary.
9. An apparatus for mapping a mobile robot, the apparatus comprising:
the map building starting point identification unit is used for identifying a map building starting point, wherein the map building starting point is positioned on the boundary of the working area;
a natural boundary identifying unit for identifying a natural boundary within the working area;
the virtual boundary determining unit is used for identifying the visual markers in the working area and determining a virtual boundary according to the visual markers;
the walking track recording unit is used for controlling the mobile robot to start from the drawing starting point, walk along the natural boundary and the virtual boundary, and record the walking track of the mobile robot through a positioning device;
and the map information determining unit is used for taking the walking track of the robot as the map information of the working area when the mobile robot returns to the mapping starting point.
10. The mobile robot mapping apparatus of claim 9, wherein the natural boundary identifying unit comprises:
the first environment image acquisition module is used for acquiring an environment image in a working area acquired by the vision sensor in real time;
a color obtaining module, for obtaining all colors contained in the environment image;
The judging module is used for judging whether the environment image simultaneously contains a pre-stored color and colors other than the pre-stored color;
and the first judging module, for determining the position corresponding to the environment image as the natural boundary if the environment image simultaneously contains a pre-stored color and a color other than the pre-stored color.
11. The mapping apparatus of a mobile robot according to claim 10, wherein the color acquisition module comprises:
the preprocessing submodule is used for preprocessing the environment image;
and the color obtaining sub-module is used for obtaining all colors contained in the environment image according to the preprocessed environment image.
12. The mobile robot mapping apparatus of claim 9, wherein the natural boundary identifying unit further comprises:
the second environment image acquisition module is used for acquiring an environment image in a working area acquired by the vision sensor in real time;
the texture feature acquisition module is used for acquiring all texture features contained in the environment image;
the second judging module is used for judging whether the texture features simultaneously comprise pre-stored texture features and texture features except the pre-stored texture features;
and the second judging module, for determining the position corresponding to the environment image as the natural boundary if the texture features simultaneously comprise pre-stored texture features and texture features other than the pre-stored texture features.
13. The mobile robot mapping apparatus according to any one of claims 9 to 12, wherein the map information determining unit includes:
the entity boundary marking module is used for marking the walking track corresponding to the natural boundary in the walking track as an entity boundary;
the virtual boundary marking module is used for marking the walking track corresponding to the virtual boundary in the walking track as a virtual boundary;
and the map information determining module is used for forming the map information of the working area by the walking tracks respectively corresponding to the entity boundary and the virtual boundary.
14. The mobile robot mapping apparatus of claim 13, wherein said apparatus further comprises:
the control unit is used for controlling the mobile robot to move in the working area according to the map information after receiving a working instruction, wherein when the mobile robot moves to an entity boundary, a visual sensor is adopted to control the mobile robot to be in the entity boundary; when the mobile robot moves to the virtual boundary, the positioning information of the mobile robot is acquired in real time so as to control the mobile robot to be in the virtual boundary.
15. A mobile robot, characterized in that it has a vision sensor and a mapping device of the mobile robot according to any of claims 9-14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010157240.8A CN111399502A (en) | 2020-03-09 | 2020-03-09 | Mobile robot and drawing establishing method and device thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111399502A true CN111399502A (en) | 2020-07-10 |
Family
ID=71432328
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010157240.8A Pending CN111399502A (en) | 2020-03-09 | 2020-03-09 | Mobile robot and drawing establishing method and device thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111399502A (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102662400A (en) * | 2012-05-10 | 2012-09-12 | 慈溪思达电子科技有限公司 | Path planning algorithm of mowing robot |
CN103891463A (en) * | 2012-12-28 | 2014-07-02 | 苏州宝时得电动工具有限公司 | Automatic mowing system |
CN104111653A (en) * | 2013-04-22 | 2014-10-22 | 苏州宝时得电动工具有限公司 | Automatic walking equipment and working region judgment method thereof |
CN104699101A (en) * | 2015-01-30 | 2015-06-10 | 深圳拓邦股份有限公司 | Robot mowing system capable of customizing mowing zone and control method thereof |
CN105785986A (en) * | 2014-12-23 | 2016-07-20 | 苏州宝时得电动工具有限公司 | Automatic working equipment |
CN107015657A (en) * | 2017-04-14 | 2017-08-04 | 深圳市唯特视科技有限公司 | A kind of method that moveable robot movement space is limited based on interactive mode |
CN107044103A (en) * | 2016-02-06 | 2017-08-15 | 苏州宝时得电动工具有限公司 | Automatically walk snow removing equipment |
CN108507578A (en) * | 2018-04-03 | 2018-09-07 | 珠海市微半导体有限公司 | A kind of construction method and its air navigation aid of overall situation border map |
CN109421067A (en) * | 2017-08-31 | 2019-03-05 | Neato机器人技术公司 | Robot virtual boundary |
CN110347153A (en) * | 2019-06-26 | 2019-10-18 | 深圳拓邦股份有限公司 | A kind of Boundary Recognition method, system and mobile robot |
CN110450152A (en) * | 2019-06-24 | 2019-11-15 | 广东宝乐机器人股份有限公司 | Region identification method, robot, and storage medium |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112102429A (en) * | 2020-08-03 | 2020-12-18 | 深圳拓邦股份有限公司 | Lawn mower, and map building method and storage medium thereof |
CN112146662A (en) * | 2020-09-29 | 2020-12-29 | 炬星科技(深圳)有限公司 | Method and device for guiding map building and computer readable storage medium |
CN112146662B (en) * | 2020-09-29 | 2022-06-10 | 炬星科技(深圳)有限公司 | Method and device for guiding map building and computer readable storage medium |
WO2022095170A1 (en) * | 2020-11-09 | 2022-05-12 | 苏州科瓴精密机械科技有限公司 | Obstacle recognition method and apparatus, and device, medium and weeding robot |
CN113110411A (en) * | 2021-03-08 | 2021-07-13 | 深圳拓邦股份有限公司 | Visual robot base station returning control method and device and mowing robot |
CN114019956A (en) * | 2021-10-14 | 2022-02-08 | 科沃斯机器人股份有限公司 | Method and system for determining region boundary, autonomous traveling equipment and mowing robot |
CN114322990A (en) * | 2021-12-30 | 2022-04-12 | 杭州海康机器人技术有限公司 | Data acquisition method and device for constructing mobile robot map |
CN114322990B (en) * | 2021-12-30 | 2024-04-19 | 杭州海康机器人股份有限公司 | Acquisition method and device for data for constructing mobile robot map |
CN115079692A (en) * | 2022-06-02 | 2022-09-20 | 深圳拓邦股份有限公司 | Drawing establishing method and system for mowing robot |
CN115079692B (en) * | 2022-06-02 | 2024-09-06 | 深圳拓邦股份有限公司 | Graph building method and system for mowing robot |
US11917938B2 (en) | 2022-07-05 | 2024-03-05 | Willand (Beijing) Technology Co., Ltd. | Method for constructing map for mower, storage medium, mower, and mobile terminal |
CN115371686A (en) * | 2022-10-26 | 2022-11-22 | 世源科技工程有限公司 | Method and related device for instantly positioning robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111399502A (en) | Mobile robot and drawing establishing method and device thereof | |
US10444760B2 (en) | Robotic vehicle learning site boundary | |
US10338602B2 (en) | Multi-sensor, autonomous robotic vehicle with mapping capability | |
US10806075B2 (en) | Multi-sensor, autonomous robotic vehicle with lawn care function | |
CN112415998B (en) | Obstacle classification obstacle avoidance control system based on TOF camera | |
US20170303466A1 (en) | Robotic vehicle with automatic camera calibration capability | |
CN109885060B (en) | Path management system and management method thereof | |
CN102789234A (en) | Robot navigation method and robot navigation system based on color coding identifiers | |
EP3760022B1 (en) | Installation method of a mobile device for land maintenance based on the recognition of the human figure | |
CN113885495B (en) | Outdoor automatic work control system, method and equipment based on machine vision | |
CN111168669B (en) | Robot control method, robot, and readable storage medium | |
JP2022165563A (en) | Autonomous travel inspection robot | |
CN114937258A (en) | Control method for mowing robot, and computer storage medium | |
CN111830968B (en) | Multifunctional water shield unmanned operation ship and navigation control method thereof | |
CN111176305A (en) | Visual navigation method | |
WO2023274339A1 (en) | Self-propelled working system | |
CN111103886A (en) | Method, device and equipment for identifying narrow traffic lane and computer readable storage medium | |
CN110531774A (en) | Obstacle Avoidance, device, robot and computer readable storage medium | |
CN116466724A (en) | Mobile positioning method and device of robot and robot | |
CN116430838A (en) | Self-mobile device and control method thereof | |
CN111504270A (en) | Robot positioning device | |
US20240061423A1 (en) | Autonomous operating zone setup for a working vehicle or other working machine | |
US20240287766A1 (en) | Virtual path guidance system | |
CN118128432A (en) | Walking navigation deviation correcting method for intelligent cable trench punching robot | |
CN117970404A (en) | Block laser slam mower positioning method based on gps positioning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||