CN107981790B - Indoor area dividing method and sweeping robot - Google Patents
Indoor area dividing method and sweeping robot
- Publication number
- Publication number: CN107981790B (application CN201711262898.XA)
- Authority
- CN
- China
- Prior art keywords
- cleaning
- virtual boundary
- graphic
- marker
- door
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/24—Floor-sweeping machines, motor-driven
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4061—Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
- Image Analysis (AREA)
- Electric Vacuum Cleaner (AREA)
Abstract
The invention discloses an indoor area dividing method and a sweeping robot. The indoor area dividing method comprises the following steps: the sweeping robot detects a marker arranged on a door frame of an indoor room; the spatial coordinate position of the marker is determined, and the position of the virtual boundary between the cleaning areas separated by the door is determined based on that spatial coordinate position. Because a graphic label is used to determine the boundary, the area division is reasonable, the marker is smaller, can remain in place long-term and is more convenient to operate, and the cost is also reduced.
Description
Technical Field
The invention relates to the field of area identification, in particular to an indoor area dividing method and a sweeping robot.
Background
With the development of technology and the improvement of people's living standards, sweeping robots with an automatic moving function have become widely popular. The sweeping robot, also called a robotic vacuum cleaner, cleans a room automatically by moving on travelling wheels and using an on-board fan to suck garbage from the floor into a dust box.
In order to clean more effectively, the sweeping robot needs to divide the indoor environment into areas. Currently, there are two main area division methods. The first divides the environment into several parts of fixed size and sweeps them in sequence according to the robot's position information; however, this method cannot adapt to the characteristics of different environments and easily leads to unreasonable area division. The second provides guidance and restriction through external means, such as the virtual lighthouse technology of iRobot Corporation; however, a virtual lighthouse is expensive, and if a user needs to divide a home into several areas, additional lighthouses must be purchased, which greatly increases the cost of use. In addition, a virtual lighthouse is bulky: placed beside a doorway it interferes with normal use of the door, the user must set it out before each cleaning and collect it afterwards, and the operation is cumbersome.
Disclosure of Invention
The invention mainly aims to provide an indoor area dividing method and a sweeping robot with lower cost.
To this end, the invention proposes an indoor area dividing method, which comprises the following steps:
the floor sweeping robot detects a marker arranged on a door frame of an indoor room;
the spatial coordinate position of the marker is determined, and the position of the virtual boundary between the cleaning regions partitioned by the door is determined based on the spatial coordinate position.
Further, the step in which the sweeping robot detects the marker arranged on the door frame of the indoor room includes: the sweeping robot uses a visual sensor to detect a marker arranged on the door frame of the indoor room within the detection range of the visual sensor.
Further, the marker is a graphic label.
Furthermore, a plurality of graphic labels are provided, attached respectively to the left end and the right end of the door frame;
determining the position of the virtual boundary between the cleaning areas separated by the door based on the spatial coordinate position comprises: calculating the position of the virtual boundary between the two cleaning areas separated by the door according to the spatial positions of the graphic labels arranged at the two ends.
Further, the step of determining a spatial coordinate position of the marker, and determining a position of a virtual boundary between two cleaning regions partitioned by the door based on the spatial coordinate position includes:
determining the position of the image of the graphic label in the scene image according to the pixel position of that image in the scene image acquired by the visual sensor, and converting the position into the spatial coordinate position of the graphic label in the environment map;
judging the position of the virtual boundary between the two cleaning areas in the environment map according to the spatial coordinate position of the graphic label in the environment map.
The invention also provides a sweeping robot, which comprises a detection module and a judgment module;
the detection module is used for detecting a marker arranged on an indoor room door frame by the sweeping robot; the judging module is used for determining the space coordinate position of the marker and determining the position of a virtual boundary between cleaning areas separated by the door according to the space coordinate position.
Further, the detection module comprises a detection submodule; the detection submodule is used for detecting, by using the visual sensor, a marker arranged on the door frame of the indoor room within the detection range of the visual sensor.
Furthermore, the markers are a plurality of graphic labels, attached respectively to the left end and the right end of the door frame;
the judgment module comprises a judgment submodule; the judgment submodule is used for calculating the position of a virtual boundary between two cleaning areas separated by the door according to the space positions of the graphic labels respectively arranged at the two ends.
Further, the marker is a graphic label;
the judgment module comprises an acquisition submodule and a positioning submodule; the acquisition submodule is used for determining the position of the image of the graphic label in the scene image according to the pixel position of the image of the graphic label in the scene image acquired by the visual sensor, and further converting the position into the space coordinate position of the graphic label in the environment map; and the positioning submodule is used for judging the position of a virtual boundary between two cleaning areas in the environment map according to the space coordinate position of the graphic label in the environment map.
Further, the markers are disposed on both inner and outer sides of the door frame of the indoor room.
According to the indoor area dividing method and the sweeping robot, the boundary is determined using a graphic label; the area division is reasonable, the marker is smaller and can remain in place long-term, which avoids moving the boundary frequently and reduces human involvement, the operation is more convenient, and compared with using a virtual lighthouse the cost is reduced.
Drawings
FIG. 1 is a schematic diagram illustrating steps of an indoor area division method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating steps of another embodiment of an indoor area division method according to the present invention;
FIG. 3 is a schematic diagram illustrating a third embodiment of an indoor area dividing method according to the present invention;
fig. 4 is a schematic diagram illustrating an embodiment of step S2 in the method for dividing an indoor area according to the present invention;
fig. 5 is a schematic diagram illustrating an embodiment of step S23 in the method for dividing an indoor area according to the present invention;
FIG. 6 is a schematic diagram illustrating a fourth embodiment of the indoor area dividing method according to the present invention;
fig. 7 is a schematic structural view of a sweeping robot according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of an embodiment of a detection module in the sweeping robot of the present invention;
fig. 9 is a schematic structural diagram of an embodiment of a determination module in the sweeping robot according to the present invention;
fig. 10 is a schematic structural diagram of a judging module in the sweeping robot according to another embodiment of the present invention;
fig. 11 is a schematic structural diagram of a judgment submodule in the sweeping robot according to another embodiment of the present invention;
fig. 12 is a schematic structural view of another embodiment of the sweeping robot of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, an indoor area division method includes the following steps:
s1, the sweeping robot detects the marker arranged on the door frame of the indoor room.
And S2, determining the space coordinate position of the marker, and determining the position of the virtual boundary between the cleaning areas separated by the door according to the space coordinate position.
In the above step S1, the marker is arranged at the position of the door frame. The marker is flat, can remain in place long-term without affecting the opening and closing of the door, and need not be removed after use, which reduces manual intervention, makes use more convenient and improves the user experience.
In the above step S2, the entire space to be cleaned can be divided into individual cleaning areas by the virtual boundaries. In particular, in a house with several rooms, each room can be divided separately, which prevents the sweeping robot from working back and forth between different rooms without a plan and improves the user experience. A virtual boundary is usually set at the door, where the cleaning areas are divided.
The sweeping robot judges whether the virtual boundary can be crossed according to preset conditions.
Referring to fig. 2, in some embodiments, step S1 includes:
s11, the sweeping robot detects the markers in the detection range of the visual sensor arranged on the door frame of the indoor room by using the visual sensor.
In step S11, the visual sensor is used to search for the marker during the operation of the sweeping robot. The marker must be placed within the detectable range of the visual sensor; a marker arranged on the door frame of a room but outside the detectable range of the visual sensor cannot be detected.
The marker is a graphic label. The graphic label is a thin, sheet-like object that can be attached to a flat surface without affecting the normal opening and closing of the door; it therefore need not be removed after use and can remain in place long-term, so a single setup can be used repeatedly, which reduces the user's workload and improves the user experience. Graphic labels are also cheap, which further reduces the user's cost.
The graphic in the graphic label can be a two-dimensional code.
In some embodiments, the vision sensor may be a monocular, binocular, or depth camera.
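Where the graphic label carries a two-dimensional code, its detection can be prototyped with an off-the-shelf QR detector. Below is a minimal sketch in Python, assuming OpenCV is used (the patent does not specify a library); the function and variable names are illustrative only:

```python
import cv2

def detect_graphic_label(frame):
    """Detect a QR-code graphic label in one camera frame.

    Returns (decoded_text, corner_pixels, center_pixel), or None if no
    label is visible in this frame."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    if points is None or not data:
        return None
    corners = points.reshape(-1, 2)   # four corner pixels of the label
    center = corners.mean(axis=0)     # pixel position later used for localisation
    return data, corners, center
```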
Referring to fig. 3, in some embodiments, a plurality of graphic labels are provided, and the graphic labels are respectively attached to the left and right ends of the door frame;
step S2 includes:
and S21, calculating the position of a virtual boundary between two cleaning areas separated by the door according to the space positions of the graphic labels respectively arranged at the two ends.
In step S21, the position of the virtual boundary is directly calculated from the spatial positions of the two graphic labels, and the accuracy of the determination is higher.
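As a hedged sketch of step S21, assume the environment map reduces to a 2-D coordinate frame and each label is represented by a single map point; the segment between the two label positions then serves as the virtual boundary, and a standard segment-intersection test can tell whether a planned robot step would cross it. All names and types are illustrative:

```python
from dataclasses import dataclass
from typing import Tuple

Point = Tuple[float, float]          # (x, y) in the environment-map frame

@dataclass
class VirtualBoundary:
    start: Point                     # map position of the label at one end of the door frame
    end: Point                       # map position of the label at the other end

def crosses_boundary(prev_pos: Point, next_pos: Point, b: VirtualBoundary) -> bool:
    """Return True if a robot step from prev_pos to next_pos crosses the
    boundary segment (standard 2-D segment-intersection test)."""
    def ccw(a: Point, c: Point, d: Point) -> bool:
        return (d[1] - a[1]) * (c[0] - a[0]) > (c[1] - a[1]) * (d[0] - a[0])
    p, q = b.start, b.end
    return ccw(prev_pos, p, q) != ccw(next_pos, p, q) and \
           ccw(prev_pos, next_pos, p) != ccw(prev_pos, next_pos, q)

# The two detected label positions define the boundary directly:
# boundary = VirtualBoundary(start=left_label_xy, end=right_label_xy)
```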
Referring to fig. 4, in the present embodiment, step S2 includes:
and S22, determining the position of the image of the graphic label in the scene image according to the pixel position of the image of the graphic label in the scene image acquired by the visual sensor, and converting the position into the space coordinate position of the graphic label in the environment map.
And S23, judging the position of the virtual boundary between the two cleaning areas in the environment map according to the space coordinate position of the graphic label in the environment map.
In step S22, the environment map is obtained by initializing a spatial coordinate system when the sweeping robot starts to work and then performing indoor positioning and mapping from the data acquired by the visual sensor; it contains information such as room boundaries, the positions and sizes of placed objects, and the spatial coordinate system. The spatial coordinate position of the graphic label in the environment map generally determines the location of the virtual boundary.
In step S23, the specific position of the virtual boundary is determined, according to preset rules, from the position of the graphic label; the virtual boundary separates two adjacent cleaning areas.
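A sketch of the pixel-to-map conversion in step S22, assuming a depth camera with known intrinsics K and a camera pose in the environment map supplied by the robot's localization (these assumptions go beyond what the patent states):

```python
import numpy as np

def label_pixel_to_map(u: float, v: float, depth: float,
                       K: np.ndarray, T_map_camera: np.ndarray) -> np.ndarray:
    """Back-project the label's pixel (u, v), with measured depth in metres,
    into the environment-map frame.

    K            : 3x3 camera intrinsic matrix
    T_map_camera : 4x4 pose of the camera in the environment map (from SLAM)
    """
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    # 3-D point in the camera frame (pinhole model)
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    p_camera = np.array([x, y, depth, 1.0])
    # transform into the map frame and drop the homogeneous coordinate
    return (T_map_camera @ p_camera)[:3]
```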
Markers are arranged on both the inner side and the outer side of the door frame of the indoor room. Because the visual sensor detects the markers, the indoor area division of this scheme cannot be achieved without them; therefore at least one marker must be arranged at the door for each corresponding cleaning area.
Referring to fig. 5, step S23 includes:
and S231, determining the orientation of the graphic label according to the space coordinate position of the graphic label in the environment map.
And S232, taking the graphic label as a center, and extending to two sides of the horizontal direction along the plane where the graphic label is located by a preset width which is larger than or equal to the width of the door to obtain a virtual plane.
And S233, if the virtual plane meets the determined obstacle plane in the environment map in the process of extending from the middle point of the graphic label to the two sides, cutting the virtual plane.
And S234, if the determined obstacle plane exists within the distance set in the opposite direction of the graphic label, deleting the position from the virtual plane.
And S235, detecting the residual virtual planes, and rejecting planes with the width smaller than a second preset width.
And S236, if a plurality of virtual planes meeting the conditions still exist, selecting the virtual plane closest to the graphic labeling center as a virtual boundary between the two cleaning areas.
In step S231, because the sweeping robot finds the graphic label through the visual sensor, the robot faces the graphic label; the relative position of the cleaning area where the robot is located and the graphic label can therefore be determined from the orientation of the graphic label, as can the directions of the current cleaning area, the adjacent cleaning area and the virtual boundary. The virtual boundary lies in the horizontal extension of the plane in which the graphic label lies.
In the above step S232, the preset width is greater than or equal to the door width, so the door position necessarily lies within the virtual plane; in some embodiments the preset width is 1.5 m to 3.2 m.
In step S233, the determined obstacle plane is a plane through which the sweeping robot cannot pass.
In step S234, this handles the case of a wall beside the door, in the direction facing away from the door: such a wall lies in the direction opposite the graphic label. If an obstacle plane such as a wall exists within the set distance in the direction opposite the graphic label, it can be determined that the door is not at that position, the virtual boundary is not at that position, and the sweeping robot cannot enter another room there; the corresponding part of the virtual plane can therefore be deleted.
In step S235, the second preset width is smaller than or equal to the width of the door, so planes that obviously cannot contain a door can be removed.
In the above step S236, since the graphic label is attached to the door frame, the virtual plane closest to the center of the graphic label is the one most likely to contain the door.
In some embodiments, the second preset width is 0.6 m to 1.5 m, preferably 0.6 m.
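The selection logic of steps S232 to S236 can be sketched in one dimension along the horizontal direction of the wall containing the door frame, with the label center at offset 0. The two obstacle queries below are placeholders for lookups in the environment map, and the widths and sampling step are illustrative; this is a sketch under those assumptions, not the patent's own implementation:

```python
from typing import Callable, List, Optional, Tuple

def select_boundary_segment(
    preset_width: float,                          # >= door width, e.g. 1.5-3.2 m
    min_width: float,                             # second preset width, e.g. 0.6 m
    blocked_on_wall: Callable[[float], bool],     # obstacle plane on the wall line at offset s (S233)
    blocked_behind: Callable[[float], bool],      # obstacle within the set distance behind the label at offset s (S234)
    step: float = 0.05,
) -> Optional[Tuple[float, float]]:
    """Offsets s run along the wall containing the door frame, label center at s = 0.
    Returns (start, end) of the chosen virtual-boundary segment, or None."""
    # S232: sample the candidate virtual plane, then drop blocked positions (S233, S234)
    n = int(round(2 * preset_width / step))
    samples = [-preset_width + i * step for i in range(n + 1)]
    kept = [s for s in samples if not blocked_on_wall(s) and not blocked_behind(s)]

    # group the kept samples into contiguous segments
    segments: List[Tuple[float, float]] = []
    start = prev = None
    for s in kept:
        if start is None:
            start = prev = s
        elif s - prev > 1.5 * step:               # gap => previous segment ends here
            segments.append((start, prev))
            start = prev = s
        else:
            prev = s
    if start is not None:
        segments.append((start, prev))

    # S235: reject segments narrower than the second preset width
    segments = [(a, b) for a, b in segments if b - a >= min_width]
    if not segments:
        return None
    # S236: choose the segment whose center is closest to the label center (s = 0)
    return min(segments, key=lambda seg: abs((seg[0] + seg[1]) / 2.0))
```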
Referring to fig. 6, step S2 is followed by:
s3, when the cleaning area is not cleaned, the sweeping robot takes an avoidance measure when meeting the virtual boundary; when the cleaning area is cleaned, the sweeping robot crosses the virtual boundary into the adjacent area.
In step S3, when the cleaning area is not cleaned, the sweeping robot takes an avoidance measure when meeting the virtual boundary; when the cleaning area is judged to be cleaned, the sweeping robot does not take reaction measures when meeting the virtual boundary and directly crosses the magnetic stripe to enter the adjacent cleaning area. The sequence of entering the adjacent non-cleaning space can adopt a nearby principle, and the cleaning robot automatically enters the adjacent cleaning area from the nearest virtual boundary according to the coordinate positioning.
In some embodiments, the avoidance measures are moving backward and turning, which ensures that the sweeping robot does not cross the virtual boundary.
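A minimal sketch of the boundary-encounter behaviour of step S3; the `robot` motion methods and the numeric values are illustrative and not taken from the patent:

```python
def on_virtual_boundary(robot, current_area_cleaned: bool) -> None:
    """Reaction when the sweeping robot reaches a virtual boundary."""
    if not current_area_cleaned:
        # avoidance measure: back away from the boundary, then turn
        robot.move_backward(0.2)    # metres (illustrative value)
        robot.turn(90)              # degrees (illustrative value)
    else:
        # current area finished: cross the virtual boundary into the adjacent area
        robot.move_forward_across_boundary()
```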
Referring to fig. 7, the invention further provides a sweeping robot, which comprises a detection module 1 and a judgment module 2; the detection module 1 is used for detecting a marker arranged on a door frame of an indoor room by the sweeping robot; the judging module 2 is used for determining the space coordinate position of the marker, and determining the position of a virtual boundary between cleaning areas separated by a door according to the space coordinate position.
In the operation of the detection module 1, the marker is arranged at the position of the door frame. The marker is flat, can remain in place long-term without affecting the opening and closing of the door, and need not be removed after use, which reduces manual intervention, makes use more convenient and improves the user experience.
In the operation of the judging module 2, the entire space to be cleaned can be divided into cleaning areas by the virtual boundaries. In particular, in a house with several rooms, each room can be divided separately, which prevents the sweeping robot from working back and forth between different rooms without a plan and improves the user experience. A virtual boundary is usually set at the door, where the areas are divided. The sweeping robot judges whether the virtual boundary can be crossed according to preset conditions.
Referring to fig. 8, the detection module 1 includes a detection submodule 11; the detection submodule 11 is configured to detect, by using the visual sensor, a marker disposed in a detection range of the visual sensor on a door frame of the indoor room.
In the operation of the detection sub-module 11, the visual sensor is used to search for the marker during the operation of the sweeping robot. The marker must be placed within the detectable range of the visual sensor; a marker arranged on the door frame of a room but outside the detectable range of the visual sensor cannot be detected.
The marker is a graphic label. The graphic label is a thin, sheet-like object that can be attached to a flat surface without affecting the normal opening and closing of the door; it therefore need not be removed after use and can remain in place long-term, so a single setup can be used repeatedly, which reduces the user's workload and improves the user experience. Graphic labels are also cheap, which further reduces the user's cost.
The graphics in the graphic label can be a two-dimensional code.
In some embodiments, the vision sensor may be a monocular, binocular, or depth camera.
Referring to fig. 9, the markers are graphic labels, a plurality of graphic labels are provided, and the graphic labels are respectively stuck to the left end and the right end of the door frame; the judgment module 2 comprises a judgment submodule 21; the judgment sub-module 21 is used for calculating the position of a virtual boundary between two cleaning areas separated by the door according to the space positions of the graphic labels respectively arranged at the two ends. In the working process of the judgment submodule 21, the position of the virtual boundary is directly calculated through the space positions of the two graphic labels, and the judgment accuracy is higher.
Referring to fig. 10, the judgment module 2 includes an acquisition submodule 22 and a positioning submodule 23; the acquisition submodule 22 is configured to determine a position of the image of the graphic label in the scene image according to a pixel position of the image of the graphic label in the scene image acquired by the visual sensor, and further convert the position into a spatial coordinate position of the graphic label in the environment map; the positioning sub-module 23 is configured to determine a position of a virtual boundary between two cleaning regions in the environment map according to a spatial coordinate position of the graphic label in the environment map.
In the operation of the acquisition submodule 22, the environment map is obtained by initializing a spatial coordinate system when the sweeping robot starts to work and then performing indoor positioning and mapping from the data acquired by the visual sensor; it contains information such as room boundaries, the positions and sizes of placed objects, and the spatial coordinate system. The spatial coordinate position of the graphic label in the environment map generally determines the location of the virtual boundary.
In the operation of the positioning sub-module 23, the specific position of the virtual boundary is determined, according to preset rules, from the position of the graphic label; the virtual boundary separates two adjacent cleaning areas.
Markers are arranged on both the inner side and the outer side of the door frame of the indoor room. Because the visual sensor detects the markers, the indoor area division of this scheme cannot be achieved without them; therefore at least one marker must be arranged at the door for each corresponding cleaning area.
Referring to fig. 11, the markers are graphic labels; the positioning sub-module 23 comprises an orientation unit 231, an extension unit 232, a truncation unit 233, a deletion unit 234, a rejection unit 235 and a selection unit 236. The orientation unit 231 is used for determining the orientation of the graphic label according to the spatial coordinate position of the graphic label in the environment map; the extension unit 232 is used for taking the graphic label as the center and extending along the plane of the graphic label to both sides in the horizontal direction by a preset width greater than or equal to the width of the door, to obtain a virtual plane; the truncation unit 233 is used for truncating the virtual plane when the virtual plane meets a determined obstacle plane in the environment map while extending from the midpoint of the graphic label to the two sides; the deletion unit 234 is used for deleting a position from the virtual plane if a determined obstacle plane exists within the set distance in the direction opposite the graphic label at that position; the rejection unit 235 is used for checking the remaining virtual planes and rejecting planes whose width is smaller than a second preset width; the selection unit 236 is used for selecting, if a plurality of qualifying virtual planes still exist, the virtual plane closest to the center of the graphic label as the virtual boundary between the two cleaning areas.
In the operation of the orientation unit 231, because the sweeping robot finds the graphic label through the visual sensor, the robot faces the graphic label; the relative position of the cleaning area where the robot is located and the graphic label can therefore be determined from the orientation of the graphic label, as can the directions of the current cleaning area, the adjacent cleaning area and the virtual boundary. The virtual boundary lies in the horizontal extension of the plane in which the graphic label lies.
In the operation of the extension unit 232, the preset width is greater than or equal to the width of the door, so the door always lies within the virtual plane; in some embodiments the preset width is 1.5 m to 3.2 m.
During the operation of the truncation unit 233, the determined obstacle plane is a plane that the sweeping robot cannot pass through.
In the operation of the deletion unit 234, this handles the case of a wall beside the door, in the direction facing away from the door: such a wall lies in the direction opposite the graphic label. If an obstacle plane such as a wall exists within the set distance in the direction opposite the graphic label, it can be determined that the door is not at that position, the virtual boundary is not at that position, and the sweeping robot cannot enter another room there; the corresponding part of the virtual plane can therefore be deleted.
In the operation of the rejection unit 235, the second preset width is smaller than or equal to the width of the door, so planes that obviously cannot contain a door can be removed.
During the operation of the selection unit 236, since the graphic label is attached to the door frame, the virtual plane closest to the center of the graphic label is the one most likely to contain the door.
In some embodiments, the second preset width is 0.6 m to 1.5 m, preferably 0.6 m.
Referring to fig. 12, the sweeping robot further includes a boundary-crossing judging module 3. The boundary-crossing judging module 3 is configured so that, when cleaning of the current cleaning area is not finished, the sweeping robot takes an avoidance measure on meeting the virtual boundary; and when the cleaning area has been cleaned, the sweeping robot crosses the virtual boundary into the adjacent cleaning area.
When the virtual boundary is determined, its position is confirmed from the environment map.
In the operation of the boundary-crossing judging module 3, when cleaning of the cleaning area is not finished, the sweeping robot takes an avoidance measure on meeting the virtual boundary; when it is judged that the cleaning area has been cleaned, the sweeping robot takes no avoidance measure on meeting the virtual boundary and crosses it directly into the adjacent cleaning area. The order of entering adjacent uncleaned areas can follow a nearest-first principle: based on its coordinate positioning, the sweeping robot enters the adjacent cleaning area through the nearest virtual boundary.
The avoidance measures are moving backward and turning, which ensures that the sweeping robot does not cross the virtual boundary.
According to the indoor cleaning area dividing method and the sweeping robot described above, a graphic label is used to determine the boundary; the cleaning area division is reasonable, the graphic label is smaller and can remain in place long-term, which avoids moving the boundary frequently and reduces human involvement, the operation is more convenient, and compared with using a virtual lighthouse the cost is reduced.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (5)
1. An indoor area dividing method is characterized by comprising the following steps:
the floor sweeping robot detects a marker arranged on a door frame of an indoor room; wherein the marker is a graphic label;
determining the space coordinate position of the marker, and determining the position of a virtual boundary between cleaning areas separated by a door according to the space coordinate position;
when the cleaning of the cleaning area is not finished, the cleaning robot takes an avoidance measure when meeting the virtual boundary; when the cleaning area is cleaned, the cleaning robot crosses the virtual boundary to enter an adjacent area; the avoidance measures comprise backward movement or turning;
wherein the step in which the sweeping robot detects the marker arranged on the door frame of the indoor room comprises: the sweeping robot detects, by using a visual sensor, a marker arranged on the door frame of the indoor room within the detection range of the visual sensor;
wherein the step of determining a spatial coordinate position of the marker and determining a position of a virtual boundary between cleaning regions partitioned by the door based on the spatial coordinate position comprises:
determining the position of the image of the graphic label in the scene image according to the pixel position of the image of the graphic label in the scene image acquired by the visual sensor, and converting the position into the space coordinate position of the graphic label in the environment map;
and judging the position of a virtual boundary between two cleaning areas in the environment map according to the space coordinate position of the graphic label in the environment map.
2. The indoor area dividing method according to claim 1, wherein a plurality of graphic labels are attached to left and right ends of the door frame of the room;
the determining the position of the virtual boundary between the cleaning areas separated by the door according to the space coordinate position comprises the following steps: and calculating the position of a virtual boundary between two cleaning areas separated by the door according to the space positions of the graphic labels respectively arranged at the two ends.
3. A sweeping robot, characterized in that the sweeping robot comprises:
the detection module is used for detecting a marker arranged on an indoor room door frame by the sweeping robot; wherein the marker is a graphic label;
the judging module is used for determining the space coordinate position of the marker and determining the position of a virtual boundary between cleaning areas separated by a door according to the space coordinate position;
when the cleaning of the cleaning area is not finished, the cleaning robot takes an avoidance measure when meeting the virtual boundary; when the cleaning area is cleaned, the cleaning robot crosses the virtual boundary to enter an adjacent area; the avoidance measures comprise backward movement or turning;
the detection module comprises:
the detection submodule is used for detecting the markers arranged on the indoor room door frame in the detection range of the visual sensor by using the visual sensor;
the judging module comprises:
the acquisition submodule is used for determining the position of the image of the graphic label in the scene image according to the pixel position of the image of the graphic label in the scene image acquired by the visual sensor, and further converting the position into the space coordinate position of the graphic label in the environment map;
and the positioning submodule is used for judging the position of a virtual boundary between two cleaning areas in the environment map according to the space coordinate position of the graphic label in the environment map.
4. The sweeping robot according to claim 3, wherein the markers are a plurality of graphic labels, and the graphic labels are respectively attached to the left and right ends of the door frame of the room;
the judging module comprises:
and the judgment submodule is used for calculating the position of a virtual boundary between two cleaning areas separated by the door according to the space positions of the graphic labels respectively arranged at the two ends.
5. The sweeping robot of claim 3, wherein the markers are disposed on both the inside and outside of the door frame of the indoor room.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711262898.XA CN107981790B (en) | 2017-12-04 | 2017-12-04 | Indoor area dividing method and sweeping robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711262898.XA CN107981790B (en) | 2017-12-04 | 2017-12-04 | Indoor area dividing method and sweeping robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107981790A CN107981790A (en) | 2018-05-04 |
CN107981790B true CN107981790B (en) | 2020-06-09 |
Family
ID=62035542
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711262898.XA Active CN107981790B (en) | 2017-12-04 | 2017-12-04 | Indoor area dividing method and sweeping robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107981790B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022056056A1 (en) * | 2020-09-11 | 2022-03-17 | Locus Robotics Corp. | Robot navigation management between zones in an environment |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109077667B (en) * | 2018-07-16 | 2020-12-01 | 广州俊德信息科技有限公司 | Adjusting method and system of cleaning electric appliance, storable medium and cleaning electric appliance |
CN108968818B (en) * | 2018-07-16 | 2020-12-18 | 广州俊德信息科技有限公司 | Storage method, system, device and storable medium for cleaning electric appliance |
CN108784543B (en) * | 2018-07-16 | 2021-03-09 | 广州俊德信息科技有限公司 | Method, system, device and storable medium for locating a cleaning appliance |
CN110806746A (en) * | 2018-07-18 | 2020-02-18 | 杭州萤石软件有限公司 | Functional area division method applied to mobile robot and mobile robot |
CN108968825B (en) * | 2018-08-17 | 2020-12-11 | 深圳领贝智能科技有限公司 | Sweeping robot and sweeping method thereof |
CN112639897A (en) * | 2018-09-13 | 2021-04-09 | 开利公司 | Spatial determination of boundary visualizations |
CN109522803B (en) * | 2018-10-18 | 2021-02-09 | 深圳乐动机器人有限公司 | Indoor area division and identification method and device and terminal equipment |
CN111459153B (en) * | 2019-01-03 | 2022-09-06 | 科沃斯机器人股份有限公司 | Dynamic region division and region channel identification method and cleaning robot |
CN110251000A (en) * | 2019-05-20 | 2019-09-20 | 广东宝乐机器人股份有限公司 | A method of improving sweeping robot cleaning efficiency |
CN110450152A (en) * | 2019-06-24 | 2019-11-15 | 广东宝乐机器人股份有限公司 | Region identification method, robot, and storage medium |
CN112205937B (en) * | 2019-07-12 | 2022-04-05 | 北京石头世纪科技股份有限公司 | Automatic cleaning equipment control method, device, equipment and medium |
CN112799389B (en) * | 2019-11-12 | 2022-05-13 | 苏州宝时得电动工具有限公司 | Automatic walking area path planning method and automatic walking equipment |
CN110974091B (en) * | 2020-02-27 | 2020-07-17 | 深圳飞科机器人有限公司 | Cleaning robot, control method thereof, and storage medium |
JP7411897B2 (en) * | 2020-04-10 | 2024-01-12 | パナソニックIpマネジメント株式会社 | Vacuum cleaning systems and vacuum cleaners |
CN111897334B (en) * | 2020-08-02 | 2022-06-14 | 珠海一微半导体股份有限公司 | Robot region division method based on boundary, chip and robot |
WO2022027252A1 (en) * | 2020-08-04 | 2022-02-10 | 苏州珊口智能科技有限公司 | Marking, association and control method for mobile robot, system, and storage medium |
CN113475977B (en) * | 2021-06-22 | 2023-05-05 | 深圳拓邦股份有限公司 | Robot path planning method and device and robot |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107329476A (en) * | 2017-08-02 | 2017-11-07 | 珊口(上海)智能科技有限公司 | A kind of room topology map construction method, system, device and sweeping robot |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7634336B2 (en) * | 2005-12-08 | 2009-12-15 | Electronics And Telecommunications Research Institute | Localization system and method of mobile robot based on camera and landmarks |
KR20090077547A (en) * | 2008-01-11 | 2009-07-15 | 삼성전자주식회사 | Method and apparatus of path planning for a mobile robot |
KR101906329B1 (en) * | 2010-12-15 | 2018-12-07 | 한국전자통신연구원 | Apparatus and method for indoor localization based on camera |
CN103271699B (en) * | 2013-05-29 | 2016-05-18 | 东北师范大学 | A kind of Smart Home clean robot |
KR101830249B1 (en) * | 2014-03-20 | 2018-03-29 | 한국전자통신연구원 | Position recognition apparatus and method of mobile object |
CN106325266A (en) * | 2015-06-15 | 2017-01-11 | 联想(北京)有限公司 | Spatial distribution map building method and electronic device |
CN205656496U (en) * | 2015-11-26 | 2016-10-19 | 江苏美的清洁电器股份有限公司 | Robot of sweeping floor and device is establish to indoor map thereof |
-
2017
- 2017-12-04 CN CN201711262898.XA patent/CN107981790B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN107981790A (en) | 2018-05-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107981790B (en) | Indoor area dividing method and sweeping robot | |
CN104536445B (en) | Mobile navigation method and system | |
US11351670B2 (en) | Domestic robotic system and method | |
US20220057212A1 (en) | Method for updating a map and mobile robot | |
JP2022106924A (en) | Device and method for autonomous self-position estimation | |
US8467902B2 (en) | Method and apparatus for estimating pose of mobile robot using particle filter | |
CN107997690B (en) | Indoor area dividing method and sweeping robot | |
CN106227212B (en) | The controllable indoor navigation system of precision and method based on grating map and dynamic calibration | |
US20180210448A1 (en) | Method of identifying functional region in 3-dimensional space, and robot implementing the method | |
US20190168386A1 (en) | Self-moving robot, map building method, and map invoking method for combined robot | |
CN113741438A (en) | Path planning method and device, storage medium, chip and robot | |
CN112526993A (en) | Grid map updating method and device, robot and storage medium | |
CN112075879A (en) | Information processing method, device and storage medium | |
CN106569489A (en) | Floor sweeping robot having visual navigation function and navigation method thereof | |
CN109528089A (en) | A kind of walk on method, apparatus and the chip of stranded clean robot | |
Maier et al. | Vision-based humanoid navigation using self-supervised obstacle detection | |
CN113331743A (en) | Method for cleaning floor by cleaning robot and cleaning robot | |
CN111679664A (en) | Three-dimensional map construction method based on depth camera and sweeping robot | |
CN111714028A (en) | Method, device and equipment for escaping from restricted zone of cleaning equipment and readable storage medium | |
CN112445215A (en) | Automatic guided vehicle driving control method, device and computer system | |
CN111609853A (en) | Three-dimensional map construction method, sweeping robot and electronic equipment | |
CN113503877A (en) | Robot partition map establishing method and device and robot | |
CN112308033A (en) | Obstacle collision warning method based on depth data and visual chip | |
CN114779777A (en) | Sensor control method and device for self-moving robot, medium and robot | |
WO2021138372A1 (en) | Feature coverage analysis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
TA01 | Transfer of patent application right |
Effective date of registration: 20190906 Address after: Room 402, 4th floor, Kanghe Sheng Building, New Energy Innovation Industrial Park, No. 1 Chuangsheng Road, Nanshan District, Shenzhen City, Guangdong Province, 518000 Applicant after: Shenzhen Infinite Power Development Co., Ltd. Address before: 518000 B, block 1079, garden city digital garden, Nanhai Road, Shekou, Shenzhen, Guangdong, 503, Nanshan District 602, China Applicant before: SHENZHEN WOTE WODE CO., LTD. |
GR01 | Patent grant | ||
GR01 | Patent grant |