
US20200089235A1 - Self-moving robot movement boundary determining method - Google Patents


Info

Publication number
US20200089235A1
US20200089235A1 (Application No. US 16/691,419)
Authority
US
United States
Prior art keywords
self-moving robot
sample points
boundary
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/691,419
Inventor
Jinju Tang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecovacs Robotics Suzhou Co Ltd
Original Assignee
Ecovacs Robotics Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecovacs Robotics Suzhou Co Ltd filed Critical Ecovacs Robotics Suzhou Co Ltd
Priority to US 16/691,419
Assigned to ECOVACS ROBOTICS CO., LTD. (Assignor: TANG, JINJU)
Publication of US20200089235A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/20 - Control system inputs
    • G05D 1/24 - Arrangements for determining position or orientation
    • G05D 1/247 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
    • G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons, from positioning sensors located off-board the vehicle, e.g. from cameras
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 - Control of position or course in two dimensions
    • G05D 1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0276 - Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D 1/028 - Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 - Programme-controlled manipulators
    • B25J 9/16 - Programme controls
    • B25J 9/1656 - Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664 - Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 - Control of position or course in two dimensions
    • G05D 1/021 - Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0221 - Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process


Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

In a self-moving robot movement boundary delimiting method, in step 100: setting up three or more base stations in a movement area of a self-moving robot, and establishing a coordinate system; in step 200: artificially planning a movement path in the movement area of the self-moving robot, gathering sample points on the path, and determining the coordinates of the sample points in the coordinate system; and in step 300: delimiting a boundary according to the coordinates of the gathered sample points, and setting the self-moving robot to work inside or outside the boundary. The present invention achieves a regional division by distance measurement and positioning based on stationary base stations, thus improving accuracy and convenience compared to the prior art.

Description

    FIELD OF THE INVENTION
  • The present invention relates to a self-moving robot movement boundary delimiting method, which belongs to the technical field of self-moving robot movement control.
  • BACKGROUND ART
  • The self-moving robot is a typical robot, and includes various types such as sweeping robots, mowing robots, home service robots, surveillance robots and the like. It is very popular with customers because it can move about freely. How to effectively control the movement of the self-moving robot within a certain working space is a key issue. In order to restrict the movement range of the self-moving robot, its movement region needs to be divided into blocks. Existing regional division methods include the Satellite Positioning method, the Marker Setting-up method, the Spatial Infrared Signal Guidance method and the like. However, these existing methods suffer from low precision and cumbersome marker arrangement, and lack universality because each application must be specially configured according to the specific requirements of the actual environment. The invention application published as CN 101109809A discloses a positioning device, system and method based on a direction-control photosensitive array, in which real-time positioning of moving objects in a house or a small area is realized by a sine-theorem calculation using three infrared signal emitters fixed in the same plane and a signal receiver mounted on the robot device. However, such a method can only realize real-time positioning of the robot; its calculation accuracy is low, and it cannot delimit a movement boundary.
  • SUMMARY OF INVENTION
  • In view of the deficiencies in the prior art, the object of the present invention is to provide a self-moving robot movement boundary delimiting method which achieves regional division by distance measurement and positioning based on stationary base stations, and which has obvious advantages over the prior art in terms of both accuracy and convenience.
  • The object of the present invention is achieved by the following technical solutions.
  • A self-moving robot movement boundary delimiting method comprises the following steps:
  • step 100: setting up three or more base stations in a movement area of a self-moving robot, and establishing a coordinate system;
  • step 200: artificially planning a movement path in the movement area of the self-moving robot, gathering sample points on the path, and determining the coordinates of the sample points in the coordinate system; and
  • step 300: delimiting a boundary according to the coordinates of the gathered sample points, and setting the self-moving robot to work inside or outside the boundary.
  • In step 100, the coordinate system is established by using one of the base stations as the origin, and the distances between the respective base stations are calculated by measuring the signal transmission time between them, thereby determining the coordinates of the respective base stations in the coordinate system.
  • In step 200, determining the coordinates of the sample points specifically comprises calculating the coordinates of the sample points in the coordinate system by measuring the signal transmission time between the self-moving robot and the respective base stations; methods for the calculation include the Geometric Positioning method, the Least Squares method and the Time Difference Of Arrival method. In step 200, the artificially planned movement path may be implemented in various ways, specifically including: a path formed by a user controlling the self-moving robot to move via an interactive device; or a path formed by moving a positioning device provided for the self-moving robot through the movement area after the positioning device has been detached from the self-moving robot by the user.
  • More specifically, the gathering of the sample points in step 200 is either an interval gathering performed automatically at a preset time interval while the self-moving robot is being moved, or a random gathering performed manually by the user.
  • The present invention establishes the coordinate system by setting up the base stations. The coordinate system may be a plane coordinate system or a three-dimensional coordinate system. In the different coordinate systems, the shapes of the delimited boundaries may differ.
  • Specifically, the coordinate system in the step 100 is a plane coordinate system established using three base stations, and a plane in which the plane coordinate system is located is coplanar with the movement area of the self-moving robot.
  • The boundary in the step 300 is an open or closed line formed by the sample points.
  • The coordinate system in the step 100 is a three-dimensional coordinate system established using four base stations.
  • The step 300 specifically comprises vertically or non-vertically projecting a set of the gathered sample points onto a plane, in which the movement area of the self-moving robot is located, to form mapped points, and the boundary is an open or closed line formed by connecting the mapped points.
  • The boundary in the step 300 is a plane determined using three sample points, or a plane fitted out using three or more sample points.
  • The boundary in the step 300 is a surface of a three-dimensional space constructed using a plurality of sample points by interpolating or fitting the sample points into a standard three-dimensional shape or a combination of standard three-dimensional shapes.
  • The standard three-dimensional shape is cube, cuboid, sphere or triangular pyramid.
  • In sum, the present invention delimits a movement boundary by distance measurement and positioning based on stationary base stations, and has obvious advantages over the prior art in terms of both accuracy and convenience.
  • The technical solutions of the present invention will be described in detail with reference to the accompanying drawings and the specific embodiments.
  • DESCRIPTION OF ATTACHED DRAWINGS
  • FIG. 1 is a schematic diagram of a plane coordinate system established according to the present invention;
  • FIG. 2 is a schematic diagram of a first embodiment of the present invention;
  • FIG. 3 is a schematic diagram of a second embodiment of the present invention;
  • FIG. 4 is a schematic diagram of a third embodiment of the present invention;
  • FIG. 5 is a schematic diagram of a fourth embodiment of the present invention;
  • FIG. 6 is a schematic diagram of a fifth embodiment of the present invention;
  • FIG. 7 is a schematic diagram of a sixth embodiment of the present invention; and
  • FIG. 8 is a schematic diagram of a seventh embodiment of the present invention.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • The present invention provides a self-moving robot movement boundary delimiting method which delimits a movement boundary by distance measurement and positioning based on stationary base stations. Specifically, such an automatic robot positioning system comprises a self-moving robot (MR) and three or more base stations (BS). The self-moving robot and the base stations are each provided with a wireless signal transceiver. In order to ensure reliability of measurement, the wireless signal to be transmitted may be infrared, ultrasonic, laser, electromagnetic wave, etc., whose transmission speed k is known. In normal operation, the self-moving robot and the base stations transmit wireless signals, receive the signals from each other and measure the transmission time t of the signals. The distances L between the base stations and the distances S between the self-moving robot and the respective base stations can then be calculated as k×t.
  • FIG. 1 is a schematic diagram of a plane coordinate system established according to the present invention. As shown in FIG. 1, the plane coordinate system is established as follows. First, according to the principle that three points determine a plane, a plane is determined using the three base stations BS1, BS2 and BS3, and a coordinate system is established in that plane. The first base station BS1 is the origin (0, 0) of the coordinate system; the straight line on which the first base station BS1 and the second base station BS2 are located may be set as the X-axis, while the straight line perpendicular to the X-axis through the origin is the Y-axis. With the formula k×t above, the relative distances L between the respective base stations are calculated, thereby obtaining the coordinates of each base station in the plane coordinate system.
  • The above method of establishing the coordinate system is relatively simple. In practice, it is not necessary to set one of the base stations as the origin and to determine the X-axis using the first and second base stations. For example, under the assumption that the coordinates of the first base station are (X1, Y1), the coordinates of the second base station are (X1 + L1×cos A, Y1 + L1×sin A), where L1 is the distance between the first and second base stations, and A is the angle between the line connecting the first and second base stations and the X-axis. X1, Y1 and A may be selected arbitrarily to determine the coordinate system. Once the coordinate system is established, the coordinates of the respective base stations can be determined.
  • Of course, a plane coordinate system can be established using three base stations and a three-dimensional coordinate system can be established using four base stations. Further, it is to be noted that if the plane coordinate system is established using three base stations, it is necessary that the three base stations are not on the same line. In addition, the number of the base stations to be set up may be increased so as to improve the accuracy of calculation.
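  • As an illustration of this setup step, the sketch below computes the base station coordinates from the three pairwise distances obtained via k×t, with BS1 at the origin and BS2 on the X-axis. The function name and layout are assumptions for illustration, not part of the patent.

```python
import math

def base_station_coords(L12, L13, L23):
    """Place BS1 at the origin and BS2 on the X-axis, then locate BS3 from the
    three pairwise distances via the law of cosines (assumed helper, for
    illustration only)."""
    bs1 = (0.0, 0.0)
    bs2 = (L12, 0.0)
    # Angle at BS1 between the BS1->BS2 and BS1->BS3 directions.
    cos_a = (L12**2 + L13**2 - L23**2) / (2 * L12 * L13)
    cos_a = max(-1.0, min(1.0, cos_a))            # guard against rounding error
    a = math.acos(cos_a)
    bs3 = (L13 * math.cos(a), L13 * math.sin(a))  # BS3 chosen in the +Y half-plane
    return bs1, bs2, bs3
```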
  • In conjunction with FIG. 1, the distances S between the self-moving robot and the respective base stations are obtained by calculation. For example, in the case of the plane coordinate system established using three points, S1 is calculated from the measured time t1 of the signal from the self-moving robot to the first base station and the known transmission speed. S2 and S3 can be calculated in a similar manner. As shown in FIG. 1, since the coordinates of the first base station BS1 are (0, 0) and the distance between the first base station BS1 and the second base station BS2 is L1, the coordinates of the second base station BS2 are (L1, 0). The angle A can be calculated from S1, S2 and L1, and in turn the coordinates of the MR can be calculated using S3. The calculation method used above may be referred to as the Geometric Positioning method.
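  • A minimal sketch of this Geometric Positioning step is given below, assuming BS1 at (0, 0) and BS2 at (L1, 0): the two range circles around BS1 and BS2 intersect in at most two points, and the measured range S3 to BS3 selects the correct one. The helper name and arguments are illustrative assumptions.

```python
import math

def locate_robot(S1, S2, S3, L1, bs3):
    """Intersect the range circles around BS1 (0, 0) and BS2 (L1, 0), then use
    the measured range S3 to BS3 to resolve the two-fold ambiguity
    (illustrative sketch, not the patent's literal procedure)."""
    x = (S1**2 - S2**2 + L1**2) / (2 * L1)
    y = math.sqrt(max(0.0, S1**2 - x**2))
    candidates = [(x, y), (x, -y)]            # mirror images about the X-axis
    # Keep the candidate whose distance to BS3 best matches the measured S3.
    return min(candidates,
               key=lambda p: abs(math.hypot(p[0] - bs3[0], p[1] - bs3[1]) - S3))
```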
  • Further, the coordinates of the MR can also be calculated by the Least Squares method according to the following formula:

  • (x − x1)^2 + (y − y1)^2 = r1^2, with r1 = t1 × k,
  • wherein the coordinates of the first base station BS1 are (x1, y1), the coordinates of the self-moving robot are (x, y), t1 is the transmission time of the signal from the self-moving robot to the first base station, and r1 is the distance from the self-moving robot to the first base station. Similarly, the formulas corresponding to the other two base stations can be obtained. The values of x and y (that is, the coordinates of MR) can be found once t1, t2 and t3 are measured.
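  • As an illustrative sketch (an assumption about one possible implementation, not the patent's prescribed code), the range equations can be linearized by subtracting the first equation from the others and then solved for (x, y) with a standard least-squares routine:

```python
import numpy as np

def least_squares_position(stations, ranges):
    """Estimate (x, y) from range measurements to three or more base stations.
    stations: array of shape (n, 2); ranges: array of shape (n,) with r_i = t_i * k.
    Subtracting the first range equation from the others removes the quadratic
    terms in x and y, leaving a linear system A @ [x, y] = b."""
    stations = np.asarray(stations, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    A = 2 * (stations[1:] - stations[0])          # rows: [2(xi - x0), 2(yi - y0)]
    b = (ranges[0]**2 - ranges[1:]**2
         + np.sum(stations[1:]**2, axis=1) - np.sum(stations[0]**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position                               # array([x, y])
```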
  • In addition, a Time Difference Of Arrival (i.e., “TDOA” for short) method may be used to determine the coordinates of the MR.
  • The first base station is assumed to be farther from the MR than the second base station, and the formula is written out as below:

  • √((x − x1)^2 + (y − y1)^2) − √((x − x2)^2 + (y − y2)^2) = (t1 − t2) × k,
  • wherein, the coordinates of the first base station are (x1, y1), the coordinates of the second base station are (x2, y2), the coordinates of the self-moving robot are (x, y), and t1 and t2 are the transmission time of the signal from the self-moving robot to the first base station and the second base station, respectively.
  • Similarly, the remaining two formulas can be written out. Once t1, t2 and t3 are measured, the values of x and y (that is, the coordinates of the MR) can be found.
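  • As an illustrative sketch of the TDOA approach (an assumption about one possible numerical solution, which the patent does not prescribe), the hyperbolic range-difference equations can be solved with a standard nonlinear least-squares routine:

```python
import numpy as np
from scipy.optimize import least_squares

def tdoa_position(stations, arrival_times, k, initial_guess=(0.0, 0.0)):
    """Solve for (x, y) from time-difference-of-arrival measurements.
    stations: (n, 2) base station coordinates; arrival_times: (n,) measured times;
    k: known signal transmission speed."""
    stations = np.asarray(stations, dtype=float)
    arrival_times = np.asarray(arrival_times, dtype=float)

    def residuals(p):
        d = np.hypot(stations[:, 0] - p[0], stations[:, 1] - p[1])  # ranges to all BS
        # Modelled minus measured range differences, relative to the first base station.
        return (d[1:] - d[0]) - (arrival_times[1:] - arrival_times[0]) * k

    return least_squares(residuals, x0=np.asarray(initial_guess, dtype=float)).x
```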
  • The three methods described above can locate the robot as it moves. In order to obtain the boundary desired by the user, the MR must first be manipulated to move so that sample points P can be gathered along the movement path in advance. Specifically, the user may walk while holding the MR, or while holding a positioning device that has been detached from the MR and is equipped with a signal transceiver, or may control the MR to move via an interactive device. The sample points should be gathered at intervals during the movement. The user may set, via the interactive device, the time interval at which the MR automatically gathers the sample points, or may manually operate the corresponding function keys to gather the sample points.
  • After the sample points P are obtained, they may be connected according to the boundary delimiting mode preset on the interactive device. The boundary delimiting mode refers to the connection mode of the sample points. For example, the sample points may be connected sequentially by straight lines or curves to form a boundary, or fitted into a curved boundary, or the first and last points may be connected to form a closed boundary, or the sample points may be used to obtain a straight boundary whose two ends can be extended infinitely. The boundary delimiting modes may be designed in advance and programmed into the MR for easy selection by the user. The interactive device comprises a selector button or selection screen provided on the surface of the MR, a remote control provided for the MR, or a mobile terminal (e.g., a mobile phone, a tablet, etc.) that communicates with the MR via Bluetooth or Wi-Fi.
  • After the boundary is delimited using the sample points, the MR is programmed and set to be prohibited from crossing the boundary, so that it can operate inside or outside the delimited boundary.
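  • One way to enforce such a prohibition in software, sketched below under the assumption that the boundary is stored as an ordered list of sample or mapped points, is to reject any planned move whose straight-line path intersects a boundary segment; the function names are illustrative, not the patent's.

```python
def _ccw(a, b, c):
    """Sign of the cross product (b - a) x (c - a): positive if c lies to the left of a->b."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, q1, q2):
    """True if segment p1-p2 strictly crosses segment q1-q2."""
    d1, d2 = _ccw(q1, q2, p1), _ccw(q1, q2, p2)
    d3, d4 = _ccw(p1, p2, q1), _ccw(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def move_allowed(current, target, boundary):
    """Reject a planned move if the straight path current->target crosses any
    boundary segment. `boundary` is the ordered list of boundary points;
    repeat the first point at the end for a closed graphic."""
    return not any(segments_cross(current, target, boundary[i], boundary[i + 1])
                   for i in range(len(boundary) - 1))
```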
  • First Embodiment
  • FIG. 2 is a schematic diagram of the first embodiment of the present invention. As shown in FIG. 2, in the present embodiment, the self-moving robot movement boundary delimiting method mainly comprises the following steps:
  • first, determination of plane coordinate system: placing three base stations BS in the movement area A of the self-moving robot MR, such that the three base stations BS are assuredly not on the same line; and determining a plane coordinate system using the three base stations BS, wherein the plane coordinate system is located in the movement area A of the self-moving robot.
  • second, obtainment of sample points: obtaining sample points P by MR automatic gathering or by artificial random gathering; and calculating coordinates of the respective sample points by Geometric Positioning method, Least Squares method or Time Difference Of Arrival (i.e., “TDOA” for short) method.
  • last, completion of boundary delimiting: delimiting a straight or curved boundary according to the gathered sample points, and then achieving a regional division by setting the MR to be prohibited from crossing the boundary. In the embodiment shown in FIG. 2, the curve X is determined by the four gathered sample points P1 to P4. After the MR is set to be prohibited from crossing the curve X, the MR moves only on one side of the curve X and does not cross it to the other side, as shown by the plurality of arrowed straight lines Y in FIG. 2, which indicate the actual movement of the MR.
  • Of course, if there are additional obstacles such as walls in the movement area A, these obstacles may be combined with the curve X to achieve a completely separated regional division. Since the two ends of the curve X do not close against the boundary of the movement area A, it is preferable to connect the curve X to the obstacles, or to add other design functions, so that the regional division is more complete. In the case of a straight boundary determined from a number of sample points, the system may assume that the straight line extends infinitely from its endpoints until it intersects the boundary of the movement area A, thereby forming a closed divided region.
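  • For the straight-boundary mode just mentioned, a simple sketch (assumed helpers, not the patent's code) fits an infinite line through the sample points and keeps the MR on the same side as a reference position in its allowed region:

```python
import numpy as np

def fit_line(points):
    """Fit an infinite straight boundary a*x + b*y + c = 0 through the sample
    points by total least squares (SVD of the centered points)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]                                # normal vector of the fitted line
    c = -(a * centroid[0] + b * centroid[1])
    return a, b, c

def same_side(line, p, reference):
    """True if point p lies on the same side of the line as the reference point
    (e.g., the MR's starting position inside its allowed region)."""
    a, b, c = line
    return (a * p[0] + b * p[1] + c) * (a * reference[0] + b * reference[1] + c) > 0
```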
  • Second Embodiment
  • FIG. 3 is a schematic diagram of the second embodiment of the present invention. As shown in FIG. 3, in the present embodiment, the self-moving robot movement boundary delimiting method mainly comprises the following steps:
  • first, determination of plane coordinate system: placing three base stations BS in the movement area A of the self-moving robot MR, such that the three base stations BS are assuredly not on the same line, and thus a plane can be determined using the three base stations BS; after the base stations BS are placed, determining a plane coordinate system by the methods described above, wherein the plane coordinate system is located in the movement area A of the self-moving robot.
  • second, obtainment of sample points: obtaining sample points P by MR automatic gathering or artificial random gathering; and calculating coordinates of the respective sample points by Geometric Positioning method, Least Squares method or Time Difference Of Arrival method.
  • last, completion of boundary delimiting: delimiting a closed graphic according to the set of gathered sample points, wherein the delimiting method comprises straight-line or curve interpolation or fitting; and, after the closed graphic is determined, dividing the movement area into an in-graphic area and an out-graphic area, thereby achieving the division of the movement area of the self-moving robot. In the embodiment shown in FIG. 3, a closed graphic M is determined using the four gathered sample points P1 to P4. After the MR is set to be prohibited from crossing the closed graphic M, the MR moves only inside or outside the closed graphic M and does not cross it, as shown by the plurality of arrowed straight lines N1 and N2 in FIG. 3, which indicate the actual movement of the MR.
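  • Deciding whether the MR currently lies in the in-graphic or the out-graphic area can be done with a standard ray-casting test; the sketch below is an illustrative assumption about one such implementation:

```python
def inside_closed_graphic(point, polygon):
    """Ray-casting test: is `point` inside the closed graphic formed by `polygon`
    (an ordered list of sample or mapped points, without repeating the first point)?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast to the right of `point`.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```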
  • Further, the self-moving robot may be programmed and set to allow the self-moving robot to complete tasks for a certain time or a certain distance within the delimited boundary, and then leave the delimited boundary to continue other tasks.
  • Third Embodiment
  • FIG. 4 is a schematic diagram of the third embodiment of the present invention. As shown in FIG. 4, in the present embodiment, the self-moving robot movement boundary delimiting method mainly comprises the following steps:
  • first, determination of three-dimensional coordinate system: placing four base stations BS in the movement area A of the self-moving robot MR, such that the base stations BS form a three-dimensional space; determining a three-dimensional coordinate system after the base stations BS are placed, wherein when the MR is located in the three-dimensional coordinate system, the coordinates of the MR can be calculated according to a signal transmission time.
  • second, obtainment of sample points: obtaining sample points P by MR automatic gathering or artificial random gathering; and calculating the coordinates of the respective sample points by Geometric Positioning method, Least Squares method or Time Difference Of Arrival method.
  • last, completion of boundary delimiting, as shown in FIG. 4: vertically or non-vertically projecting the set of gathered sample points P onto the XY plane or another plane, wherein the projection plane is the plane in which the movement area A of the self-moving robot is located; and determining a boundary using the mapped points P1′ to P4′ after the sample points P1 to P4 gathered in the space are projected onto the plane coordinate system XOY, wherein the boundary may consist of multiple straight lines connecting the mapped points or may be an envelope curve. As shown in FIG. 4, in the present embodiment the boundary is the curve Q. The regional division is then achieved by setting the MR to be prohibited from crossing the curve Q. As shown by the plurality of arrowed straight lines Z in FIG. 4, which indicate the actual movement of the MR, after the MR is set to be prohibited from crossing the curve Q, the MR moves only on one side of the curve Q and does not cross it to the other side. Similar to the first embodiment, if there are additional obstacles such as walls in the movement area A, the obstacles can be combined with the curve Q to achieve a completely separated regional division.
  • Therefore, in the present embodiment, the sample points are gathered while the self-moving robot moves in space. The gathered spatial sample points are then projected onto the movement area A of the self-moving robot to form mapped points, a straight line or a curve is determined using the mapped points, and the movement area is divided by prohibiting the MR from crossing the boundary.
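  • For the vertical projection case, the mapping step amounts to dropping the height coordinate, assuming for illustration that the movement area lies in the plane z = 0:

```python
import numpy as np

def project_vertically(points_3d):
    """Vertical (orthogonal) projection of spatial sample points onto the plane
    z = 0: keep (x, y) and discard z."""
    pts = np.asarray(points_3d, dtype=float)
    return pts[:, :2]          # mapped points P' = (x, y)
```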
  • Fourth Embodiment
  • FIG. 5 is a schematic diagram of the fourth embodiment of the present invention. As shown in FIG. 5 and in comparison with FIG. 4, in the present embodiment, the self-moving robot movement boundary delimiting method is substantially the same as that of the third embodiment in that: in both methods, the sample points are gathered when the self-moving robot moves in the space; hereafter, the gathered spatial sample points are projected onto the movement area A of the self-moving robot to form mapped points, and a straight line or a curve is determined with the mapped points; and then, the movement area is divided in the manner of prohibiting from passing across the boundary. The two methods only differ in the graphics formed using the mapped points. In the third embodiment, the graphic is formed as the non-closed curve Q, whereas in the present embodiment, the graphic is formed as a closed graphic H.
  • The other technical contents of the present embodiment are the same as those of the third embodiment, and their detailed description is omitted.
  • Fifth Embodiment
  • FIG. 6 is a schematic diagram of the fifth embodiment of the present invention. As shown in FIG. 6 and in comparison with FIG. 5, the self-moving robot movement boundary delimiting method of the present embodiment is substantially the same as that of the fourth embodiment: in both methods, the sample points are gathered while the self-moving robot moves in space; the gathered spatial sample points are then projected onto the movement area A of the self-moving robot to form mapped points, a straight line or a curve is determined with the mapped points, and the movement area is divided by prohibiting the MR from crossing the boundary. The two methods differ only in the projection direction. In the fourth embodiment the projection is vertical, whereas in the present embodiment it is non-vertical. In the case of the non-vertical projection, the projection angle and the projection direction must be preset in the processor program, and the coordinates finally projected onto the plane are then calculated. A closed pattern H′ is formed by the mapped points.
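  • A sketch of such a non-vertical projection is given below, where a preset direction vector stands in for the preset projection angle and direction; this parameterization is an illustrative assumption, not the patent's exact formulation.

```python
import numpy as np

def project_obliquely(points_3d, direction):
    """Non-vertical projection onto the plane z = 0 along a preset direction
    vector (dx, dy, dz) with dz != 0: each spatial point is slid along the
    direction until it reaches z = 0."""
    pts = np.asarray(points_3d, dtype=float)
    d = np.asarray(direction, dtype=float)
    t = -pts[:, 2] / d[2]                     # parameter at which each ray meets z = 0
    return pts[:, :2] + np.outer(t, d[:2])    # mapped points on the movement plane
```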
  • The other technical contents of the present embodiment are the same as those of the fourth embodiment, and their detailed description is omitted.
  • Sixth Embodiment
  • FIG. 7 is a schematic diagram of the sixth embodiment of the present invention. As shown in FIG. 7, in the present embodiment, the self-moving robot movement boundary delimiting method mainly comprises the following steps:
  • first, determination of three-dimensional coordinate system: placing four base stations BS in the movement area A of the self-moving robot MR, such that the base stations BS form a three-dimensional space; and determining a three-dimensional coordinate system after the base stations BS are placed, wherein when the MR is located in the three-dimensional coordinate system, the coordinates of the MR can be calculated from the signal transmission times.
  • second, obtainment of sample points: obtaining sample points P by MR automatic gathering or artificial random gathering; and calculating the coordinates of the respective sample points by Geometric Positioning method, Least Squares method or Time Difference Of Arrival method.
  • last, completion of boundary delimiting, as shown in FIG. 7: under the assumption that there are four sample points P1-P4 in the three-dimensional space, determining a plane U using the three sample points P1, P2 and P3, or fitting a plane U from three or more sample points; and achieving a regional division by prohibiting the MR from crossing the plane. As shown in FIG. 7, after the boundary is delimited, the MR can only move below the plane U and cannot cross the plane U to move above it. It is to be noted that, when this method is applied to a ground-moving robot, the restriction imposed on the robot by the plane U reduces to the intersection line of the plane U and the ground.
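  • A plane boundary of this kind can be fitted and enforced as sketched below (a least-squares plane fit via SVD; the helper names and the one-sided test are illustrative assumptions):

```python
import numpy as np

def fit_plane(points_3d):
    """Fit a plane n . x + d = 0 through three or more spatial sample points
    (with exactly three points this is the plane they determine)."""
    pts = np.asarray(points_3d, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                          # direction of least variance = plane normal
    d = -normal.dot(centroid)
    return normal, d

def below_plane(point, plane):
    """True if the point lies on the negative side of the plane; the MR is kept
    on one side by rejecting moves whose sign differs from its starting side."""
    normal, d = plane
    return normal.dot(np.asarray(point, dtype=float)) + d < 0
```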
  • The method of the present embodiment is applicable to both ground self-moving robots and flying self-moving robots.
  • Seventh Embodiment
  • FIG. 8 is a schematic diagram of the seventh embodiment of the present invention. As shown in FIG. 8, in the present embodiment, the self-moving robot movement boundary delimiting method mainly comprises the following steps:
  • first, determination of three-dimensional coordinate system: placing four base stations BS in the movement area A of the self-moving robot MR, such that the base stations BS form a three-dimensional space; determining a three-dimensional coordinate system after the base stations BS are placed, wherein when the MR is located in the three-dimensional coordinate system, the coordinates of the MR can be calculated according to a signal transmission time.
  • second, obtainment of sample points: obtaining sample points P, gathered either automatically by the MR or at random under user control; and calculating the coordinates of the respective sample points by the Geometric Positioning method, the Least Squares method or the Time Difference Of Arrival method.
  • last, completion of boundary delimiting as shown in FIG. 8: constructing a three-dimensional space from a plurality of sample points P1 to P9 by interpolating or fitting the sample points into a combination of standard three-dimensional shapes (in FIG. 8, a combination of a cuboid C and a triangular pyramid D); and achieving a regional division by prohibiting the self-moving robot from moving outside the range defined by the three-dimensional space. As shown in FIG. 8, after the boundary is delimited, the MR can move only inside, or only outside, the three-dimensional space and cannot cross its surface. The three-dimensional space may be constructed, by interpolating or fitting the sample points, as a single standard three-dimensional shape such as a cube, cuboid, sphere or triangular pyramid, or as a combination of two or more standard three-dimensional shapes (a short illustrative sketch follows this embodiment).
  • The method of the present embodiment is mainly applicable to flying self-moving robots.
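  • The sketch referred to in the preceding embodiment is given below. It is a minimal Python illustration only: for simplicity the standard three-dimensional shapes are assumed to be axis-aligned cuboids fitted as bounding boxes of groups of sample points, whereas the patent also allows cubes, spheres and triangular pyramids; all names are hypothetical.

```python
# Minimal sketch: confine a flying robot to the union of standard shapes
# fitted from the sample points. For simplicity only axis-aligned cuboids
# are shown; a sphere or triangular pyramid would need its own test.

def cuboid_from_points(points):
    """Axis-aligned bounding cuboid of the given sample points."""
    xs, ys, zs = zip(*points)
    return (min(xs), max(xs)), (min(ys), max(ys)), (min(zs), max(zs))

def inside_cuboid(cuboid, p):
    return all(lo <= v <= hi for (lo, hi), v in zip(cuboid, p))

def inside_boundary(shapes, p):
    """True if p lies inside at least one fitted shape."""
    return any(inside_cuboid(s, p) for s in shapes)

# Two groups of sample points, fitted into two cuboids standing in for the
# cuboid C and the triangular pyramid D of FIG. 8
shape_c = cuboid_from_points([(0, 0, 0), (4, 3, 2)])
shape_d = cuboid_from_points([(4, 0, 0), (6, 3, 4)])
boundary = [shape_c, shape_d]

# The robot only moves to targets that stay inside the composite space
target = (5.0, 1.0, 3.0)
move_allowed = inside_boundary(boundary, target)   # True
```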
  • As can be seen from the above seven embodiments, in the present invention, a plurality of base stations are placed in the movement area of the self-moving robot, the coordinates of the self-moving robot are determined by measuring the distances from the self-moving robot to the base stations, and a boundary is thereby delimited. The areas divided by the boundary may be set as a working area or a non-working area. In this setting, the working area may be the default area delimited by the self-moving robot, or may be selected by the user. In the first and second embodiments, the sampling is performed along a plane movement trajectory and the boundary is delimited on the plane. In the third, fourth and fifth embodiments, the sampling is performed along a spatial movement trajectory, a vertical or non-vertical projection forms the mapped points on the plane, and the boundary is then delimited using the mapped points. In the sixth and seventh embodiments, the sampling is performed along a spatial movement trajectory and the boundary is delimited in the space.
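  • The distance-based positioning summarized above can be illustrated with a least-squares multilateration sketch. The Python snippet below is an illustration only: it assumes the base-station coordinates are already known and that the measured signal transmission times have been converted to ranges (range = transmission time × propagation speed), and it uses a standard linearized least-squares formulation rather than any specific formulation from the patent; all names are hypothetical.

```python
# Minimal sketch of least-squares multilateration: recover a point's
# coordinates from its distances to base stations with known coordinates.
import numpy as np

def locate(stations, ranges):
    """Solve for (x, y, z) given four or more stations and measured ranges."""
    stations = np.asarray(stations, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    # Linearize by subtracting the first station's equation from the rest:
    # |p - s_i|^2 - |p - s_0|^2 = r_i^2 - r_0^2
    A = 2.0 * (stations[1:] - stations[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(stations[1:] ** 2, axis=1)
         - np.sum(stations[0] ** 2))
    solution, *_ = np.linalg.lstsq(A, b, rcond=None)
    return solution

# Four base stations BS forming a three-dimensional coordinate system
bs = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
true_p = np.array([3.0, 4.0, 1.0])
r = [np.linalg.norm(true_p - np.array(s)) for s in bs]

print(locate(bs, r))   # approximately [3. 4. 1.]
```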
  • In sum, the present invention provides a self-moving robot movement boundary delimiting method which achieves regional division by distance measurement and positioning based on stationary base stations, and which has obvious advantages in terms of both accuracy and convenience compared to the prior art.

Claims (18)

What is claimed is:
1. A self-moving robot movement boundary determining method, comprising:
obtaining positions of sample points obtained along a desired boundary of a self-moving robot by moving a positioning device coupling with the self-moving robot along the desired boundary; and
determining a movement boundary according to the positions of the sample points, wherein the self-moving robot is set to work inside or outside the movement boundary.
2. The method according to claim 1, wherein the positioning device coupling with the self-moving robot is moved along the desired boundary by:
controlling the self-moving robot with the positioning device mounted thereon to move along the desired boundary by a user via an interactive device.
3. The method according to claim 1, wherein the positioning device coupling with the self-moving robot is moved along the desired boundary by:
moving the self-moving robot, with the positioning device mounted thereon, along the desired boundary while the self-moving robot is held by a user.
4. The method according to claim 1, wherein the positioning device coupling with the self-moving robot is moved along the desired boundary by:
moving the positioning device detached from the self-moving robot along the desired boundary.
5. The method according to claim 1, wherein the obtaining the positions of the sample points comprises:
obtaining the positions of the sample points according to a position of a base station, wherein the base station is in communication with the positioning device coupling with the self-moving robot, and a position of the base station is known.
6. The method according to claim 5, wherein the obtaining the positions of the sample points according to a position of a base station comprises:
setting up at least three base stations in a movement area of the self-moving robot, to establish a coordinate system according to relative locations between the at least three base stations; and
determining coordinates of the sample points in the coordinate system to obtain the positions of sample points.
7. The method according to claim 6, wherein the determining the coordinates of the sample points in the coordinate system comprises:
calculating the coordinates of the sample points in the coordinate system by measuring signal transmission time between the positioning device coupling with the self-moving robot and the respective base stations, according to Geometric Positioning method, Least Squares method or Time Difference Of Arrival method.
8. The method according to claim 6, wherein the coordinate system is established by using one of the at least three base stations as an origin, and calculating distances between the respective base stations by measuring signal transmission time between the respective base stations, whereby determining the coordinates of the respective base stations in the coordinate system.
9. The method according to claim 6, wherein the coordinate system is a plane coordinate system established using three base stations, and a plane in which the plane coordinate system is located is coplanar with the movement area of the self-moving robot.
10. The method according to claim 6, wherein the coordinate system is a three-dimensional coordinate system established using four base stations.
11. The method according to claim 10, wherein the determining a movement boundary according to the positions of the sample points comprises:
vertically or non-vertically projecting the sample points onto a plane, in which the movement area of the self-moving robot is located, to form mapped points, wherein the boundary is formed by connecting the mapped points.
12. The method according to claim 10, wherein the movement boundary is a surface of a three-dimensional space constructed using the sample points by interpolating or fitting the sample points into a standard three-dimensional shape or a combination of standard three-dimensional shapes.
13. The method according to claim 12, wherein the standard three-dimensional shape is cube, cuboid, sphere or triangular pyramid.
14. The method according to claim 1, wherein the sample points are obtained at a preset time interval performed automatically by the self-moving robot or obtained at a random interval controlled by a user.
15. The method according to claim 1, wherein the movement boundary is an open or closed line formed by the sample points.
16. A self-moving robot movement boundary determining method, comprising:
obtaining positions of sample points obtained along a desired boundary of a self-moving robot by moving the self-moving robot along the desired boundary; and
determining a movement boundary according to the positions of the sample points, wherein the self-moving robot is set to work inside or outside the movement boundary.
17. The method according to claim 16, wherein the obtaining the positions of the sample points comprises:
obtaining the positions of the sample points according to a position of a base station, wherein the base station is in communication with the self-moving robot, and a position of the base station is known.
18. The method according to claim 17, wherein the obtaining the positions of the sample points according to a position of a base station comprises:
setting up at least three base stations in a movement area of the self-moving robot, to establish a coordinate system according to relative locations between the at least three base stations; and
determining coordinates of the sample points in the coordinate system to obtain the positions of sample points.
US16/691,419 2014-09-26 2019-11-21 Self-moving robot movement boundary determining method Abandoned US20200089235A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/691,419 US20200089235A1 (en) 2014-09-26 2019-11-21 Self-moving robot movement boundary determining method

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201410504400.6 2014-09-26
CN201410504400.6A CN105446350B (en) 2014-09-26 2014-09-26 Self-movement robot moves boundary demarcation method
PCT/CN2015/090736 WO2016045617A1 (en) 2014-09-26 2015-09-25 Self-moving robot movement boundary drawing method
US201715514207A 2017-03-24 2017-03-24
US16/691,419 US20200089235A1 (en) 2014-09-26 2019-11-21 Self-moving robot movement boundary determining method

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/CN2015/090736 Continuation WO2016045617A1 (en) 2014-09-26 2015-09-25 Self-moving robot movement boundary drawing method
US15/514,207 Continuation US10520950B2 (en) 2014-09-26 2015-09-25 Self-moving robot movement boundary delimiting method

Publications (1)

Publication Number Publication Date
US20200089235A1 true US20200089235A1 (en) 2020-03-19

Family

ID=55556670

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/514,207 Active 2036-03-05 US10520950B2 (en) 2014-09-26 2015-09-25 Self-moving robot movement boundary delimiting method
US16/691,419 Abandoned US20200089235A1 (en) 2014-09-26 2019-11-21 Self-moving robot movement boundary determining method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/514,207 Active 2036-03-05 US10520950B2 (en) 2014-09-26 2015-09-25 Self-moving robot movement boundary delimiting method

Country Status (4)

Country Link
US (2) US10520950B2 (en)
EP (3) EP3889725B1 (en)
CN (1) CN105446350B (en)
WO (1) WO2016045617A1 (en)

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105446350B (en) * 2014-09-26 2018-05-29 科沃斯机器人股份有限公司 Self-movement robot moves boundary demarcation method
CN105527961A (en) * 2014-09-30 2016-04-27 科沃斯机器人有限公司 Self-propelled surface-traveling robot system and method for returning to primary charging base
CN104333498B (en) * 2014-11-28 2018-10-02 小米科技有限责任公司 Control the method and device of smart home device
CN109270936A (en) * 2016-04-15 2019-01-25 苏州宝时得电动工具有限公司 Automatic working system and its control method
EP4276645A3 (en) * 2016-12-15 2024-01-10 Positec Power Tools (Suzhou) Co., Ltd. State detection method for an automatic working system and mobile station
CN109392086A (en) * 2017-08-08 2019-02-26 深圳市润安科技发展有限公司 A kind of method and system of the more base station locations of multizone
CN109591008A (en) * 2017-11-18 2019-04-09 广州科语机器人有限公司 The area of safety operaton of mobile robot determines method
CN108335302B (en) * 2018-01-26 2021-10-08 上海思岚科技有限公司 Region segmentation method and device
CN109077667B (en) * 2018-07-16 2020-12-01 广州俊德信息科技有限公司 Adjusting method and system of cleaning electric appliance, storable medium and cleaning electric appliance
KR102242713B1 (en) 2018-08-03 2021-04-22 엘지전자 주식회사 Moving robot and contorlling method and a terminal
KR102266713B1 (en) 2018-08-03 2021-06-21 엘지전자 주식회사 Lawn mover robot, system of lawn mover robot and controlling method for system of lawn mover robot
KR102242714B1 (en) 2018-08-03 2021-04-22 엘지전자 주식회사 Moving robot and contorlling method and a moving robot system
KR102313754B1 (en) 2018-08-03 2021-10-18 엘지전자 주식회사 Moving robot, system of moving robot and method for moving to charging station of moving robot
KR102238352B1 (en) 2018-08-05 2021-04-09 엘지전자 주식회사 Station apparatus and moving robot system
US11960278B2 (en) 2018-08-05 2024-04-16 Lg Electronics Inc. Moving robot and controlling method thereof
CN109186463B (en) * 2018-09-04 2021-12-07 浙江梧斯源通信科技股份有限公司 Anti-falling method applied to movable robot
CN109084779A (en) * 2018-09-29 2018-12-25 上海思依暄机器人科技股份有限公司 A kind of robot navigation method and system
CN109765885A (en) * 2018-11-21 2019-05-17 深圳市迈康信医用机器人有限公司 The method and its system of Wheelchair indoor automatic Pilot
CN109782225B (en) * 2019-01-18 2021-04-16 杭州微萤科技有限公司 Method for positioning coordinates of base station
CN109672979B (en) * 2019-02-14 2020-12-08 中国联合网络通信集团有限公司 Dual-card communication method and device
CN111836185B (en) * 2019-04-22 2023-10-10 苏州科瓴精密机械科技有限公司 Method, device, equipment and storage medium for determining base station position coordinates
CN110226373A (en) * 2019-06-03 2019-09-13 中国农业大学 Vegetables strain inter-row weeder structure
CN110132270B (en) * 2019-06-13 2021-08-31 深圳汉阳科技有限公司 Positioning method of automatic snow sweeping device
CN113093716B (en) * 2019-12-19 2024-04-30 广州极飞科技股份有限公司 Motion trail planning method, device, equipment and storage medium
CN111067432B (en) * 2020-01-02 2021-09-17 小狗电器互联网科技(北京)股份有限公司 Determination method for charging working area of charging pile of sweeper and sweeper
CN111168679B (en) * 2020-01-09 2023-08-22 上海山科机器人有限公司 Walking robot, method of controlling walking robot, and walking robot system
CN113556679A (en) * 2020-04-24 2021-10-26 苏州科瓴精密机械科技有限公司 Method and system for generating virtual boundary of working area of self-moving robot, self-moving robot and readable storage medium
CN111638487B (en) * 2020-05-07 2022-07-22 中国第一汽车股份有限公司 Automatic parking test equipment and method
CN112057883B (en) * 2020-09-08 2021-11-02 北京北特圣迪科技发展有限公司 Attitude control method for four-hoisting-point flexible cable performance suspension platform
US20220197295A1 (en) * 2020-12-22 2022-06-23 Globe (jiangsu) Co., Ltd. Robotic mower, and control method thereof
US20240231370A9 (en) * 2021-02-10 2024-07-11 Beijing Roborock Innovation Technology Co., Ltd. Map display method and apparatus, medium, and electronic device
CN112959327B (en) * 2021-03-31 2022-07-29 上海电气集团股份有限公司 Robot motion control method, system, electronic device, and storage medium
CN113155023B (en) * 2021-04-02 2023-02-07 甘肃旭盛显示科技有限公司 Method and system for measuring glass warping degree of liquid crystal substrate
CN116188480B (en) * 2023-04-23 2023-07-18 安徽同湃特机器人科技有限公司 Calculation method of AGV traveling path point during ceiling operation of spraying robot

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2254508A (en) * 1991-04-02 1992-10-07 James Collier Mitchinson Vehicle location determining system
US5365516A (en) * 1991-08-16 1994-11-15 Pinpoint Communications, Inc. Communication system and method for determining the location of a transponder unit
US5951610A (en) * 1995-10-31 1999-09-14 Honda Giken Kogyo Kabushiki Kaisha Method of calculating positional relationship of motor vehicle with respect to running path
US5974348A (en) * 1996-12-13 1999-10-26 Rocks; James K. System and method for performing mobile robotic work operations
US6112143A (en) * 1998-08-06 2000-08-29 Caterpillar Inc. Method and apparatus for establishing a perimeter defining an area to be traversed by a mobile machine
US6611738B2 (en) * 1999-07-12 2003-08-26 Bryan J. Ruffner Multifunctional mobile appliance
US6594576B2 (en) * 2001-07-03 2003-07-15 At Road, Inc. Using location data to determine traffic information
GB2386971B (en) * 2002-03-26 2005-11-30 Mcmurtry Ltd Method of operating an automated land maintenance vehicle
KR20030080436A (en) * 2002-04-08 2003-10-17 삼성전자주식회사 Localization apparatus and method of mobile robot
SE526913C2 (en) * 2003-01-02 2005-11-15 Arnex Navigation Systems Ab Procedure in the form of intelligent functions for vehicles and automatic loading machines regarding mapping of terrain and material volumes, obstacle detection and control of vehicles and work tools
KR100520079B1 (en) * 2003-08-01 2005-10-12 삼성전자주식회사 robot system and control method thereof
US20060020370A1 (en) * 2004-07-22 2006-01-26 Shai Abramson System and method for confining a robot
US7026992B1 (en) * 2005-03-31 2006-04-11 Deere & Company Method for configuring a local positioning system
US7456596B2 (en) * 2005-08-19 2008-11-25 Cisco Technology, Inc. Automatic radio site survey using a robot
KR100660945B1 (en) * 2006-09-11 2006-12-26 주식회사 마이크로로보트 Virtual mapping method of absolute address
US8180486B2 (en) * 2006-10-02 2012-05-15 Honda Motor Co., Ltd. Mobile robot and controller for same
CN101109809A (en) 2007-08-17 2008-01-23 张铁军 Positioning device, system and method based on direction control photosensitive array
US8229618B2 (en) * 2008-09-11 2012-07-24 Deere & Company Leader-follower fully autonomous vehicle with operator on side
JP5480799B2 (en) * 2010-12-14 2014-04-23 本田技研工業株式会社 Mobile device, robot and control system thereof
KR101334961B1 (en) * 2011-08-03 2013-11-29 엘지전자 주식회사 Lawn mower robot system and control method for the same
US9471063B2 (en) * 2011-08-11 2016-10-18 Chien Ouyang Robotic lawn mower with network sensors
EP2866980A4 (en) * 2012-06-26 2016-07-20 Husqvarna Ab Detachable user interface for a robotic vehicle
US9497901B2 (en) * 2012-08-14 2016-11-22 Husqvarna Ab Boundary definition system for a robotic vehicle
CN202795052U (en) * 2012-08-29 2013-03-13 科沃斯机器人科技(苏州)有限公司 Automatic movable robot walking range limiting system
CN103076802B (en) * 2012-10-09 2016-01-20 江苏大学 Robot virtual boundary is set up and recognition methods and system
CN103823464A (en) * 2012-11-16 2014-05-28 苏州宝时得电动工具有限公司 Self-driven mobile apparatus
US9516806B2 (en) * 2014-10-10 2016-12-13 Irobot Corporation Robotic lawn mowing boundary determination
US10518407B2 (en) * 2015-01-06 2019-12-31 Discovery Robotics Apparatus and methods for providing a reconfigurable robotic platform
US10328573B2 (en) * 2015-01-06 2019-06-25 Discovery Robotics Robotic platform with teach-repeat mode
US20180361581A1 (en) * 2015-01-06 2018-12-20 Discovery Robotics Robotic platform with following mode
US9526390B2 (en) * 2015-01-20 2016-12-27 Lg Electronics Inc. Robot cleaner
EP3392729B1 (en) * 2015-12-17 2021-10-27 Positec Power Tools (Suzhou) Co., Ltd Auto-movement robot system
CN109270936A (en) * 2016-04-15 2019-01-25 苏州宝时得电动工具有限公司 Automatic working system and its control method
EP3413155B1 (en) * 2017-06-09 2020-02-26 Andreas Stihl AG & Co. KG Method for the detection of at least one section of a limiting edge of a surface to be processed, method for operating an autonomous mobile green area processing robot, detection system and green area processing system
KR102070068B1 (en) * 2017-11-30 2020-03-23 엘지전자 주식회사 Moving Robot and controlling method
US10575699B2 (en) * 2018-01-05 2020-03-03 Irobot Corporation System for spot cleaning by a mobile robot

Patent Citations (84)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4864520A (en) * 1983-09-30 1989-09-05 Ryozo Setoguchi Shape generating/creating system for computer aided design, computer aided manufacturing, computer aided engineering and computer applied technology
US5561742A (en) * 1992-01-28 1996-10-01 Fanuc Ltd. Multiple-robot control and interference prevention method
JPH08128213A (en) * 1994-11-01 1996-05-21 Sekisui House Ltd Construction method using workbench unit
US5788851A (en) * 1995-02-13 1998-08-04 Aksys, Ltd. User interface and method for control of medical instruments, such as dialysis machines
US7233795B1 (en) * 2001-03-19 2007-06-19 Ryden Michael V Location based communications system
US20030003988A1 (en) * 2001-06-15 2003-01-02 Walker Jay S. Method and apparatus for planning and customizing a gaming experience
US20050029029A1 (en) * 2002-08-30 2005-02-10 Aethon, Inc. Robotic cart pulling vehicle
US20070198159A1 (en) * 2006-01-18 2007-08-23 I-Guide, Llc Robotic vehicle controller
US20070282484A1 (en) * 2006-06-01 2007-12-06 Samsung Electronics Co., Ltd. Method, medium and apparatus classifying and collecting area feature information according to a robot's moving path, and a robot controlled by the area features
US20090282710A1 (en) * 2007-08-08 2009-11-19 Johnson Rick D Multi-Function Material Moving Assembly and Method
US20100076599A1 (en) * 2008-09-20 2010-03-25 Steven Jacobs Manually driven determination of a region of interest (roi) or a path of interest (poi) for a robotic device
US20110030163A1 (en) * 2009-08-05 2011-02-10 Karcher N. America, Inc. Method and apparatus for extended use of cleaning fluid in a floor cleaning machine
US20110125324A1 (en) * 2009-11-20 2011-05-26 Baek Sanghoon Robot cleaner and controlling method of the same
US20130021174A1 (en) * 2010-04-09 2013-01-24 Daniella Kurland Facilities management
US8726458B1 (en) * 2011-05-26 2014-05-20 Scott Reinhold Mahr Solar collector washing device
US20140247116A1 (en) * 2011-11-11 2014-09-04 Bar Code Specialties, Inc. (Dba Bcs Solutions) Robotic inventory systems
US20130282224A1 (en) * 2012-04-24 2013-10-24 Mamiya-Op Nequos Co., Ltd. Work machine and components thereof
US20130310982A1 (en) * 2012-05-15 2013-11-21 Kuka Laboratories Gmbh Method For Determining Possible Positions Of A Robot Arm
DE102013212060A1 (en) * 2012-06-25 2014-01-02 Lear Corporation Vehicle remote control system and method
US20190097827A1 (en) * 2013-01-18 2019-03-28 Irobot Corporation Mobile robot providing environmental mapping for household environmental control
US20170094897A1 (en) * 2014-03-31 2017-04-06 Irobot Corporation Autonomous Mobile Robot
US10091930B2 (en) * 2014-03-31 2018-10-09 Irobot Corporation Autonomous mobile robot
US20150271991A1 (en) * 2014-03-31 2015-10-01 Irobot Corporation Autonomous Mobile Robot
US20180352735A1 (en) * 2014-03-31 2018-12-13 Irobot Corporation Autonomous Mobile Robot
EP2945037A2 (en) * 2014-05-15 2015-11-18 iRobot Corporation Autonomous mobile robot confinement system
US20150328775A1 (en) * 2014-05-15 2015-11-19 Irobot Corporation Autonomous Mobile Robot Confinement System
EP3200040B1 (en) * 2014-09-26 2021-08-04 Ecovacs Robotics Co., Ltd. Drawing method of the movement boundary for a self-moving robot
EP3018548A1 (en) * 2014-11-07 2016-05-11 F. Robotics Acquisitions Ltd. Domestic robotic system
CN106662452A (en) * 2014-12-15 2017-05-10 美国 iRobot 公司 Robot lawnmower mapping
US20180116105A1 (en) * 2014-12-22 2018-05-03 Irobot Corporation Robotic Mowing of Separated Lawn Areas
US20160257000A1 (en) * 2015-03-04 2016-09-08 The Johns Hopkins University Robot control, training and collaboration in an immersive virtual reality environment
US11560297B2 (en) * 2015-09-30 2023-01-24 Stertil B.V. Lifting system with indoor positioning system and method therefor
US20210096579A1 (en) * 2016-08-05 2021-04-01 RobArt GmbH Method For Controlling An Autonomous Mobile Robot
US20180173243A1 (en) * 2016-12-16 2018-06-21 Samsung Electronics Co., Ltd. Movable object and method for controlling the same
US20180173244A1 (en) * 2016-12-19 2018-06-21 Samsung Electronics Co., Ltd. Movable object and control method thereof
US20190354246A1 (en) * 2016-12-23 2019-11-21 Lg Electronics Inc. Airport robot and movement method therefor
US20180215039A1 (en) * 2017-02-02 2018-08-02 Brain Corporation Systems and methods for assisting a robotic apparatus
US20180246215A1 (en) * 2017-02-28 2018-08-30 Stmicroelectronics, Inc. Vehicle dynamics obstacle compensation system
US20180257232A1 (en) * 2017-03-13 2018-09-13 Fanuc Corporation Robot system and robot control method
DE112017000135T5 (en) * 2017-03-31 2019-03-14 Komatsu Ltd. working vehicle
US20200130197A1 (en) * 2017-06-30 2020-04-30 Lg Electronics Inc. Moving robot
US20200174490A1 (en) * 2017-07-27 2020-06-04 Waymo Llc Neural networks for vehicle trajectory planning
WO2019020794A1 (en) * 2017-07-28 2019-01-31 RobArt GmbH Magnetometer for robot navigation
US20190084158A1 (en) * 2017-09-19 2019-03-21 Autodesk, Inc. Modifying robot dynamics in response to human presence
US20190163174A1 (en) * 2017-11-30 2019-05-30 Lg Electronics Inc. Mobile robot and method of controlling the same
US20190167059A1 (en) * 2017-12-06 2019-06-06 Bissell Inc. Method and system for manual control of autonomous floor cleaner
US20210068605A1 (en) * 2018-01-03 2021-03-11 Samsung Electronics Co., Ltd. Moving apparatus for cleaning, collaborative cleaning system, and method of controlling the same
US20190220678A1 (en) * 2018-01-13 2019-07-18 Toyota Jidosha Kabushiki Kaisha Localizing Traffic Situation Using Multi-Vehicle Collaboration
US20210006425A1 (en) * 2018-04-27 2021-01-07 Samsung Electronics Co., Ltd. Electronic device and method for controlling external electronic device
US20190353483A1 (en) * 2018-05-15 2019-11-21 Deere & Company Coverage-based system and method of planning a turn path for a vehicle
US20210247775A1 (en) * 2018-06-15 2021-08-12 Ecovacs Robotics Co., Ltd. Method for localizing robot, robot, and storage medium
JPWO2020012609A1 (en) * 2018-07-12 2020-10-22 日立建機株式会社 Work machine
JP2020033704A (en) * 2018-08-27 2020-03-05 日立建機株式会社 Work machine
US20200073381A1 (en) * 2018-08-30 2020-03-05 Pony.ai, Inc. Distributed sensing for vehicle navigation
US20200073407A1 (en) * 2018-08-30 2020-03-05 Pony.ai, Inc. Prioritizing vehicle navigation
US20200086497A1 (en) * 2018-09-13 2020-03-19 The Charles Stark Draper Laboratory, Inc. Stopping Robot Motion Based On Sound Cues
US20200086498A1 (en) * 2018-09-13 2020-03-19 The Charles Stark Draper Laboratory, Inc. Voice Modification To Robot Motion Plans
JP2020049852A (en) * 2018-09-27 2020-04-02 株式会社リコー Three-dimensional data generation device, program, and three-dimensional data generation method
US20220266446A1 (en) * 2019-01-03 2022-08-25 Lucomm Technologies, Inc. Flux Sensing System
US20220040860A1 (en) * 2019-01-03 2022-02-10 Lucomm Technologies, Inc. Robotic Post System
US20220266451A1 (en) * 2019-01-03 2022-08-25 Lucomm Technologies, Inc. Robotic Gate
US20220122397A1 (en) * 2019-01-03 2022-04-21 Lucomm Technologies, Inc. Robotic Post
US20230054004A1 (en) * 2019-03-20 2023-02-23 Lucomm Technologies, Inc. Flux System
US20230057149A1 (en) * 2019-03-20 2023-02-23 Lucomm Technologies, Inc. Robotic Post System
US20190362234A1 (en) * 2019-07-02 2019-11-28 Lg Electronics Inc. Artificial intelligence apparatus for cleaning in consideration of user's action and method for the same
US20190370691A1 (en) * 2019-07-12 2019-12-05 Lg Electronics Inc. Artificial intelligence robot for determining cleaning route using sensor data and method for the same
US20210362335A1 (en) * 2019-07-12 2021-11-25 Lg Electronics Inc. Robot and method for manage item using same
US20210046650A1 (en) * 2019-08-18 2021-02-18 Cobalt Robotics Inc. Elevator interactions by mobile robot
US20210370176A1 (en) * 2019-09-05 2021-12-02 Tencent Technology (Shenzhen) Company Limited Method and apparatus for controlling movement of virtual object, terminal, and storage medium
US20210121035A1 (en) * 2019-10-29 2021-04-29 Lg Electronics Inc. Robot cleaner and method of operating the same
US20210200239A1 (en) * 2019-12-30 2021-07-01 Lg Electronics Inc. Control device and method for a plurality of robots
US20210274310A1 (en) * 2020-02-27 2021-09-02 Psj International Ltd. System for establishing positioning map data and method for the same
US20210283774A1 (en) * 2020-03-12 2021-09-16 Canon Kabushiki Kaisha Robot, control device, and information processing device
US20210304559A1 (en) * 2020-03-27 2021-09-30 Aristocrat Technologies, Inc. Gaming service automation machine with drop box services
US11032298B1 (en) * 2020-04-23 2021-06-08 Specter Ops, Inc. System and method for continuous collection, analysis and reporting of attack paths in a directory services environment
US20220026920A1 (en) * 2020-06-10 2022-01-27 AI Incorporated Light weight and real time slam for robots
US20220107648A1 (en) * 2020-10-03 2022-04-07 Viabot Inc. Systems for setting and programming zoning for use by autonomous modular robots
KR20220046210A (en) * 2020-10-07 2022-04-14 엘지전자 주식회사 Mobile robot and control method thereof
US20220197295A1 (en) * 2020-12-22 2022-06-23 Globe (jiangsu) Co., Ltd. Robotic mower, and control method thereof
US20230292657A1 (en) * 2020-12-22 2023-09-21 Greenworks (Jiangsu) Co., Ltd. Robotic tool system and control method thereof
CN213518002U (en) * 2020-12-24 2021-06-22 格力博(江苏)股份有限公司 Position acquisition device and lawn mower
CN112612278A (en) * 2020-12-24 2021-04-06 格力博(江苏)股份有限公司 Method for collecting position information, position collecting device and mower
US20230186493A1 (en) * 2021-12-09 2023-06-15 NEOWINE Co., Ltd. System and method for measuring location of moving object based on artificial intelligence
CN217429887U (en) * 2021-12-24 2022-09-16 华为技术有限公司 Cleaning robot

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11664690B2 (en) 2015-10-08 2023-05-30 Pathfinder Propulsion, LLC Combined propellant-less propulsion and reaction wheel device
WO2022077533A1 (en) * 2020-10-18 2022-04-21 南京朗禾数据有限公司 Method for controlling agricultural-machine operation area calculation system

Also Published As

Publication number Publication date
CN105446350B (en) 2018-05-29
CN105446350A (en) 2016-03-30
EP3200040A1 (en) 2017-08-02
WO2016045617A1 (en) 2016-03-31
EP4345564A1 (en) 2024-04-03
US20170248956A1 (en) 2017-08-31
EP3889725A1 (en) 2021-10-06
EP3889725B1 (en) 2024-01-03
EP3200040B1 (en) 2021-08-04
EP3200040A4 (en) 2018-06-20
US10520950B2 (en) 2019-12-31

Similar Documents

Publication Publication Date Title
US20200089235A1 (en) Self-moving robot movement boundary determining method
JP7136898B2 (en) ROBOT PET MONITORING METHOD AND CHIP BASED ON GRID MAP
CN105526934B (en) Indoor and outdoor integrated high-precision positioning navigation system and positioning method thereof
CN104703118A (en) System of indoor robot for locating mobile terminal based on bluetooth technology
KR101247964B1 (en) Method for Measuring Location of Radio Frequency Identification Reader by Using Beacon
Roa et al. Optimal placement of sensors for trilateration: Regular lattices vs meta-heuristic solutions
US20230168334A1 (en) Indoor Positioning Method and Device
CN110022574A (en) A kind of method of automatic configuration of UWB indoor positioning base station
CN108344970B (en) Wireless positioning automatic calibration method using mobile robot
US9880022B1 (en) Point layout system with third transmitter
CN103472434A (en) Robot sound positioning method
CN110118555A (en) UAV Navigation System and method
CN106767762B (en) Indoor positioning navigation method for invisible laser calibration
CN111638487B (en) Automatic parking test equipment and method
CN108709558A (en) A kind of method of large scale workshop high accuracy positioning
CN113050137B (en) Multi-point cooperative measurement spatial information acquisition method
CN108507580A (en) A kind of mobile robot platform self aligning system and method for self-locating
CN111432328B (en) Node positioning method, device and storage medium of wireless sensor network
CN108226863A (en) A kind of monocular Satellite Tracking localization method
CN109884615A (en) A kind of unmanned plane with indoor light stream positioning system
Černohorský et al. Mobile robot indoor navigation
CN112653525A (en) Test system
CN114001743B (en) Map drawing method, device and system, storage medium and electronic equipment
KR101316524B1 (en) Method and apparatus for estimating location in the object using omnidirectional vision sensor
CN109392086A (en) A kind of method and system of the more base station locations of multizone

Legal Events

Date Code Title Description
AS Assignment

Owner name: ECOVACS ROBOTICS CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TANG, JINJU;REEL/FRAME:051097/0735

Effective date: 20191019

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION