KR101897775B1 - Moving robot and controlling method thereof - Google Patents
Moving robot and controlling method thereof
- Publication number
- KR101897775B1 (application KR1020160026616A)
- Authority
- KR
- South Korea
- Prior art keywords
- dimensional
- camera sensor
- main body
- mobile robot
- dimensional camera
- Prior art date: 2016-03-04
Classifications
- B25J11/0085—Cleaning
- A47L9/2852—Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
- B25J19/023—Optical sensing devices including video camera means
- B25J9/0003—Home robots, i.e. small robots for domestic use
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
- B25J9/1676—Avoiding collision or forbidden zones
- B25J9/1697—Vision controlled systems
- A47L2201/04—Automatic control of the travelling movement; Automatic obstacle detection
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Multimedia (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The present invention relates to a mobile robot and, more particularly, to a vacuum cleaner that performs autonomous traveling. The cleaner includes a main body; a driving unit for providing a driving force for moving the main body; a three-dimensional camera sensor for generating three-dimensional coordinate information related to the surroundings of the main body; and a control unit for calculating the distances between a plurality of points distributed around the main body and one point of the main body based on the generated three-dimensional coordinate information, detecting information related to at least one of the terrain and the obstacles existing around the main body using the plurality of distance values, and controlling the driving unit based on the detected information.
Description
BACKGROUND OF THE INVENTION
In general, robots have been developed for industrial use and have been part of factory automation. In recent years, medical robots, aerospace robots, and the like have been developed, and household robots that can be used in ordinary homes are being developed.
A representative example of the domestic robot is the robot cleaner, a type of household appliance that travels in a certain area by itself while suctioning surrounding dust or foreign matter to clean it. Such a robot cleaner is generally equipped with a rechargeable battery and an obstacle sensor capable of avoiding obstacles during traveling, so that it can travel and clean by itself.
In recent years, research has been actively carried out to utilize the robot cleaner in various fields such as health care, smart home, and remote control, moving beyond simple autonomous cleaning of a given area.
Specifically, the conventional robot cleaner includes only an infrared sensor, an ultrasonic sensor, an RF sensor, an optical sensor, or a camera sensor for acquiring a two-dimensional image in order to detect information related to an obstacle. Therefore, it is difficult to obtain accurate obstacle information in the conventional robot cleaner.
In particular, conventional robot cleaners generally detect obstacles by using two-dimensional image information obtained by a two-dimensional camera sensor. With such two-dimensional image information, it is difficult to detect the distance between an obstacle and the robot body or the three-dimensional shape of the obstacle.
In addition, the conventional robot cleaner extracts feature points from the two-dimensional image information to detect obstacle information. When the two-dimensional image is unfavorable for feature point extraction, the accuracy of the detected obstacle information is remarkably degraded.
In order to detect information related to the material of the floor surface, the conventional robot cleaner judges the material of the floor on which it is traveling from the output value of its driving motor and sets the output level of the driving motor accordingly. However, since the degree to which the output value of the driving motor varies with the material of the floor surface is not constant, such a determination method suffers from low accuracy.
In order to solve such a problem, there is a need for a robot cleaner equipped with a three-dimensional camera sensor.
However, since the viewing angle of the 3D camera sensor is narrower than that of the 2D camera sensor, the area over which a robot cleaner using the 3D camera sensor can acquire obstacle information at any given moment is somewhat limited.
Also, since the 3D camera sensor acquires a larger amount of data than the 2D camera sensor, the amount of computation of the robot cleaner using the 3D camera sensor may be excessively increased.
If the amount of computation of the robot cleaner increases excessively, the time required to detect obstacle information and to determine the corresponding traveling algorithm also increases, making it difficult to react immediately to surrounding obstacles.
SUMMARY OF THE INVENTION It is an object of the present invention to provide a vacuum cleaner performing autonomous traveling that travels using a three-dimensional camera sensor which acquires three-dimensional coordinate information related to the terrain around the mobile robot or robot cleaner and the obstacles located in its periphery, and a control method thereof.
It is another object of the present invention to provide a cleaner performing autonomous traveling which can reduce the amount of computation and respond immediately to obstacles in the process of detecting information related to the floor surface using three-dimensional coordinate information obtained from a three-dimensional camera sensor, and a control method thereof.
It is another object of the present invention to provide an apparatus for detecting information related to the floor by using three-dimensional coordinate information obtained from a three-dimensional camera sensor, and a control method thereof.
It is another object of the present invention to provide a vacuum cleaner which can detect the distance between an obstacle and a main body and the shape of an obstacle more accurately by using a three-dimensional camera sensor, and a control method thereof.
It is also an object of the present invention to provide a vacuum cleaner and a control method thereof that can autonomously travel while accurately detecting information related to an obstacle even when the viewpoint of the three-dimensional camera sensor is changed.
According to an aspect of the present invention, there is provided a vacuum cleaner for performing autonomous traveling, the vacuum cleaner including a main body, a driving unit for providing a driving force for moving the main body, and a three-dimensional camera sensor attached to one surface of the main body for generating three-dimensional coordinate information related to the surroundings of the main body.
In addition, the vacuum cleaner performing autonomous traveling according to the present invention includes a control unit that calculates the distances between a plurality of points distributed around the main body and one point of the main body based on the generated three-dimensional coordinate information, detects information related to at least one of the terrain and obstacles existing around the main body using the calculated plurality of distance values, and controls the driving unit based on the detected information.
In one embodiment, the three-dimensional camera sensor may capture an image of the floor surface located on the side of the main body in the traveling direction, and may generate a plurality of pieces of three-dimensional coordinate information corresponding to a plurality of points included in the captured image.
In one embodiment, the plurality of points are located within a predetermined distance from a reference point of the photographed image.
In one embodiment, the plurality of points form a grid including the reference point.
In one embodiment, the controller calculates the distances from one point of the main body to the plurality of points using the plurality of pieces of three-dimensional coordinate information, calculates a degree of scattering of the calculated plurality of distance values, and sets the output level of the driving unit based on the calculated degree of scattering.
In one embodiment, the control unit calculates an average value and a variance of the calculated plurality of distance values, and sets an output level of the driving unit based on the calculated average value and variance.
In one embodiment, the apparatus further includes a memory for storing a database related to a distance value between the floor surface and the main body, and the controller updates the database using the calculated plurality of distance values.
In one embodiment, the control unit controls the driving unit to increase the driving force when the calculated variance is greater than a predetermined reference variance value.
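To make the variance-based control described in the preceding embodiments concrete, the following Python sketch computes the mean and variance of the distances between a set of floor points and one point of the main body and maps the variance to an output level. It is only an illustration: the grid size, the reference point, and the two variance thresholds are assumed values, not parameters taken from the invention.

```python
import numpy as np

def set_drive_output(points_3d, body_point, low_var=3.0, high_var=10.0):
    """Illustrative sketch: map the scattering (variance) of distances between
    floor points and one point of the main body to a drive output level.

    points_3d  -- (N, 3) array of 3D coordinates of floor points (e.g. a 5 x 5 grid)
    body_point -- (3,) reference point on the main body (assumed, e.g. the lens center)
    low_var / high_var -- assumed variance thresholds, not values from the patent
    """
    distances = np.linalg.norm(np.asarray(points_3d) - np.asarray(body_point), axis=1)
    mean_d = distances.mean()
    var_d = distances.var()

    # Low scattering -> flat, hard floor; high scattering -> carpet or uneven floor.
    if var_d < low_var:
        level = 1      # keep the normal driving force
    elif var_d < high_var:
        level = 2      # slightly increased driving force
    else:
        level = 3      # increased driving force (e.g. deep carpet)
    return mean_d, var_d, level
```

A fuller implementation would also update the stored database of distance values, as described in the embodiment above.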
In one embodiment, the cleaner may further include an output sensing unit that senses information related to the output of the driving unit, and the control unit uses the output sensing unit to determine whether the change in output of the driving unit during a unit time exceeds a preset threshold value, and controls the three-dimensional camera sensor to photograph an image related to the floor surface located on the side of the main body in the traveling direction according to the determination result related to the output change.
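As a rough sketch of the trigger just described, the controller could monitor the driving-unit output over a unit time and request a floor image only when the change exceeds a threshold; the sampling window and the threshold below are assumed values for illustration only.

```python
def should_capture_floor_image(output_samples, threshold=0.2):
    """Illustrative trigger: return True when the change in the driving-unit
    output observed during one unit time exceeds a preset threshold.

    output_samples -- readings (e.g. motor current) collected over the unit time
    threshold      -- assumed value; the embodiment only requires 'a preset threshold'
    """
    return max(output_samples) - min(output_samples) > threshold

# Example: should_capture_floor_image([1.00, 1.05, 1.35]) -> True
```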
In one embodiment, the cleaner further includes a connecting member coupled between the three-dimensional camera sensor and the main body for changing the direction in which the three-dimensional camera sensor is directed, the connecting member including a first rotation motor for tilting the three-dimensional camera sensor and a second rotation motor for panning the three-dimensional camera sensor.
In one embodiment, the control unit detects information related to an obstacle disposed around the main body based on the three-dimensional coordinate information, and controls the driving unit to perform avoidance travel with respect to the detected obstacle.
In one embodiment, the control unit predicts the moving direction of the main body when performing the avoidance travel, and controls the connecting member so that the direction of the three-dimensional camera sensor is changed based on the predicted moving direction.
In one embodiment, the controller rotates the three-dimensional camera sensor in a direction opposite to the predicted moving direction.
In one embodiment, the control unit returns the direction in which the three-dimensional camera sensor is directed to the moving direction of the main body when the avoidance travel is completed.
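The pan behaviour described in the preceding embodiments might be organized as in the following sketch; the `pan_to` motor interface and the angle convention are assumptions made for illustration, not part of the disclosed apparatus.

```python
class CameraPanController:
    """Illustrative sketch of panning the 3D camera sensor opposite to the
    predicted avoidance direction, then restoring it afterwards."""

    def __init__(self, pan_motor):
        self.pan_motor = pan_motor  # assumed interface exposing pan_to(angle_deg)

    def on_avoidance_start(self, predicted_heading_deg, current_heading_deg):
        # Rotate the sensor opposite to the predicted movement direction so the
        # obstacle being avoided stays inside the (narrow) viewing angle.
        offset = predicted_heading_deg - current_heading_deg
        self.pan_motor.pan_to(-offset)

    def on_avoidance_done(self):
        # Return the sensor to the moving direction of the main body.
        self.pan_motor.pan_to(0.0)
```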
In one embodiment, the controller generates normal vector information for any one piece of the generated three-dimensional coordinate information by using the neighboring three-dimensional coordinate information, detects an area forming a plane based on the generated normal vector information, and detects an area corresponding to the floor surface within the detected area.
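As one possible reading of this embodiment, the sketch below derives a normal vector for each point from its neighboring three-dimensional coordinates and marks points whose normal is nearly vertical as floor candidates. The neighbour choice and the "up" tolerance are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def floor_mask(points, up=np.array([0.0, 0.0, 1.0]), tol=0.9):
    """points: (H, W, 3) organized 3D coordinates, one per image pixel.
    For each point, build a normal from its right and bottom neighbours and
    mark it as floor when the normal is nearly parallel to the 'up' axis."""
    mask = np.zeros(points.shape[:2], dtype=bool)
    for i in range(points.shape[0] - 1):
        for j in range(points.shape[1] - 1):
            v1 = points[i, j + 1] - points[i, j]   # neighbour to the right
            v2 = points[i + 1, j] - points[i, j]   # neighbour below
            n = np.cross(v1, v2)
            norm = np.linalg.norm(n)
            if norm < 1e-9:
                continue                           # degenerate neighbourhood
            n /= norm
            # A plane whose normal points (almost) straight up is a floor candidate.
            mask[i, j] = abs(np.dot(n, up)) > tol
    return mask
```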
According to the present invention, since the mobile robot acquires three-dimensional coordinate information related to obstacles by using the three-dimensional camera sensor, the mobile robot can acquire information related to the terrain or the obstacles located around it more accurately.
In addition, according to the present invention, since the mobile robot can detect the distance between itself and a surrounding obstacle in real time using the three-dimensional coordinate information, it can respond to the obstacle promptly.
In addition, according to the present invention, the amount of computation of the mobile robot using the three-dimensional camera sensor is reduced, so that the operation for avoiding an obstacle can be performed quickly and the obstacle avoidance performance of the mobile robot is improved. That is, according to the present invention, since the mobile robot can immediately change its moving direction to avoid an obstacle, a collision between the mobile robot and the obstacle can be prevented.
According to the present invention, since the effect of noise generated in the three-dimensional camera sensor can be reduced when the mobile robot detects terrain or obstacle information located in its periphery, the information related to the terrain or the obstacles can be obtained more accurately.
Further, according to the present invention, the material or shape of the floor on which the mobile robot is traveling can be detected more accurately.
According to the present invention, even when the viewpoint of the three-dimensional camera sensor is changed by the movement of the mobile robot or the rotation of the three-dimensional camera sensor, information related to the terrain or the obstacles located in the periphery of the mobile robot can be accurately detected.
According to the present invention, even though the mobile robot detects obstacle or terrain information using a three-dimensional camera sensor having a relatively narrow viewing angle, it can avoid obstacles located outside that viewing angle.
FIG. 1A is a block diagram illustrating components of a mobile robot according to an embodiment of the present invention.
FIG. 1B is a block diagram showing the components of the mobile robot of FIG. 1A in more detail.
FIGS. 2A to 2D are conceptual diagrams showing an embodiment of a mobile robot having a three-dimensional camera sensor according to the present invention.
FIGS. 3A and 3B are conceptual diagrams showing a connection member for changing the direction in which the three-dimensional camera sensor provided in the mobile robot according to the present invention is oriented relative to the main body of the mobile robot.
FIGS. 4A and 4B are conceptual diagrams illustrating a method of detecting a floor surface using three-dimensional coordinate information in a mobile robot according to the present invention.
FIG. 4C is a flowchart illustrating a method of detecting a floor surface using three-dimensional coordinate information and normal vector information.
FIG. 5 is a flowchart illustrating another method of detecting a floor surface using three-dimensional coordinate information in a mobile robot according to the present invention.
FIG. 6 is a graph showing a change in the output value of the driving motor over time when the mobile robot according to the present invention travels over a rough floor surface.
FIGS. 7A and 7B are conceptual diagrams showing a method for a mobile robot according to the present invention to detect information related to the material of a floor using a three-dimensional camera sensor.
FIG. 7C is a flowchart illustrating a method for a mobile robot according to the present invention to detect information related to the material of a floor using a three-dimensional camera sensor.
FIGS. 8A and 8B are conceptual diagrams showing a method of avoiding an obstacle while changing the photographing angle of the three-dimensional camera sensor provided in a mobile robot according to the present invention.
FIG. 8C is a flowchart illustrating a method of changing the direction of the three-dimensional camera sensor based on three-dimensional coordinate information obtained by the three-dimensional camera sensor according to the present invention.
FIGS. 9A to 9C are conceptual diagrams illustrating an embodiment for determining a three-dimensional coordinate system associated with three-dimensional coordinate information generated by the three-dimensional camera sensor of a mobile robot according to the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Referring first to FIG. 1A, the components of a mobile robot according to an embodiment of the present invention will be described in detail.
Referring to FIG. 1A, a mobile robot according to an embodiment of the present invention includes the components described below.
At this time, the components shown in FIG. 1A are not essential, and it goes without saying that a robot cleaner having more components or fewer components can be implemented. Hereinafter, each component will be described.
First, the
At this time, the
The battery may be located at the bottom of the center of the robot cleaner, or may be located at either the left or right side. In the latter case, the mobile robot may further include a balance weight to eliminate weight biases of the battery.
Meanwhile, the driving
Meanwhile, the
Also, the
The
On the other hand, the
In addition, the
The
On the other hand, the
The
Accordingly, the
In addition, the
Accordingly, the
At this time, the
The
Meanwhile, the
The
1B, the
The external
The mobile robot can receive the guidance signal generated by the charging signal using the external
In addition, the mobile robot can detect a signal generated by a remote control device such as a remote controller or a terminal by using an external
The external
On the other hand, the
The
Ultrasonic sensors, for example, can typically be used to detect distant obstacles in general. The ultrasonic sensor includes a transmitter and a receiver. The
Also, the
In one embodiment, a plurality of (e.g., five) ultrasonic sensors may be installed along the outer circumferential surface of the front side of the mobile robot. In this case, the ultrasonic sensors are preferably installed on the front side of the mobile robot with the transmitting parts and the receiving parts alternating.
That is, the transmitting parts may be disposed to be spaced apart to the left and right of the front center of the main body, and one or two transmitting parts may be disposed between the receiving parts to form a receiving area for the ultrasonic signal reflected from an obstacle or the like. With this arrangement, the receiving area can be expanded while the number of sensors is reduced. The emission angle of the ultrasonic waves can be kept within a range that prevents the signals from interfering with one another, thereby preventing crosstalk. Also, the receiving sensitivities of the receiving parts may be set differently.
In addition, the ultrasonic sensor may be installed upward by a predetermined angle so that the ultrasonic wave emitted from the ultrasonic sensor is outputted upward, and the ultrasonic sensor may further include a predetermined blocking member to prevent the ultrasonic wave from being radiated downward.
As described above, the
For example, the
The infrared sensor may be installed on the outer surface of the mobile robot together with the ultrasonic sensor. The infrared sensor can also detect the obstacles existing on the front or side and transmit the obstacle information to the
On the other hand, the cliff sensor 143 (or Cliff Sensor) can detect obstacles on the floor supporting the main body of the mobile robot by mainly using various types of optical sensors.
That is, the
For example, any one of the
For example, the
The PSD sensor uses the surface resistance of a semiconductor to detect the position of incident light at a single p-n junction. PSD sensors include one-dimensional PSD sensors that detect light along only one axis and two-dimensional PSD sensors that detect the light position on a plane, both of which can have a pin photodiode structure. The PSD sensor is a type of infrared sensor that measures distance by emitting an infrared ray and measuring the angle of the ray reflected from an obstacle. That is, the PSD sensor calculates the distance to the obstacle using triangulation.
The PSD sensor includes a light emitting unit that emits infrared rays toward an obstacle, and a light receiving unit that receives the infrared rays reflected back from the obstacle. When an obstacle is detected by using the PSD sensor, a stable measurement value can be obtained regardless of the reflectance and color of the obstacle.
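For illustration, the triangulation relation used by such a sensor can be written as distance = focal length x baseline / spot offset; the numeric defaults in the sketch below are assumed example values, not the parameters of any particular PSD sensor.

```python
def psd_distance(spot_offset_m, focal_length_m=0.004, baseline_m=0.02):
    """Triangulation sketch: an emitter and a PSD receiver separated by a baseline.
    By similar triangles, distance = focal_length * baseline / spot_offset,
    where spot_offset is the position of the reflected spot on the PSD surface."""
    if spot_offset_m <= 0:
        raise ValueError("no reflected spot detected")
    return focal_length_m * baseline_m / spot_offset_m
```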
The
On the other hand, the
As another example, the
On the other hand, the
The
Also, one or more light sources may be installed adjacent to the image sensor. The one or more light sources irradiate light onto a predetermined area of the floor surface photographed by the image sensor. That is, when the mobile robot moves in a specific region along the floor surface, a certain distance is maintained between the image sensor and the floor surface if the floor surface is flat. On the other hand, when the mobile robot moves over an uneven floor surface, the image sensor moves away from the floor by more than a certain distance due to the unevenness of and obstacles on the floor surface. At this time, the one or more light sources may be controlled by the control unit.
Using the
On the other hand, the
The
The three-
That is, the three-
Specifically, the three-
In one embodiment, the three-
In another embodiment, the three-
In another embodiment, the three-
2A to 2D, an embodiment of a mobile robot having a three-
2A and 2B, the three-
2A, the direction in which the three-
Referring to FIG. 2B, the three-
In one embodiment, the
That is, when the
When the
For example, the
In addition, the
As shown in FIGS. 2C and 2D, the three-
Referring to FIGS. 2C and 2D, the viewing angle of the three-
The
Accordingly, even when the moving direction of the
3A and 3B, an embodiment of a connecting member for changing the direction in which the three-dimensional camera sensor provided in the mobile robot according to the present invention is oriented relative to the main body of the mobile robot will be described.
As shown in FIG. 3A, a part of the main body of the
Referring to FIG. 3A, the connecting
Although not shown in FIG. 3A, the connecting
3B, the connecting
Specifically, the first connecting
The second linking member may be connected to the combination of the
3B, the connecting
4A and 4B below illustrate an embodiment related to a method of detecting a floor using three-dimensional coordinate information in a mobile robot according to the present invention.
Referring to FIG. 4A, the three-
More specifically, as shown in FIG. 4A, the
FIG. 4B shows an embodiment in which the
In the following FIG. 4C, a method of detecting the bottom surface using three-dimensional coordinate information and normal vector information is described.
As shown in FIG. 4C, the three-
In addition, the
Further, the
Then, the
5, another embodiment related to a method of detecting a floor using three-dimensional coordinate information in a mobile robot according to the present invention will be described.
The three-
The
For example, the two-dimensional image may consist of 320 pixels by 240 pixels, and each unit area may be 20 pixels by 20 pixels. In another example, each unit area may be a square whose side length corresponds to 10% of the shorter of the width and the height of the two-dimensional image.
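To make the unit-area idea concrete, the sketch below splits an organized 320 x 240 coordinate map into 20 x 20 blocks and tests each block for flatness. The disclosure only specifies that a normal vector is generated per unit area; the SVD-based plane fit and the flatness threshold used here are assumed substitutes chosen for illustration.

```python
import numpy as np

def plane_blocks(points, block=20, flat_tol=0.01):
    """points: (240, 320, 3) organized 3D coordinates for a 320 x 240 image.
    Returns a boolean grid marking the 20 x 20 blocks that form a plane."""
    h, w, _ = points.shape
    grid = np.zeros((h // block, w // block), dtype=bool)
    for bi in range(h // block):
        for bj in range(w // block):
            pts = points[bi*block:(bi+1)*block, bj*block:(bj+1)*block].reshape(-1, 3)
            centered = pts - pts.mean(axis=0)
            # The smallest singular value measures the residual thickness of the
            # block's point set; a thin (flat) block is treated as a plane.
            s = np.linalg.svd(centered, compute_uv=False)
            grid[bi, bj] = s[-1] / len(pts) ** 0.5 < flat_tol
    return grid
```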
Next, the
Specifically, the
Also, the
Based on the generated normal vector information, the
That is, the
Referring to FIG. 5, unlike the normal vector information generation method (S402) shown in FIG. 4C, the
The
That is, the
In detecting the information related to the terrain, floor, obstacles, and the like existing around the
In addition, when the amount of computation of the
FIG. 6 is a graph showing the change in the output value of the driving motor over time when the mobile robot according to the present invention travels over a rough floor surface.
The
Specifically, the driving motor may be installed inside the main body to transmit the driving force to the driving wheels. Further, the driving motor may transmit the driving force to the agitator disposed rotatably inside the suction head of the mobile robot cleaner.
In addition, the
Further, the
The existing mobile robot detects information related to the material of the floor under travel of the mobile robot using only the
However, since the output of the motor does not change consistently with the floor material, information on the floor material detected using only the output value of the driving motor or its variation has low accuracy.
Therefore, the present invention proposes a method of detecting information related to the material of the floor surface using the three-dimensional coordinate information generated by the three-dimensional camera sensor, either alone or together with the output value of the driving motor or its variation.
Referring to FIG. 7A, the
As shown in FIG. 7A, the
That is, the three-
Referring to FIG. 7A, the plurality of points may be located within a predetermined distance from a reference point of the
That is, the three-
In one embodiment, the plurality of points may form a grid including the reference point. Referring to FIG. 7A, the plurality of points may consist of 25 points including the reference point, distributed at equal intervals so as to form a lattice.
For example, the three-
Thus, the three-
The
For example, referring to FIG. 7B, a table 720 is shown that includes distance values between one point of the body from 25 points included in the
The
Specifically, the
For example, when the calculated variance is within the range of 0 to 3, the
In another example, the
In another example, the
On the other hand, the
For example, the
In another example, the
Meanwhile, the
Whenever the
In the following Fig. 7C, the control method of the mobile robot shown in Figs. 7A and 7B will be described.
The three-
Specifically, the
That is, when the
For example, the
In another example, the
Thereafter, the three-
The
For example, one point of the body may be located on the front of the lens of the three-
The
Further, the
The above variance is calculated to control the cleaner using a degree of scattering for a plurality of distance values. Accordingly, the
8A and 8B, a method of avoiding an obstacle while changing a photographing angle by a three-dimensional camera sensor provided in the mobile robot according to the present invention will be described.
8A, the three-
The
That is, the
In addition, when the
Specifically, the
Referring to FIG. 8B, the
Specifically, when the traveling direction of the
For example, the angle formed by the
In another example, the
Thus, according to the
Although not shown in FIGS. 8A and 8B, when the
9A to 9C, a method for determining a three-dimensional coordinate system related to three-dimensional coordinate information generated by the three-
As shown in FIG. 9A, the camera coordinate system (Xc, Yc, Zc) can be defined by the direction in which the lens of the three-dimensional camera sensor is directed.
Referring to FIG. 9B, the global coordinate system (X, Y, Z) can be determined by the surrounding terrain of the mobile robot. The global coordinate system (X, Y, Z) can be defined independently of the lens of the three-dimensional camera sensor.
The
Specifically, the
Thereafter, the
In one embodiment, the
Meanwhile, the
That is, when it is determined that the normal vector corresponding to the specific three-dimensional coordinate information and the inner product value calculated by the Z axis are smaller than the threshold value, the
Accordingly, the
When the two axes of the global coordinate system are determined as described above, the
9C, the
Specifically, the
The
After the global coordinate system (X, Y, Z) is set as described above, the
In addition, the
With this configuration, even if the direction of the three-
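Summarizing the FIG. 9 discussion, the sketch below builds a global coordinate system from an estimated floor normal and the robot's heading and re-expresses camera-frame points in it. The way the axes are obtained here (a single floor normal and the heading projected onto the floor plane) is an assumed simplification of the procedure described above, and the translation between the two origins is ignored for brevity.

```python
import numpy as np

def camera_to_global(points_cam, floor_normal_cam, heading_cam):
    """points_cam       -- (N, 3) coordinates in the camera frame (Xc, Yc, Zc)
    floor_normal_cam -- estimated floor normal expressed in the camera frame
    heading_cam      -- robot moving direction expressed in the camera frame
    Returns the same points expressed in the global frame (X, Y, Z)."""
    z = floor_normal_cam / np.linalg.norm(floor_normal_cam)
    # Project the heading onto the floor plane to obtain the global X axis.
    x = heading_cam - np.dot(heading_cam, z) * z
    x /= np.linalg.norm(x)
    y = np.cross(z, x)                 # right-handed third axis
    r = np.stack([x, y, z], axis=0)    # rows: global axes expressed in camera frame
    return points_cam @ r.T            # per-point projection onto the global axes
```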
It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the above description should not be construed in a limiting sense in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.
Claims (20)
A cleaner performing autonomous traveling, comprising:
A main body;
A driving unit for providing driving force for moving the main body;
A three-dimensional camera sensor attached to one surface of the body to generate three-dimensional coordinate information related to the surroundings of the body;
Calculating a distance between a plurality of points distributed around the main body and one point of the main body based on the generated three-dimensional coordinate information,
Detecting information related to at least one of the terrain and obstacles existing around the main body using the calculated plurality of distance values,
And a control unit for controlling the driving unit based on the detected information,
Wherein,
Generates normal vector information for any one piece of the generated three-dimensional coordinate information by using the neighboring three-dimensional coordinate information,
Detects an area forming a plane based on the generated normal vector information,
And detects an area corresponding to a floor surface within the detected area.
Wherein the three-dimensional camera sensor:
Photographs an image related to a bottom surface located on the side of the main body in the traveling direction,
And generates three-dimensional coordinate information corresponding to a plurality of points included in the image related to the bottom surface.
Wherein the plurality of points are located within a predetermined distance from a reference point of the photographed image.
Wherein the plurality of points form a grid including the reference point.
Wherein,
Calculating distances from one point of the main body to the plurality of points using the plurality of three-dimensional coordinate information,
Wherein the output level of the driving unit is set based on a degree of scattering of the calculated plurality of distance values.
Wherein,
Calculating a mean value and a variance of the calculated plurality of distance values,
Wherein the output level of the driving unit is set based on the calculated average value and the variance.
Further comprising a memory for storing a database associated with distance values between said floor and said body,
Wherein,
And updates the database using the calculated plurality of distance values.
Wherein,
And controls the driving unit to increase the driving force if the calculated variance is greater than a preset reference variance value.
Further comprising output sensing means for sensing information related to an output of the driving unit,
Wherein,
Determining whether an output change of the driving unit during a unit time exceeds a predetermined threshold value by using the output sensing unit,
Wherein the controller controls the three-dimensional camera sensor to photograph an image related to a bottom surface positioned on a side of a traveling direction of the main body, according to a determination result related to the output change.
Further comprising a connecting member coupled between the three-dimensional camera sensor and the main body, for changing a direction in which the three-dimensional camera sensor is directed,
The connecting member includes:
A first rotation motor for tilting the three-dimensional camera sensor,
And a second rotation motor for panning the three-dimensional camera sensor.
Wherein,
Based on the three-dimensional coordinate information, information related to an obstacle disposed around the body,
And controls the driving unit to perform the avoidance driving with respect to the detected obstacle.
The control unit
Wherein when the avoidance travel is performed, the moving direction of the main body is predicted,
Wherein the control unit controls the connection member so that the direction of the three-dimensional camera sensor is changed based on the predicted movement direction.
Wherein,
And rotating the three-dimensional camera sensor in a direction opposite to the predicted movement direction.
Wherein,
Wherein when the avoidance travel is completed, the direction in which the three-dimensional camera sensor is directed returns in the moving direction of the main body.
The three-dimensional camera sensor comprises:
Capturing a two-dimensional image related to the periphery of the main body,
Acquiring a plurality of three-dimensional coordinate information corresponding to the two-dimensional image,
Wherein,
Dividing the two-dimensional image into unit areas,
Wherein the normal vector information is generated using three-dimensional coordinate information corresponding to at least a part of the divided unit areas.
Wherein,
Wherein the determination unit determines whether the divided unit area is a plane based on the generated normal vector information.
Wherein,
Wherein a region corresponding to a bottom surface of the two-dimensional image is detected using a determination result of whether or not the divided unit area is a plane.
Wherein,
And sets information related to the global coordinate system using the three-dimensional coordinate information.
Wherein,
Converts the three-dimensional coordinate information generated by the three-dimensional camera sensor into the global coordinate system, based on the information related to the global coordinate system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160026616A KR101897775B1 (en) | 2016-03-04 | 2016-03-04 | Moving robot and controlling method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160026616A KR101897775B1 (en) | 2016-03-04 | 2016-03-04 | Moving robot and controlling method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170103556A KR20170103556A (en) | 2017-09-13 |
KR101897775B1 true KR101897775B1 (en) | 2018-09-12 |
Family
ID=59967861
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160026616A KR101897775B1 (en) | 2016-03-04 | 2016-03-04 | Moving robot and controlling method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101897775B1 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102234642B1 (en) * | 2019-01-17 | 2021-04-01 | 엘지전자 주식회사 | a Moving robot and Controlling method for the moving robot |
KR102707921B1 (en) * | 2019-01-22 | 2024-09-23 | 삼성전자주식회사 | Robot and method for controlling thereof |
JPWO2020189462A1 (en) * | 2019-03-15 | 2021-11-18 | ヤマハ発動機株式会社 | Vehicle traveling on the default route |
KR20200130884A (en) * | 2019-04-30 | 2020-11-23 | 에브리봇 주식회사 | Mobile Robot |
KR20210094214A (en) | 2020-01-21 | 2021-07-29 | 삼성전자주식회사 | Electronic device and method for controlling robot |
KR102361982B1 (en) * | 2020-03-04 | 2022-02-11 | 엘지전자 주식회사 | Moving robot and method for controlling thereof |
KR102348963B1 (en) * | 2020-03-10 | 2022-01-11 | 엘지전자 주식회사 | Robot cleaner and Controlling method for the same |
KR20210130478A (en) | 2020-04-22 | 2021-11-01 | 삼성전자주식회사 | Electronic apparatus and controlling method thereof |
KR20220039101A (en) * | 2020-09-21 | 2022-03-29 | 삼성전자주식회사 | Robot and controlling method thereof |
CN113217071A (en) * | 2021-05-10 | 2021-08-06 | 中煤科工集团重庆研究院有限公司 | Automatic suction hood for downhole operation |
KR20230036447A (en) * | 2021-09-07 | 2023-03-14 | 삼성전자주식회사 | Robot and controlling method thereof |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100500831B1 (en) * | 2002-11-25 | 2005-07-12 | 삼성광주전자 주식회사 | Method calculating rotated angles of robot cleaner |
KR100738888B1 (en) * | 2005-10-27 | 2007-07-12 | 엘지전자 주식회사 | The Apparatus and Method for Controlling the Camera of Robot Cleaner |
JP2015056057A (en) * | 2013-09-12 | 2015-03-23 | トヨタ自動車株式会社 | Method of estimating posture and robot |
KR101705601B1 (en) * | 2014-05-30 | 2017-02-13 | 동명대학교 산학협력단 | Apparatus and method for estimating the location of autonomous robot based on three-dimensional depth information |
-
2016
- 2016-03-04 KR KR1020160026616A patent/KR101897775B1/en active IP Right Grant
Also Published As
Publication number | Publication date |
---|---|
KR20170103556A (en) | 2017-09-13 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |