
CN109358623A - Method for identifying the carpet offset of a moving robot, chip, and cleaning robot - Google Patents

Method for identifying the carpet offset of a moving robot, chip, and cleaning robot

Info

Publication number
CN109358623A
CN109358623A (application number CN201811240369.4A)
Authority
CN
China
Prior art keywords
robot
optical flow
offset
coordinate
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811240369.4A
Other languages
Chinese (zh)
Inventor
戴剑锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority: CN201811240369.4A
Publication: CN109358623A
Legal status: Pending

Links

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0227 Control of position or course in two dimensions specially adapted to land vehicles using mechanical sensing means, e.g. for sensing treated area
    • G05D1/0229 Control of position or course in two dimensions specially adapted to land vehicles using mechanical sensing means, e.g. for sensing treated area in combination with fixed guiding means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/027 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising inertial navigation means, e.g. azimuth detector

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Manipulator (AREA)

Abstract

The present invention discloses a method for identifying the carpet offset of a moving robot, together with a chip and a cleaning robot. At intervals of a first preset time, the robot's current position coordinates are computed by fusing sensor data; the robot's offset relative to a preset direction is then calculated from the relative position of the current location and the initial position, and the offsets are accumulated into an offset statistic. Dividing this statistic by the number of position samples collected within a second preset time yields an offset average, from which the degree to which the robot deviates from the preset direction is determined. The method improves the accuracy with which the direction and amplitude of carpet offset are detected.

Description

Method for identifying the carpet offset of a moving robot, chip, and cleaning robot
Technical field
The present invention relates to the field of robot detection and control, and in particular to a method for identifying the direction and amplitude of carpet offset during robot motion, together with a chip and a cleaning robot.
Background art
Robots navigating by inertial navigation return to their charging dock using a global grid map, which assumes that the global map is reasonably accurate. Many conventional autonomous robots, however, cannot properly or accurately determine their position and/or attitude and cannot properly control their motion; they therefore cannot be guaranteed to stay on a fixed path or reach a designated position and/or attitude, so the computed position of the robot becomes erroneous. For example, the trajectory of an autonomous cleaning device may be disturbed by carpet texture. The effect of carpet texture on the motion of an object can be referred to as carpet offset. Carpet offset can be represented by a carpet offset vector that has both an amplitude and a direction, and this vector can be regarded as a property of the carpet.
When a robot travels in a carpeted environment, its motion is driven not only by friction but also by the force the carpet exerts on it. Depending on the robot's motion relative to the carpet grain, the drive wheels can cause the carpet fibers to stand up or lie down. In particular, when the fibers lie down along the grain, the carpet pushes or guides the robot along the grain direction. As shown in Fig. 3, while the robot 1 on the left moves toward arrow direction C, its drive wheel A experiences the driving friction f11 together with an inward force F11 applied by the carpet fibers, so that the resultant F12 of f11 and F11 makes robot 1 deviate from direction C during its motion. Likewise, while the robot 2 on the right moves toward arrow direction C, its drive wheel B experiences the driving friction f21 together with an outward force F21 applied by the carpet fibers, so that the resultant F22 of f21 and F21 makes robot 2 deviate from direction C during its motion. Consequently, as the robot crosses a carpet, position estimation errors may accumulate over time, and the robot may fail to build an accurate environmental map, or fail to navigate the environment effectively, accurately and/or safely, and thus cannot be used to perform tasks such as vacuum cleaning.
The industry generally considers using an optical flow sensor to eliminate the influence of the carpet. Although an optical flow sensor guarantees the positional accuracy of the robot, it cannot eliminate the directionally anisotropic influence of the carpet on the regularity of the robot's motion.
Summary of the invention
To solve the above problems, the present invention provides a method for identifying the direction and amplitude of carpet offset during robot motion, together with a chip and a cleaning robot. The technical solution is as follows:
A method for identifying the carpet offset of a moving robot, in which the robot moves in a straight line across a carpet surface starting from an initial position, and all coordinates sensed by the robot are transformed into a global coordinate system. The method comprises:
Step S1: determine a preset direction along which the robot moves in a straight line on the carpet surface, the preset direction being the positive direction of a preset coordinate axis of the global coordinate system; at the same time record the robot's initial position coordinates and the initial time, and proceed to step S2.
Step S2: at intervals of a first preset time, fuse the data sensed by the optical flow sensor with the data sensed by the wheel encoder (code disc) at the same time, obtaining the robot's current position coordinates, which correspond to the actual distance traveled by the robot's drive wheels on the carpet, and proceed to step S3.
Step S3: from the relative position of the robot's current position coordinates and the initial position coordinates, compute the offset of the robot's current direction of motion relative to the preset direction, accumulate it into an offset statistic, and proceed to step S4; the offset is the perpendicular distance between the robot's current position and the line along the preset direction.
Step S4: judge whether the difference between the recorded current time and the initial time exceeds a second preset time; if so, proceed to step S5, otherwise return to step S2.
Step S5: from the time interval of the first preset time, compute the number of position samples collected within the second preset time, then divide the offset statistic by this sample count to obtain the offset average, which serves as the carpet offset, and proceed to step S6.
Step S6: determine from the offset average the state in which the robot's current direction of motion deviates from the preset direction.
The sign of the offset average corresponds to the coordinate-axis direction of the deviation in the global coordinate system, and the magnitude of the offset average determines the amplitude by which the robot's current direction of motion deviates from the preset direction. The offset, the offset statistic and the sample count are all initialized to zero at the initial time; the first preset time is the period of each fusion calculation; the second preset time is the detection window for deciding that the robot has experienced carpet offset; and both the initial position and the current position coordinates of the robot are global coordinates.
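The accumulate-then-average loop of steps S1 to S6 can be sketched as follows. This is a minimal illustration, not the patent's implementation: `fused_position` stands in for the step-S2 sensor fusion, and the preset direction is assumed to be the global +X axis (as in the patent's embodiment), so the perpendicular offset is simply the Y difference.

```python
# Sketch of the S1-S6 detection window, assuming the preset direction is +X.
T1 = 0.010  # first preset time: one fusion update (10 ms in the embodiment)
T2 = 0.500  # second preset time: one detection window (500 ms in the embodiment)

def detect_carpet_offset(fused_position, y0):
    """Average perpendicular offset from the +X line over one window.

    fused_position() returns the fused (x, y) global position (step S2);
    y0 is the initial Y coordinate recorded in step S1.
    """
    offset_sum = 0.0                       # offset statistic, zeroed at start
    samples = int(round(T2 / T1))          # sample count = T2 / T1 = 50
    for _ in range(samples):
        _x, y = fused_position()           # S2: current fused position
        offset_sum += y - y0               # S3: perpendicular distance, accumulated
    return offset_sum / samples            # S5: offset average = carpet offset
```

The sign of the returned average then encodes the drift direction and its magnitude the drift amplitude (step S6).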
Further, the robot also judges its deviation angle and direction on the carpet surface from the angular change of the gyroscope, i.e. the measured angle difference between the current position coordinates and the initial position coordinates.
Further, in step S2 the fusion calculation proceeds as follows. When the sensing data of the optical flow sensor are reliable, the image displacement obtained by the optical flow sensor in each first preset time is first converted into a displacement of the same dimension as the encoder output; the optical flow data are then cumulatively integrated over time, yielding the optical-flow offset coordinates of the sensor relative to its initial position; these coordinates are then translated according to the rigid connection between the optical flow sensor and the robot center, giving the machine-center coordinates at the current location, i.e. the robot's current position coordinates, which correspond to the actual distance traveled by the drive wheels on the carpet. When the optical flow data are unreliable, the pulse data sensed by the encoder in each first preset time are integrated over time and the result is used to update the machine-center coordinates, again obtaining the robot's current position coordinates; at the same time, the machine-center coordinates are translated according to the rigid connection between the optical flow sensor and the robot center, and the translated coordinates are used to update the optical-flow offset coordinates. The reliability of the optical flow data is judged from the interrupt signal of the sensor's built-in algorithm: when the interrupt signal output by the optical flow sensor is high, its data are reliable; when the interrupt signal is low, its data are unreliable.
Further, the rigid connection is the relative positional relationship between the optical-flow coordinate system of the optical flow sensor and the machine coordinate system of the robot center, including the distance between the position of the optical flow sensor and the robot center, and the angle between the line joining the optical flow sensor and the robot center and the preset coordinate axis of the machine coordinate system. The positive direction of the preset axis of the machine coordinate system is the robot's current direction of motion; the angle between this axis and the positive preset axis of the global coordinate system is computed from the gyroscope reading, and is the deviation angle of the robot's current location relative to the preset direction.
Further, in step S5 the sample count is the ratio of the second preset time to the first preset time.
A chip for storing a program, the program being used to control a robot to execute the above recognition method.
A cleaning robot for cleaning carpet surfaces, the cleaning robot having the above chip built in.
Compared with the prior art, the present invention first computes relatively reliable relative-offset coordinate data by fusing the encoder and optical flow sensor data, then completes the identification of carpet offset by accumulating the offsets at preset time intervals and averaging them. This improves the accuracy with which the robot identifies carpet offset and reduces the influence of sensor bias errors.
Brief description of the drawings
Fig. 1 is a schematic diagram of the structural model of the robot in an embodiment of the present invention;
Fig. 2 is a schematic diagram of the layout of the robot coordinate system, the optical-flow coordinate system and the global coordinate system at the current location in an embodiment of the present invention;
Fig. 3 is a schematic top view of the force analysis of the robot's drive wheels on the carpet surface in an embodiment of the present invention;
Fig. 4 is a conversion diagram between the robot coordinate system and the optical-flow coordinate system in an embodiment of the present invention;
Fig. 5 is a flowchart of a method for identifying the carpet offset of a moving robot provided by an embodiment of the present invention;
Fig. 6 is a flowchart of the fusion calculation method for the sensing data of the optical flow sensor and the encoder provided by an embodiment of the present invention.
Detailed description of embodiments
The technical solutions in the embodiments of the present invention will be described in detail below with reference to the drawings. It should be understood that the specific embodiments described below serve only to explain the present invention and are not intended to limit it.
In the description of the invention, it is to be understood that terms such as "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" indicate orientations or positional relationships based on those shown in the drawings; they are used merely to facilitate and simplify the description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be understood as limiting the invention.
It should be noted that relational terms such as "first" and "second" are used herein only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations.
Methods and systems for estimating offsets such as carpet offset are described. The embodiments of the present invention describe illustrative implementations in the context of a system or method for estimating the carpet offset experienced by a sweeping robot, but these exemplary implementations are applicable to other kinds of devices, such as mobile robotic devices that can traverse a carpet surface. It should be understood that the term carpet includes mats with texture or pile and other floor coverings. The direction-dependent force caused by the carpet texture acting on a moving object may affect the object's motion.
The robot carrier provided by the embodiment of the present invention is equipped with a gyroscope for detecting rotation angles, an odometer for detecting travel distance, and a sensor capable of detecting the distance to a wall; the wall-distance sensor may be an ultrasonic distance sensor, an infrared-intensity sensor, an infrared distance sensor, a physical-switch collision sensor, a capacitance- or resistance-change sensor, and so on. An optical flow sensor for detecting the robot's relative displacement coordinates is also mounted on the carrier. The mobile robot of the invention is shown in Fig. 1; Fig. 1 does not represent the true structure and appearance of the robot but is only a schematic, and the optical flow sensor is placed on the robot base (it can be at any position on the base). In Fig. 1, the base 4 of the mobile robot carries the fixed left drive wheel 11 and right drive wheel 12 that control the robot's direction of advance; the gyroscope 3 can be placed at any position on the robot's control mainboard 2, and the mainboard 2 may contain one or more gyroscopes for sensing the robot's rotation; the mainboard 2 processes the parameters of the relevant sensors and can output control signals to the robot's actuators. The optical flow module 7 can likewise be mounted at any position on the base 4, and the mobile robot also has a universal (caster) wheel 6. The left drive wheel 11 and right drive wheel 12 are each fitted with a wheel encoder (code disc) for detecting the rotational speed of the respective wheel; the lens of the optical flow sensor in module 7 faces parallel to the ground, and module 7 also carries an LED that is switched on or off automatically according to the ambient brightness: when the ground brightness is relatively low, the LED is turned on, and when the ambient brightness is relatively high, the LED is turned off.
It is to be understood that when the robot moves along the direction of the carpet grain, it may travel farther than the distance determined from the rotation of the drive-wheel encoders. Conversely, when the robot travels against the grain, over fibers standing up, it may travel a shorter distance than that determined from encoder rotation. In both cases, the actual distance traveled by the robot may differ from the distance measured by the encoders. Because the drive wheels are strongly affected by slippage when moving on carpet, an encoder is not strictly required and an inertial sensor is an option. Consequently, while the robot crosses a carpet, position estimation errors may accumulate over time; the robot may therefore fail to build an accurate environmental map, or fail to navigate the environment effectively, accurately and/or safely, and thus cannot be used to perform tasks such as vacuum cleaning.
The embodiment of the present invention provides a method for identifying the carpet offset of a moving robot, applied while the robot moves on a carpet surface. As shown in Fig. 5, the method includes:
Step S501: after the control mainboard 2 detects that the robot has moved onto a carpet surface, it controls the robot to move in a straight line on the carpet along a preset direction, where the preset direction is the positive direction of the X axis or Y axis of the global coordinate system YOX; at the same time it records the starting time and the robot's position coordinates, obtaining the robot's initial position coordinates and initial time. In the embodiment of the present invention, the robot is controlled to move in a straight line along the positive X direction of the global coordinate system YOX, so the positive X direction is the robot's expected direction of displacement. Since forces along the carpet grain cause the carpet-offset phenomenon during motion, the position coordinates and the recorded time variable must be updated continuously; then proceed to step S502.
Step S502: obtain the robot's current position coordinates by sensor fusion within the first preset time: at intervals of the first preset time, fuse the data sensed by the optical flow sensor with the data sensed by the encoder at the same time, obtaining the robot's current position coordinates, i.e. the robot center coordinates, which correspond to the actual distance traveled by the drive wheels on the carpet; then proceed to step S503. In the embodiment of the present invention, the first preset time is preferably set to 10 ms, which serves as the interval at which the sensors sample each datum. Specifically, while the robot moves along the positive X direction on the carpet surface, the sensing data of the optical flow sensor within each 10 ms are integrated over time and then transformed from the optical-flow coordinate system into the machine coordinate system, yielding the robot center coordinates, i.e. the global position RO of the robot center; the encoder data within each 10 ms are integrated over time and then transformed from the machine coordinate system into the optical-flow coordinate system, yielding the optical-flow offset position coordinates, i.e. the offset coordinates of the optical flow sensor.
Step S503: from the relative position of the robot's current position coordinates and the initial position coordinates, compute the offset of the robot's current direction of motion relative to the preset direction, and accumulate it into the offset statistic; then proceed to step S504. The offset is the perpendicular distance between the robot's current position and the line along the preset direction. Because the carpet force makes the robot rotate, the robot's actual displacement direction OR0 shown in Fig. 4 deviates from the expected displacement direction OM, so the offset of the current position coordinates relative to the expected positive X direction is yr - y0. The groups of offsets thus computed are then accumulated into the offset statistic.
Step S504: judge whether the difference between the recorded current time and the initial time exceeds the second preset time; if so, proceed to step S505, otherwise return to step S502. In the embodiment of the present invention, the second preset time is preferably set to 500 ms, the time taken to detect that the robot has experienced carpet offset. Within the second preset time, the force directions experienced by the left drive wheel 11 and right drive wheel 12 on the carpet surface can change because of the carpet texture, so the carpet offset direction can change continuously within the window and the offset alternates between positive and negative values. It is therefore necessary to collect multiple offsets and sum them cumulatively before judging the robot's true carpet offset direction within the second preset time. The second preset time is the detection window for deciding that the robot has experienced carpet offset.
Step S505: compute from the second preset time the number of position samples taken at intervals of the first preset time; then divide the offset statistic by the sample count to obtain the offset average, which serves as the carpet offset, and proceed to step S506. In the embodiment of the present invention the second preset time is preferably 500 ms and the first preset time is preferably 10 ms, so the number of position samples within the second preset time is 50; the 50 accumulated offsets are then averaged to obtain the offset average, i.e. the carpet offset. Because the robot's offset direction and amplitude on the carpet surface are unstable, the embodiment samples the position coordinates 50 times at 10 ms intervals and accumulates them, determining one carpet offset per 500 ms, which improves the robustness of the method. Furthermore, since the carpet offset is derived from the optical-flow coordinate data and 50 accumulated values, error variance may interfere; to improve the detection accuracy of the method, the offset statistic is therefore averaged, which keeps the whole data processing simple and makes it easy to obtain an accurate carpet offset.
Step S506: determine from the offset average the degree to which the robot's current direction of motion deviates from the preset direction. Correspondingly, the sign of the offset average corresponds to the coordinate-axis direction of the deviation in the global coordinate system, and the magnitude of the offset average determines the amplitude of the deviation.
As one embodiment, the robot also judges its deviation angle and direction on the carpet surface from the angular change of the gyroscope, i.e. the angle difference measured by the gyroscope between the current position coordinates and the initial position coordinates. In the embodiment of the present invention, as shown in Fig. 3, when the offset average is positive, it is determined that the robot drifts toward the positive Y direction of the global coordinate system, with an offset amplitude equal to the offset average; combined with the angular change measured by the gyroscope, the angle by which the robot drifts toward the positive Y direction is known, so the resultant vector of the carpet forces acting on the robot can be determined. When the offset average is negative, it is determined that the robot drifts toward the negative Y direction of the global coordinate system, with an offset amplitude equal to the absolute value of the offset average; combined with the angular change measured by the gyroscope, the angle of the drift toward the negative Y direction is known, again determining the resultant vector of the carpet forces acting on the robot. If the expected displacement direction is the positive X axis of the global coordinate system, the sign of the offset average corresponds to the coordinate-axis direction of the robot's deviation in the global coordinate system.
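The sign-to-direction rule above can be written as a small helper. This is a sketch; the dead-band `eps` is an assumption added here, since the source does not state how a near-zero average is treated.

```python
def classify_drift(offset_avg, eps=1e-6):
    """Map the offset average (preset direction = global +X) to a drift verdict.

    Positive average -> drift toward global +Y; negative -> toward global -Y;
    magnitude |offset_avg| is the drift amplitude in either case.
    """
    if offset_avg > eps:
        return "+Y"
    if offset_avg < -eps:
        return "-Y"
    return "none"   # assumed behaviour when the average is negligibly small
```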
It should be noted that if the robot's initial pose, the environment and the target are known, the navigation problem reduces to a global path-planning problem; the coordinates sensed by the robot's encoder and optical flow sensor must therefore be transformed into the global coordinate system before fusion, and the robot's final current position coordinates are position coordinates in the global coordinate system.
Specifically, the offset, the offset statistic and the sample count are all initialized to zero at the initial time. The first preset time is the update period for completing one fusion calculation; the second preset time is the detection window for deciding that the robot has experienced carpet offset; and both the initial position and the current position coordinates of the robot are global coordinates.
Specifically, the layout of the robot coordinate system, the optical-flow coordinate system and the global coordinate system is shown in Fig. 2. The robot coordinate system takes the robot center RO at the current location as its origin, with the robot's direction of advance at the current location as the positive R_X axis; it also includes an R_Y axis perpendicular to R_X. The center R0 of the robot coordinate system corresponds to the gyroscope 3 placed at the center of the control mainboard 2 at the robot center. The global coordinate system takes the robot's initial position as origin, the robot's direction of advance from the initial position as the positive X axis, and the direction perpendicular to X as the Y axis. The optical-flow coordinate system is a pixel coordinate system, with units different from those of the robot and global coordinate systems; it takes the center PO of the optical flow module 7 as origin, with mutually perpendicular P_X and P_Y axes as its coordinate axes. All three coordinate systems follow the right-hand rule; the robot coordinate system and the optical-flow coordinate system are both relative coordinate systems, whose origins change with the robot's current position. In the global coordinate system, the region to the left of the Y axis is the first quadrant, and rotating counterclockwise gives the second, third and fourth quadrants in turn; the absolute value of the angle by which the robot's direction of motion deviates from the preset direction is held at a preset value.
Specifically, the optical-flow sensor in the optical-flow module 7 continuously captures images of the ground surface at a given rate, and the control mainboard 2 of the robot then analyzes the pixels of the captured images. Since two adjacent images always share common features, the average motion of the surface features can be judged by comparing the position changes of these feature points. Then, based on the principles of constant gray level for the same pixel and consistent pixel velocity within the same image region, an optical-flow field equation is established and solved to obtain the velocities of the pixels, and an integral calculation is performed. The image displacement of the robot within each first preset time is thus computed by integrating the image feature information obtained by the optical-flow sensor. Because this image displacement is a value in the optical-flow coordinate system, its unit must be converted to the odometer unit; the image displacement is therefore converted into a displacement of the same dimension as the code-disc measurement.
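The integration and unit conversion described above can be sketched as follows. This is an illustrative sketch under assumed calibration numbers, not the patent's literal implementation; the conversion factor follows the idea stated later in step S603 (encoder distance per pulse period divided by the optical-flow offset over the same period):

```python
# Illustrative sketch: integrate per-frame optical-flow (dx, dy) pixel
# offsets over one "first preset time" window, then rescale pixel units
# to code-disc (wheel-encoder) distance units. All numbers are assumed.

def integrate_optical_flow(frame_offsets_px, px_to_mm):
    """Accumulate per-frame pixel offsets and convert to millimetres."""
    dx_px = sum(o[0] for o in frame_offsets_px)
    dy_px = sum(o[1] for o in frame_offsets_px)
    return dx_px * px_to_mm, dy_px * px_to_mm

# Assumed calibration: in one encoder pulse period the wheel moved 0.5 mm
# while the optical flow reported a 2.0-pixel offset -> 0.25 mm per pixel.
PX_TO_MM = 0.5 / 2.0

offsets = [(1.0, -0.5), (0.8, -0.4), (1.2, -0.6)]   # per-frame pixel offsets
dx_mm, dy_mm = integrate_optical_flow(offsets, PX_TO_MM)
print(dx_mm, dy_mm)
```

After this conversion the displacement is in the same dimension as the code-disc data, so the two sources can be fused directly.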
As an embodiment of the present invention, the rigid connection relationship between the optical-flow sensor and the robot center is the relative positional relationship between the optical-flow coordinate system of the optical-flow sensor and the robot coordinate system at the robot center. It includes the distance between the optical-flow sensor's position and the robot center, and the angle between the line connecting the optical-flow sensor's position to the robot center and the preset coordinate axis of the robot coordinate system. The positive direction of the preset coordinate axis of the robot coordinate system is the robot's current direction of motion; the angle between this axis and the positive preset coordinate axis of the global coordinate system is computed from the gyroscope readings and serves as the deviation angle of the robot's current position relative to the preset direction. As shown in Fig. 2 and Fig. 4, the relative positional relationship between the origin RO of the robot coordinate system and the origin PO of the optical-flow coordinate system constitutes the rigid connection between the optical-flow sensor and the inertial sensor: it includes the distance L between RO and PO, and the absolute value of the angle between segment PO-RO and the line of the R_X axis of the robot coordinate system, which is a fixed value. This relative positional relationship remains unchanged during the robot's motion, which is what forms the rigid connection. The physical location of the origin RO corresponds to the gyroscope 3 placed at the robot center, and the physical location of the origin PO corresponds to the optical-flow module 7.
As shown in Fig. 4, the coordinate conversion method based on the above rigid connection relationship is as follows. The robot center RO lies in the fourth quadrant of the global coordinate system. The optical-flow sensor in the optical-flow module 7 senses coordinates in the optical-flow coordinate system, which are transformed into the global coordinate system to obtain the first predicted position coordinate PO(xp4, yp4), located in the fourth quadrant; that is, the robot's current direction of motion deviates from the positive X axis toward the negative Y axis by an angle, which is the constant offset angle caused by the force the carpet exerts on the robot; the robot's rotation angle can be sensed by the gyroscope 3. Using trigonometric relations, the first predicted position coordinate is translated to the robot-center position according to the rigid connection relationship, yielding the second predicted position coordinate, i.e. the current position coordinate RO(xr4, yr4) of the robot center in the global coordinate system, which can be approximated by the example formula for the fourth quadrant:
The specific embodiment to which the above formula applies is as follows: the gyroscope 3 is located at the robot center, and the optical-flow module 7 is located at the lower right of the robot center. The optical-flow coordinate offset measured by the optical-flow sensor is converted by the aforementioned coordinate conversion method into the current position coordinate RO(xr4, yr4) of the robot center in the global coordinate system; the angle by which the body's center of gravity deviates from the desired position (xr4, 0) is the deviation angle described above.
It should be noted that there are also embodiments in which the robot center lies in the first, second, or third quadrant of the global coordinate system. In these embodiments, the gyroscope 3 is located at the robot center, the optical-flow module 7 is located at the lower right of the robot center, and the robot's desired direction of displacement is the positive X axis, i.e. the preset direction is the positive X axis.
In the embodiment in which the robot center R1 lies in the first quadrant of the global coordinate system: the first predicted position coordinate is P1(xp1, yp1). Using trigonometric relations, the first predicted position coordinate is translated according to the rigid connection relationship to the robot-center position, yielding the second predicted position coordinate, i.e. the current position coordinate R1(xr1, yr1) of the robot center in the first quadrant of the global coordinate system, approximated by trigonometric relations on the basis of the fourth-quadrant formula:
In the embodiment in which the robot center R2 lies in the second quadrant of the global coordinate system: the first predicted position coordinate is P2(xp2, yp2). Using trigonometric relations, the first predicted position coordinate is translated according to the rigid connection relationship to the robot-center position, yielding the second predicted position coordinate, i.e. the current position coordinate R2(xr2, yr2) of the robot center in the second quadrant of the global coordinate system, which can be approximated by the following example formula:
In the embodiment in which the robot center R3 lies in the third quadrant of the global coordinate system: the first predicted position coordinate is P3(xp3, yp3). Using trigonometric relations, the first predicted position coordinate is translated according to the rigid connection relationship to the robot-center position, yielding the second predicted position coordinate, i.e. the current position coordinate R3(xr3, yr3) of the robot center in the third quadrant of the global coordinate system, which can be approximated by the following example formula:
In addition, if the robot's desired direction of displacement is not the positive X axis, i.e. the preset direction is not the positive X axis, or the optical-flow module 7 is not located at the lower right of the robot center, the robot's center position coordinates are calculated by following the reasoning of the fourth-quadrant example formula together with the corresponding trigonometric relations. The inventive concept of the coordinate conversion method is the same in each case, so the other embodiments of the desired direction of displacement and of the position of the optical-flow module 7 are not repeated here.
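The quadrant-by-quadrant formulas above appear as figures in the original filing and are not reproduced here; under assumed geometry, the common idea can be sketched with a single rotation that covers all four quadrants. The function name, sign convention, and example numbers below are illustrative assumptions, not the patent's exact formulas:

```python
# Illustrative sketch: translate the optical-flow module's global-frame
# position PO to the robot-center position RO using the rigid mounting
# offset. With mounting distance L and mounting angle `alpha` (between
# segment PO-RO and the robot's R_X axis) fixed by construction, and
# heading `theta` read from the gyroscope, one rotation handles every
# quadrant without case analysis.

import math

def flow_to_center(xp, yp, L, alpha, theta):
    """Given PO = (xp, yp) in the global frame, return RO = (xr, yr).

    Assumes PO = RO + L * (cos(theta + alpha), sin(theta + alpha)).
    """
    xr = xp - L * math.cos(theta + alpha)
    yr = yp - L * math.sin(theta + alpha)
    return xr, yr

# Example: sensor mounted 0.1 m from the center at alpha = -45 degrees
# (lower right of the center), robot heading along +X (theta = 0).
xr, yr = flow_to_center(0.5, -0.2, 0.1, math.radians(-45), 0.0)
```

The inverse conversion used in step S607 below is the same expression with the signs of the L terms flipped.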
As an embodiment of the present invention, the optical-flow sensor improves the robot's positioning accuracy, but its sensed data are not always reliable, so a fusion calculation with the code-disc data is required. Specifically, when the interrupt signal output by the optical-flow sensor is high, the sensed data of the optical-flow sensor are reliable; when the interrupt signal is low, the sensed data are unreliable. The interrupt signal is the result of the optical-flow sensor's built-in algorithm processing the sensed measurement data, which is a prior-art technique and is not described further here.
The fusion calculation in step S502 includes the following, as shown in Fig. 6:
Step S601: the code-disc senses pulse data while the optical-flow sensor senses optical-flow data; then proceed to step S602.
Step S602: judge whether the sensed data of the optical-flow sensor are reliable; if so, proceed to step S603, otherwise proceed to step S606.
Step S603: convert the image displacement obtained by the optical-flow sensor within each first preset time into a displacement of the same dimension as the code-disc. Specifically, when updating map coordinates with the optical-flow data, the ratio of the distance value of a single pulse period of the code-disc to the relative-coordinate offset value measured by the optical-flow sensor within the same pulse period is taken as the unit conversion factor, and the optical-flow data are multiplied by this factor to obtain values in the unified unit. The sensed data of the optical-flow sensor within each first preset time are then accumulated over the time dimension to perform the integral calculation, yielding the optical-flow offset position coordinate of the optical-flow sensor relative to its initial position, which corresponds to the measurement currently output by the optical-flow sensor. Then proceed to step S604.
Step S604: according to the example formulas disclosed above for the rigid connection relationship between the optical-flow sensor and the robot center, i.e. the triangular geometric relationship constructed from the distance and angle between the robot coordinate system and the optical-flow coordinate system, translate the optical-flow offset position coordinate by the aforementioned coordinate conversion method to obtain the robot position coordinate, corresponding to the actual distance travelled by the robot's driving wheels on the carpet. Then proceed to step S605.
Step S605: update the coordinate data currently output by the code-disc with the robot position coordinate obtained in step S604, then return to step S601. Compared with the measurement output by the code-disc before fusion, the result of this fusion calculation is more reliable and stable.
Step S606: integrate the pulse data sensed by the code-disc over the time dimension to obtain the robot center coordinate; this coordinate data can be updated by the robot position coordinate the next time step S605 is entered. Then proceed to step S607. Since the code-disc records the robot's motion speed by the number of pulses generated per second, integrating the pulse data sensed within each first preset time over the time dimension yields the robot's current position coordinates, corresponding to the actual distance travelled by the robot's driving wheels on the carpet.
Step S607: update the coordinate data currently output by the code-disc with the result of the integral calculation in step S606, then proceed to step S608. Before this update, the machine center coordinate described in step S604 may be the integral-conversion result of the optical-flow sensor's data from its reliable stage (the measurement output while the optical-flow data were reliable), so the update operation guarantees the accuracy of the measured robot position coordinates. At the same time, according to the inverse of the example formulas disclosed for the rigid connection relationship between the optical-flow sensor and the robot center, i.e. the triangular geometric relationship constructed from the distance and angle between the robot coordinate system and the optical-flow coordinate system, inversely convert the machine center coordinate by the aforementioned coordinate conversion method to obtain the optical-flow sensor's current offset position coordinate.
Step S608: update the coordinate data currently output by the optical-flow sensor with the offset coordinates of the optical-flow sensor obtained in step S607, then return to step S601. The optical-flow offset position coordinate may be the result of accumulating the optical-flow sensor's data over the time dimension, but since there are situations in which those data are unreliable, the machine center coordinate obtained by integrating the code-disc pulse data in step S606 must be translation-converted, and the conversion result used to update the optical-flow offset position coordinate calculated in step S603, so as to improve the accuracy of the optical-flow sensor's data integration once its sensed data become reliable again.
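One cycle of the S601-S608 loop can be sketched as follows. This is a simplified illustration: the sensor interfaces, the per-cycle displacement inputs, and the rigid-mount parameters are assumed stand-ins, not the patent's literal code:

```python
# Illustrative sketch of one fusion cycle. When the optical-flow interrupt
# is high, the flow integral drives the pose and overwrites the encoder
# estimate (S603-S605); when it is low, the encoder integral drives the
# pose and is back-projected onto the flow frame so the flow estimate
# stays consistent for the next reliable stage (S606-S608).

import math

def fuse_step(flow_reliable, flow_delta_mm, encoder_delta_mm,
              pose, flow_pose, L, alpha, theta):
    """One fusion cycle; poses are (x, y) in the global frame, in mm."""
    if flow_reliable:
        # S603/S604: integrate flow, then translate to the robot center.
        fx = flow_pose[0] + flow_delta_mm[0]
        fy = flow_pose[1] + flow_delta_mm[1]
        cx = fx - L * math.cos(theta + alpha)
        cy = fy - L * math.sin(theta + alpha)
        return (cx, cy), (fx, fy)          # S605: flow result wins
    # S606/S607: integrate encoder pulses instead.
    cx = pose[0] + encoder_delta_mm[0]
    cy = pose[1] + encoder_delta_mm[1]
    # S608: back-project the center onto the flow frame origin.
    fx = cx + L * math.cos(theta + alpha)
    fy = cy + L * math.sin(theta + alpha)
    return (cx, cy), (fx, fy)

pose, flow_pose = (0.0, 0.0), (10.0, 0.0)   # L = 10 mm, alpha = 0, theta = 0
pose, flow_pose = fuse_step(False, (0.0, 0.0), (5.0, 0.0), pose, flow_pose,
                            10.0, 0.0, 0.0)
print(pose, flow_pose)  # (5.0, 0.0) (15.0, 0.0)
```

In the unreliable branch shown in the example call, the encoder advances the center by 5 mm and the flow estimate is re-seated 10 mm ahead of it, matching the rigid mounting offset.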
In this embodiment of the present invention, the data sensed in real time by the built-in optical-flow sensor and code-disc are judged for reliability, and according to the result of that judgment the sensed data of one of the two sensors are selected, transformed into the optical-flow coordinate system, and integrated, so as to obtain a more accurate actual distance travelled by the robot's driving wheels on the carpet and to reduce the error introduced by the force of the carpet offset.
A chip for storing a program, where the program is used to control the robot to execute the recognition method, thereby achieving intelligent cleaning by the robot on carpet surfaces and improving sweeping efficiency. Using the optical-flow sensor, the gyroscope, and the code-disc, the chip determines the initial position information (X1, Y1, θ1) of the straight line to be walked and the specific current position information (X2, Y2, θ2) during the robot's walking, and then judges whether the robot's walking deviates from the straight line by the difference between the angle θ1 in the initial position information and the angle θ2 in the current position information, together with the perpendicular distance from the current position to the straight line along the preset direction.
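The deviation judgment from (X1, Y1, θ1) and (X2, Y2, θ2) can be sketched as follows. The thresholds and function name are illustrative assumptions; the patent does not specify threshold values:

```python
# Illustrative sketch: judge straight-line deviation from the initial pose
# (x1, y1, th1) and current pose (x2, y2, th2), using the heading
# difference th2 - th1 and the perpendicular distance of (x2, y2) from
# the intended line through (x1, y1) along heading th1. Angles in degrees.

import math

def deviates(x1, y1, th1, x2, y2, th2,
             max_angle_deg=5.0, max_offset_mm=20.0):   # assumed thresholds
    d_angle = abs((th2 - th1 + 180.0) % 360.0 - 180.0)  # wrapped difference
    # Signed perpendicular distance of (x2, y2) from the intended line.
    th = math.radians(th1)
    offset = (y2 - y1) * math.cos(th) - (x2 - x1) * math.sin(th)
    return d_angle > max_angle_deg or abs(offset) > max_offset_mm

print(deviates(0, 0, 0, 100, 30, 2))   # True: 30 mm off the line
print(deviates(0, 0, 0, 100, 5, 1))    # False: within both thresholds
```

Either condition alone suffices to flag a deviation, matching the text's use of both the angle difference and the perpendicular distance.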
A robot assembled with the chip as its control chip needs only the detection data of the optical-flow sensor and the code-disc to judge whether the robot's walking has deviated, and can efficiently control the robot to correct the deviation according to the deviation values, thereby maintaining a good straight-line walking effect at a relatively low cost. Meanwhile, the data involved in deviation detection and correction are more concise and the data processing is correspondingly simpler, so no high-performance processor is required, further reducing the system's computing-resource requirements and the robot's hardware cost.
The above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features equivalently replaced, and that such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (7)

1. A recognition method for carpet drift of a robot in motion, wherein the robot moves in a straight line on a carpet surface from an initial position, and the coordinates sensed by the robot are required to be transformed into a global coordinate system, characterized in that the recognition method comprises:
Step S1: determining a preset direction in which the robot moves in a straight line on the carpet surface, the preset direction being the positive preset coordinate axis of the global coordinate system, while recording the robot's initial position coordinates and the initial time, and proceeding to step S2;
Step S2: at intervals of a first preset time, performing a fusion calculation on the data sensed by the optical-flow sensor and the data sensed by the code-disc at the same time, to obtain the robot's current position coordinates, corresponding to the actual distance travelled by the robot's driving wheels on the carpet, and proceeding to step S3;
Step S3: according to the relative positional relationship between the robot's current position coordinates and its initial position coordinates, calculating the offset of the robot's current direction of motion relative to the preset direction, then accumulating it to obtain an offset statistic, and proceeding to step S4; wherein the offset is the perpendicular distance between the robot's current position and the straight line along the preset direction;
Step S4: judging whether the difference between the recorded current time and the initial time is greater than a second preset time; if so, proceeding to step S5, otherwise returning to step S2;
Step S5: based on the first preset time as the sensing-data interval, calculating the sampling count of the robot's position coordinates within the second preset time, then dividing the offset statistic by the sampling count to obtain the offset average, taken as the carpet offset, and proceeding to step S6;
Step S6: determining, from the offset average, the state in which the robot's current direction of motion deviates from the preset direction, wherein the sign of the offset average relates to the coordinate-axis direction in the global coordinate system toward which the robot deviates, and the magnitude of the offset average determines the amplitude by which the robot's current direction of motion deviates from the preset direction;
wherein the offset, the offset statistic, and the sampling count are initialized to zero at the initial time; the first preset time is the time of each fusion calculation; the second preset time is the detection time for determining whether carpet offset has occurred for the robot; and the robot's initial position and current position coordinates are both world coordinates.
2. The recognition method according to claim 1, characterized in that it further comprises the robot judging its deviation angle and direction on the carpet surface according to the angle change of the gyroscope, i.e. the angle difference measured between the current position coordinates and the initial position coordinates.
3. The recognition method according to claim 1, characterized in that in step S2, the fusion calculation process comprises:
when the sensed data of the optical-flow sensor are reliable, first converting the image displacement obtained by the optical-flow sensor within each first preset time into a displacement of the same dimension as the code-disc, then cumulatively integrating the optical-flow sensor's data over the time dimension to obtain the optical-flow offset position coordinate of the optical-flow sensor relative to its initial position; then, according to the rigid connection relationship between the optical-flow sensor and the robot center, translating the optical-flow offset position coordinate to obtain the machine center coordinate at the current position, i.e. the robot's current position coordinates, corresponding to the actual distance travelled by the robot's driving wheels on the carpet;
when the sensed data of the optical-flow sensor are unreliable, integrating the pulse data sensed by the code-disc within each first preset time over the time dimension and updating the machine center coordinate with the calculated result, to obtain the robot's current position coordinates, corresponding to the actual distance travelled by the robot's driving wheels on the carpet; and at the same time, according to the rigid connection relationship between the optical-flow sensor and the robot center, translating the machine center coordinate and updating the optical-flow offset position coordinate with the translated coordinate;
wherein the reliability of the optical-flow sensor's data is judged from the interrupt signal of the optical-flow sensor's built-in algorithm: when the interrupt signal output by the optical-flow sensor is high, the sensed data of the optical-flow sensor are reliable; when the interrupt signal output by the optical-flow sensor is low, the sensed data of the optical-flow sensor are unreliable.
4. The recognition method according to claim 3, characterized in that the rigid connection relationship is the relative positional relationship between the optical-flow coordinate system of the optical-flow sensor and the robot coordinate system at the robot center, including the distance between the optical-flow sensor's position and the robot center, and the angle between the line connecting the optical-flow sensor's position to the robot center and the preset coordinate axis of the robot coordinate system; wherein the positive preset coordinate axis of the robot coordinate system is the robot's current direction of motion, and the angle between the positive preset coordinate axis of the robot coordinate system and the positive preset coordinate axis of the global coordinate system is computed from the gyroscope readings, serving as the deviation angle of the robot's current position relative to the preset direction.
5. The recognition method according to claim 1, characterized in that in step S5, the sampling count is the ratio of the second preset time to the first preset time.
6. A chip for storing a program, characterized in that the program is used to control a robot to execute the recognition method of any one of claims 1 to 5.
7. A cleaning robot, being a robot for cleaning carpet surfaces, characterized in that the chip of claim 6 is built into the cleaning robot.
CN201811240369.4A 2018-10-23 2018-10-23 A kind of recognition methods, chip and the clean robot of the offset of robot motion's carpet Pending CN109358623A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811240369.4A CN109358623A (en) 2018-10-23 2018-10-23 A kind of recognition methods, chip and the clean robot of the offset of robot motion's carpet


Publications (1)

Publication Number Publication Date
CN109358623A true CN109358623A (en) 2019-02-19

Family

ID=65346268



Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102314176A (en) * 2010-07-01 2012-01-11 德国福维克控股公司 Self-propelled device and method for orienting such a device
WO2015162812A1 (en) * 2014-04-21 2015-10-29 シャープ株式会社 Road surface detection sensor and autonomous driving device equipped with said road surface detection sensor
CN105717924A (en) * 2012-06-08 2016-06-29 艾罗伯特公司 Carpet drift estimation using differential sensors or visual measurements
CN105973240A (en) * 2016-07-15 2016-09-28 哈尔滨工大服务机器人有限公司 Conversion method of navigation module coordinate system and robot coordinate system
CN107728616A (en) * 2017-09-27 2018-02-23 广东宝乐机器人股份有限公司 The map creating method and mobile robot of mobile robot
CN108638053A (en) * 2018-04-03 2018-10-12 珠海市微半导体有限公司 A kind of detection method and its antidote of robot skidding


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112336258A (en) * 2019-08-09 2021-02-09 松下知识产权经营株式会社 Mobile robot, control method, and storage medium
US11630463B2 (en) 2019-08-09 2023-04-18 Panasonic Intellectual Property Management Co., Ltd. Mobile robot, control method, and storage medium
CN112336258B (en) * 2019-08-09 2023-12-15 松下知识产权经营株式会社 Mobile robot, control method, and storage medium
CN114641229A (en) * 2019-08-26 2022-06-17 苏州宝时得电动工具有限公司 Cleaning robot and control method thereof
CN110723484A (en) * 2019-09-26 2020-01-24 兰剑智能科技股份有限公司 Shuttle vehicle walking automatic deviation rectifying method and device, computer equipment and storage medium
CN111089595A (en) * 2019-12-30 2020-05-01 珠海市一微半导体有限公司 Detection data fusion method of robot, main control chip and robot
CN113625658A (en) * 2021-08-17 2021-11-09 杭州飞钛航空智能装备有限公司 Offset information processing method and device, electronic equipment and hole making mechanism
CN113625658B (en) * 2021-08-17 2022-12-06 杭州飞钛航空智能装备有限公司 Offset information processing method and device, electronic equipment and hole making mechanism
CN114355920A (en) * 2021-12-27 2022-04-15 深圳市银星智能科技股份有限公司 Method and device for controlling traveling direction, intelligent equipment and storage medium
CN114355920B (en) * 2021-12-27 2024-02-02 深圳银星智能集团股份有限公司 Control method and device for traveling direction, intelligent equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Address before: Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province

Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20190219