
CN115164882B - Laser distortion removal method, device and system and readable storage medium - Google Patents

Laser distortion removal method, device and system and readable storage medium

Info

Publication number
CN115164882B
CN115164882B (application CN202210826873.2A)
Authority
CN
China
Prior art keywords
laser
robot
error
distortion removal
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210826873.2A
Other languages
Chinese (zh)
Other versions
CN115164882A (en)
Inventor
刘心怡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Ubtech Technology Co ltd
Original Assignee
Shenzhen Ubtech Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ubtech Technology Co ltd filed Critical Shenzhen Ubtech Technology Co ltd
Priority to CN202210826873.2A
Publication of CN115164882A
Application granted
Publication of CN115164882B
Legal status: Active (current)
Anticipated expiration (date not listed)

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a laser distortion removal method, device, and system, and a readable storage medium. The method comprises the following steps: confirming, based on an inertial measurement device, whether the robot is currently in a tilted state; calculating the horizontal distance of the error laser projection of the robot when the robot is in a tilted state; and, based on the horizontal distance of the error laser projection, screening out the range of error laser data with a least squares method and filtering it out as error projection laser. With the method provided by the invention, the robot's attitude can be obtained in real time or periodically. When the robot is in a tilted state, the erroneous target positions are filtered by the least squares method according to the calculated horizontal distance of the error laser projection, that is, the error projection laser is filtered out. This prevents the robot from treating uneven ground as an obstacle during operation and thus failing to navigate, and it improves the robot's positioning accuracy and navigation planning capability.

Description

Laser distortion removal method, device and system and readable storage medium
Technical Field
The present invention relates to the field of laser technology, and in particular, to a method, an apparatus, a system, and a readable storage medium for removing laser distortion.
Background
Lidar sensors provide accurate ranging over a wide range and are widely used in robotics, for example for mapping, localization, and navigation.
Depending on their field of view, lidar sensors can be divided into 2D lidar and 3D lidar. 2D lidar sensors are widely used for mapping, localization, and navigation of indoor robots. Compared with a 3D lidar sensor, a 2D lidar has no height information and only produces a two-dimensional coordinate mapping of detected objects; it is mainly used to create two-dimensional maps and to detect obstacles.
However, when such a sensor is mounted low on a small robot, the robot's attitude is easily affected by the environment. On raised or uneven ground such as carpets, slopes, or floor tiles with decorative textured surfaces, the robot body tends to tilt. The lidar sensor then tilts with the body, and its laser is projected onto the nearby ground and mistakenly recognized as an obstacle, which seriously degrades the robot's positioning accuracy and navigation planning.
Disclosure of Invention
In view of this, the present invention provides a laser distortion removal method, comprising:
confirming whether the robot is currently in an inclined state or not based on an inertial measurement device;
calculating the horizontal distance of the error laser projection of the robot under the condition that the robot is in an inclined state;
And screening out the range of the error laser data by using a least square method based on the horizontal distance of the error laser projection, and filtering the range as error projection laser.
Preferably, the determining the current tilt state of the robot based on the inertial measurement unit includes:
Acquiring current pose data of the robot based on the inertial measurement device; the current pose data comprise pitch angles; wherein the pitch angle is the pitch angle of the inertial measurement unit;
Judging whether the pitch angle is larger than a preset angle threshold value or not;
If yes, judging that the robot is in an inclined state.
Preferably, the current pose data further comprises a vertical height of the inertial measurement device from the ground;
the calculating the horizontal distance of the error laser projection of the robot comprises:
and calculating the horizontal distance of the error laser projection according to the vertical height and the pitch angle.
Preferably, the horizontal distance of the error laser projection is calculated using the following formula: L = H / tan(θ), wherein L is the horizontal distance of the error laser projection, H is the vertical height, and θ is the pitch angle.
Preferably, the screening the range of the error laser data by using a least square method based on the horizontal distance of the error laser projection includes:
Acquiring a plurality of actual position points based on a polar coordinate system of the advancing direction of the robot based on the horizontal distance of the error laser projection;
and screening out the wrong point in the actual position points by using a least square method as the position of the wrong laser projection data.
Preferably, the obtaining the plurality of actual position points based on the polar coordinate system of the robot advancing direction based on the horizontal distance of the error laser projection includes:
and acquiring a plurality of actual position points in a rectangular area with the length L of the advancing direction of the robot based on the polar coordinate system.
Preferably, the step of using the least square method to screen out the wrong point of the actual position points as the position of the data of the wrong laser projection includes:
calculating a linear equation by using a least square method to obtain a theoretical value;
bringing the actual position point into the linear equation to obtain an actual value;
Judging whether the difference value between the actual value and the theoretical value is smaller than a preset comparison threshold value or not;
If yes, judging that the actual position point corresponding to the actual value is the wrong point, and taking the wrong point as the wrong target position.
In addition, in order to solve the above problems, the present application also provides a laser distortion removal apparatus, including:
the confirmation module is used for confirming whether the robot is in an inclined state currently or not based on the inertia measurement device;
the calculation module is used for calculating the horizontal distance of the error laser projection of the robot under the condition that the robot is in an inclined state;
and the filtering module is used for screening the range of the error laser data by using a least square method based on the horizontal distance of the error laser projection, and filtering the range as error projection laser.
In addition, in order to solve the above problems, the present application also provides a laser distortion removal system, including a memory for storing a laser distortion removal program, and a processor that runs the laser distortion removal program to cause the laser distortion removal system to execute the laser distortion removal method as described above.
In addition, in order to solve the above-mentioned problems, the present application also provides a computer-readable storage medium having stored thereon a laser distortion removal program which, when executed by a processor, implements the laser distortion removal method as described above.
The invention provides a laser distortion removal method, device, and system, and a readable storage medium, the laser distortion removal method comprising the following steps: confirming, based on an inertial measurement device, whether the robot is currently in a tilted state; calculating the horizontal distance of the error laser projection of the robot when the robot is in a tilted state; and, based on the horizontal distance of the error laser projection, screening out the range of error laser data with a least squares method and filtering it out as error projection laser. With the method provided by the invention, the robot's attitude can be obtained in real time or periodically. When the robot is in a tilted state, the erroneous target positions are filtered by the least squares method according to the calculated horizontal distance of the error laser projection, that is, the error projection laser is filtered out. This prevents the robot from treating uneven ground as an obstacle during operation and thus failing to navigate, and it improves the robot's positioning accuracy and navigation planning capability.
Drawings
FIG. 1 is a schematic diagram of a hardware operating environment according to an embodiment of the laser distortion removal method of the present invention;
FIG. 2 is a flow chart of the laser distortion removal method according to embodiment 1 of the present invention;
FIG. 3 is a schematic view of a laser channel when a robot tilts in embodiment 1 of the laser distortion removal method of the present invention;
FIG. 4 is a flow chart of the laser distortion removal method according to embodiment 2 of the present invention;
FIG. 5 is a flow chart of the laser distortion removal method according to embodiment 3 of the present invention;
FIG. 6 is a schematic diagram showing a calculation method of L when the robot tilts in embodiment 3 of the laser distortion removal method of the present invention;
FIG. 7 is a flowchart of the laser distortion removal method according to embodiment 4 of the present invention;
FIG. 8 is a schematic diagram showing the calculation of a linear equation in embodiment 4 of the laser distortion removal method of the present invention;
fig. 9 is a schematic block diagram of a laser distortion removal apparatus according to the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
Embodiments of the present invention are described in detail below, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Fig. 1 is a schematic structural diagram of a hardware operating environment of a terminal according to an embodiment of the present invention.
The laser distortion removal system provided by the invention may be a PC or a mobile terminal device such as a smartphone, tablet computer, or portable computer. The laser distortion removal system may include: a processor 1001 (e.g., a CPU), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to enable communication among these components. The user interface 1003 may include a display screen and an input unit such as a keyboard or remote control, and may optionally also include standard wired and wireless interfaces. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wi-Fi interface). The memory 1005 may be a high-speed RAM memory or a stable memory such as disk storage. The memory 1005 may optionally also be a storage device separate from the processor 1001. Optionally, the laser distortion removal system may also include RF (Radio Frequency) circuitry, audio circuitry, Wi-Fi modules, and the like. In addition, the laser distortion removal system may be provided with other sensors such as a gyroscope, barometer, hygrometer, thermometer, or infrared sensor, which are not described here.
Those skilled in the art will appreciate that the laser distortion removal system shown in fig. 1 is not limiting and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components. As shown in fig. 1, an operating system, a data interface control program, a network connection program, and a laser distortion removal program may be included in a memory 1005 as one type of computer-readable storage medium.
The invention provides a laser distortion removal method, device, and system, and a readable storage medium. The method confirms the state of the robot in real time from the inertial measurement device; when the robot is in a tilted state, the obtained horizontal distance of the error laser projection is used, together with a least squares computation, to filter out the unwanted error projection laser.
Example 1:
referring to fig. 2, embodiment 1 of the present invention provides a laser distortion removal method, including:
Step S100, confirming whether the robot is in an inclined state currently based on an inertial measurement device;
The inertial measurement device is an IMU (Inertial Measurement Unit), used to measure an object's three-axis attitude angles (or angular velocities) and acceleration. Gyroscopes and accelerometers are the primary elements of an IMU, and their accuracy directly affects the accuracy of the inertial system. Generally, the IMU should be mounted at the center of gravity of the object under measurement.
The robot may be a robot capable of moving in a plane within a certain range, and may include, for example, but not limited to, a floor sweeping robot, a floor mopping robot, a cleaning robot, a showroom robot, a sterilizing robot, and the like.
The tilted state is a state in which the robot body is inclined relative to the horizontal plane because the robot has encountered an uneven surface while running. For example, a sweeping robot sits low; when it advances onto a carpet, floor mat, or fiber fabric, travels over floor tiles with uneven surfaces, or crosses a small region of the floor that is itself inclined, the robot body tilts in that region. This is the tilted state.
This embodiment is aimed in particular at robots equipped with a 2D lidar. While the robot moves, the lidar projects laser toward the direction of travel. Because the sensor provides no height information, detected objects can only be given a two-dimensional coordinate mapping relative to the robot's plane of travel. If the robot encounters an uneven surface and enters a tilted state, two situations can occur (taking the lidar position as the reference) as the robot moves forward: in state A (see FIG. 3), the whole robot tilts forward and downward, for example with its rear end (relative to the direction of travel) lifted to a 30° angle with the horizontal plane, and the lidar projects onto the nearby ground; in state B, the whole robot tilts forward and upward, for example with its front end lifted to a 30° angle with the horizontal plane, and the lidar projects into the air.
In state A, the robot tilts forward and downward with its rear end lifted, so the laser that should be projected straight ahead in the horizontal plane can only reach the nearby ground. The system then misidentifies an obstacle that should not exist. This is the situation addressed in this embodiment.
State B can also be corrected by the algorithm in this embodiment.
Step S200, calculating the horizontal distance of the error laser projection of the robot under the condition that the robot is in an inclined state;
The horizontal distance of the error laser projection is the distance, when the robot is in the tilted state (state A), between the point where the laser still projected forward along its original path strikes the nearby ground and the point on the ground directly below the lidar; this is the horizontal distance of the laser projection.
Step S300, based on the horizontal distance of the error laser projection, the range of error laser data is screened out by using a least square method and is used as error projection laser and filtered.
The least square method is a mathematical tool widely used in many fields of data processing such as error estimation, uncertainty, system identification, prediction, and forecasting.
The calculation and screening are performed by the least squares method based on the horizontal distance of the error laser projection. For example, a least squares fit is computed, the predicted data are compared with the actual data to screen further, the laser paths or points in which distortion occurs are found, and the erroneous target positions are removed and filtered.
With the method provided in this embodiment, the robot's attitude can be acquired in real time or periodically. When the robot is in a tilted state, the erroneous target positions are filtered by the least squares method according to the calculated horizontal distance of the error laser projection, that is, the error projection laser is filtered out. This prevents the robot from treating uneven ground as an obstacle during operation and thus failing to navigate, and it improves the robot's positioning accuracy and navigation planning capability.
Example 2:
Referring to fig. 4, based on embodiment 1, embodiment 2 of the present invention provides a laser distortion removal method, wherein the step S100 of confirming the current tilt state of the robot based on the inertial measurement device includes:
step S110, acquiring current pose data of the robot based on the inertial measurement device; the current pose data comprise pitch angles; wherein the pitch angle is the pitch angle of the inertial measurement unit;
the inertial measurement unit is an IMU sensor unit, and one IMU sensor unit includes components such as an accelerometer and a gyroscope. For example, three single-axis accelerometers and three single-axis gyroscopes may be included, where the accelerometers detect acceleration signals of the object in the carrier coordinate system on three independent axes, and the gyroscopes detect angular velocity signals of the carrier relative to the navigation coordinate system, measure angular velocity and acceleration of the object in three-dimensional space, and calculate the pose of the object therefrom.
The position and posture data of the current robot, namely the pitch angle data of the robot, can be acquired in real time or in a fixed time through the inertial measurement device.
Step S120, judging whether the pitch angle is larger than a preset angle threshold value;
the obtained value of the pitch angle is compared with the angle threshold value at the time of judgment.
The pitch angle is the angle between the x axis of the robot body coordinate system and the horizontal plane. When the x axis of the body coordinate system lies above the XOY plane of the inertial coordinate system, the pitch angle is positive: the body tilts upward and no obstacle is falsely recognized. Otherwise the body tilts downward, the laser points toward the ground, and the pitch angle is negative. In other words, it is the angle between the ground and a vector parallel to the axis of the robot base pointing in the robot's direction of travel.
The angle threshold is a preset threshold for evaluating the state of the pose of the current robot. The threshold may be a number or a range of numbers. For example, the angle threshold may be-3 °.
And step S130, if yes, judging that the robot is in an inclined state.
In this embodiment, when the pitch angle obtained from the inertial measurement device exceeds the angle threshold, the robot is determined to be in a tilted state. The IMU data are acquired in real time or periodically and compared against the preset angle threshold to determine the robot's current attitude; this compensates for the 2D lidar's lack of height information and improves the accuracy of robot state recognition.
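As an illustration only, the tilt check of steps S110 to S130 could be sketched in Python as follows; the function and variable names are assumptions made for this sketch rather than anything taken from the patent, and the magnitude comparison is one possible reading of the "greater than a preset angle threshold" condition.

```python
# Minimal sketch of the tilt check (steps S110-S130); names are illustrative.
ANGLE_THRESHOLD_DEG = 3.0  # the description mentions -3 degrees as one example value


def is_tilted(pitch_deg: float, threshold_deg: float = ANGLE_THRESHOLD_DEG) -> bool:
    """Decide whether the robot counts as tilted from the IMU pitch angle.

    In this embodiment's convention a forward-down tilt gives a negative pitch,
    so the comparison is done on magnitudes (an assumption about the intended
    reading of the threshold test).
    """
    return abs(pitch_deg) > threshold_deg
```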
Example 3:
referring to fig. 5, based on embodiment 2, embodiment 3 of the present invention provides a laser distortion removal method. The current pose data further comprise the vertical height of the inertial measurement device from the ground.
In the step S200, calculating a horizontal distance of the error laser projection of the robot includes:
Step S210 calculates a horizontal distance of the error laser projection according to the vertical height and the pitch angle.
The vertical height is the vertical distance from the position of the lidar on the robot to the ground plane.
The horizontal distance of the error laser projection is the distance, measured along the ground, between the point where the forward-projected laser ray strikes the ground and the point directly below the lidar. If the laser is erroneously projected onto the ground, it lands at approximately this distance from the robot body.
The horizontal distance of the error laser projection can be calculated from the vertical height and the pitch angle.
Further, referring to fig. 6, in step S210 the horizontal distance of the error laser projection is calculated using the following formula: L = H / tan(θ), wherein L is the horizontal distance of the error laser projection, H is the vertical height, and θ is the pitch angle.
The value of L can thus be calculated with the tangent function.
In this embodiment, the vertical height H (i.e., the height of the sensor above the ground) and the pitch angle obtained from the inertial measurement device are combined through the tangent function to obtain the horizontal distance L of the error laser projection. The projection distance L is computed in real time or periodically whenever the robot is tilted and the laser is projected forward onto the ground; this compensates for the 2D lidar's lack of height information and improves the accuracy of robot state recognition.
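A minimal sketch of the distance computation of step S210, assuming the right-triangle geometry of FIG. 6 (lidar at vertical height H above the ground, beam tilted downward by the pitch angle); the function and parameter names are illustrative, not from the patent.

```python
import math


def error_projection_distance(height_m: float, pitch_rad: float) -> float:
    """Horizontal distance L from the body to where the tilted beam meets the ground.

    height_m: vertical height H of the lidar above the ground.
    pitch_rad: downward pitch angle theta in radians (magnitude is used).
    From tan(theta) = H / L it follows that L = H / tan(theta).
    """
    theta = abs(pitch_rad)
    if theta < 1e-6:
        return float("inf")  # no downward tilt, so the beam never reaches the ground
    return height_m / math.tan(theta)
```

For instance, with the sensor 0.1 m above the ground and a 3° downward tilt, L = 0.1 / tan(3°) ≈ 1.9 m, so false ground returns would be expected roughly that far ahead of the robot.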
Example 4:
referring to fig. 7, based on embodiment 3, embodiment 4 of the present invention provides a laser distortion removal method. In step S300, the screening the range of the error laser data by using the least square method based on the horizontal distance of the error laser projection includes:
Step S310, based on the horizontal distance of the error laser projection, acquiring a plurality of actual position points based on a polar coordinate system of the advancing direction of the robot;
The polar coordinate system (polar coordinates) refers to a coordinate system consisting of a pole, a polar axis, and a polar diameter in a plane. A point O is taken on the plane and is called a pole. A ray Ox, called polar axis, is directed from O. And then a unit length is determined, and the prescribed angle is normally positive in the anticlockwise direction. Thus, the position of any point P on the plane can be determined by the length ρ of the line segment OP and the angle θ from Ox to OP, the ordered pair (ρ, θ) is called the polar coordinate of the point P, and is denoted as P (ρ, θ); ρ is the polar diameter of the point P and θ is the polar angle of the point P.
Taking the horizontal distance L of the error laser projection as the length, the actual position points in the robot's direction of travel are acquired in the polar coordinate system.
An actual position point is a point actually measured by the lidar from the laser beams projected ahead of the robot while it travels.
And step S320, the error point in the actual position points is selected by using a least square method as the position of the error laser projection data.
Further, the step S310, based on the horizontal distance of the error laser projection, obtains a plurality of actual position points based on a polar coordinate system of the robot advancing direction, including:
Step S311, acquiring a plurality of the actual position points in a rectangular area with a length L in the forward direction of the robot based on the polar coordinate system.
Since the robot's direction of travel is already determined, a virtual rectangular area is set for more accurate data screening and processing: the rectangle starts at the robot's body and extends in the direction of travel for the length L, the horizontal distance of the error laser projection.
All the actual position points in the advancing direction and within the rectangular area are obtained as described above.
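As an illustration of step S311, the sketch below converts a 2D scan (polar ranges and angles, with angle 0 taken as the robot's direction of travel) into body-frame Cartesian points and keeps those inside a forward rectangle of length L. The half-width parameter is an assumption; the patent does not state how wide the rectangle is, and all names are illustrative.

```python
import math
from typing import List, Tuple


def points_in_forward_rect(
    ranges_m: List[float],
    angles_rad: List[float],
    length_l_m: float,
    half_width_m: float = 0.3,  # assumed rectangle half-width, not given in the patent
) -> List[Tuple[float, float]]:
    """Keep scan points inside the rectangle ahead of the robot (step S311).

    The scan is in the body-centred polar frame with angle 0 pointing in the
    direction of travel; x is forward and y is lateral.
    """
    points = []
    for r, a in zip(ranges_m, angles_rad):
        x = r * math.cos(a)
        y = r * math.sin(a)
        if 0.0 <= x <= length_l_m and abs(y) <= half_width_m:
            points.append((x, y))
    return points
```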
Further, referring to fig. 8, the step S320 of screening out the error point from the actual position points by using the least square method as the position of the error laser projection data includes:
step S321, calculating a linear equation by using a least square method to obtain a theoretical value;
the step of filtering corresponds to a process of converting the point projected by the laser into a polar coordinate system.
The linear equation y=ax+b in the rectangular region can be calculated by the least square method, and the values of a and b can be obtained. Wherein x and y are the corresponding coordinates of the theoretical value.
Step S322, the actual position point is brought into the linear equation to obtain an actual value;
The points projected by the laser in the forward direction, i.e. all the actual position points, are substituted into the linear equation to obtain the actual coordinates corresponding to each actual position point; these actual coordinates are the actual values.
Step S323, judging whether the difference between the actual value and the theoretical value is smaller than a preset comparison threshold;
the comparison threshold is a preset threshold for evaluating the difference between the theoretical value and the actual value. For example, it may be 0.2m.
And step S324, if yes, judging that the actual position point corresponding to the actual value is an error point, and taking the error point as the error target position.
If the difference between the actual value and the theoretical value is smaller than the comparison threshold, the actual value is close to the theoretical value, which indicates that the corresponding actual position point lies on the erroneously projected laser line; it is an erroneous target position and therefore needs to be filtered out.
For example, referring to fig. 3, suppose the laser sensor is mounted at height H above the ground (a machine characteristic). The IMU (inertial measurement unit) data are read and the pitch angle θ is obtained. If θ exceeds the preset angle threshold, the laser may be projected obliquely onto the ground. The approximate distance between the laser error projection and the robot is then obtained from L = H / tan(θ), and the possibly erroneous laser data are screened out according to this distance. All possibly erroneous laser data are resolved into corresponding x, y coordinates in the body coordinate frame. Fitting a·x + b = y by least squares yields the most probable a and b. Each possibly erroneous data point is brought into this equation: if the difference between the predicted and actual values of y is too large, the data point is not on the straight line of the laser error projection; otherwise it is on that line. Finally, the data on the laser error projection line are filtered out.
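A minimal sketch of steps S321 to S324 under the same reading as the example above: fit y = a·x + b to the candidate points by least squares, then treat points whose measured y lies within the comparison threshold of the fitted value as lying on the erroneous projection line. Only the standard library is used; the 0.2 m threshold is the example value mentioned earlier, and all names are illustrative.

```python
from typing import List, Tuple


def filter_error_points(
    points: List[Tuple[float, float]],
    compare_threshold_m: float = 0.2,  # example comparison threshold from the description
) -> Tuple[List[Tuple[float, float]], List[Tuple[float, float]]]:
    """Split candidate points into (kept, filtered_out) following steps S321-S324."""
    n = len(points)
    if n < 2:
        return list(points), []
    # Step S321: least-squares fit of y = a*x + b over the candidate points.
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    denom = n * sxx - sx * sx
    if abs(denom) < 1e-12:
        return list(points), []  # degenerate (near-vertical) line; skip filtering
    a = (n * sxy - sx * sy) / denom
    b = (sy - a * sx) / n
    kept, filtered = [], []
    for x, y in points:
        predicted = a * x + b  # theoretical value (step S322 compares against this)
        if abs(y - predicted) < compare_threshold_m:  # step S323
            filtered.append((x, y))  # on the erroneous projection line (step S324)
        else:
            kept.append((x, y))
    return kept, filtered
```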
In summary, the technical problem to be solved in this embodiment is: the problem that a small service robot can not be planned because the robot detects wrong obstacles to block a path due to the fact that the laser is wrongly projected onto the ground because of inclination when the small service robot navigates through the inclined ground is solved.
In the present embodiment, by adopting the method provided in the present embodiment, a simple and effective detection of the inclination of the robot based on the mobile robot apparatus is proposed, and the laser light that is erroneously projected to the ground due to the inclination is filtered; the method adopts a least square method for detecting the filtered laser; the method can be used for slope, mobile robot, drawing, path planning and control.
In the embodiment, whether the robot tilts or not is detected by reading data of the inertial measurement device; calculating the approximate position of the laser error projected onto the ground according to the height of the robot laser radar sensor and the inclination angle of the machine; and calculating an equation of a point which is erroneously projected onto the ground according to a least square method, and filtering distortion points according to the equation, so that the robot cannot be influenced by an inclined state caused by uneven ground in the running process, and the positioning precision of the robot in navigation and the navigation planning capacity are improved.
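Tying the sketches above together, a hypothetical per-scan routine might look like the following; the function name, the 0.1 m lidar height, and the overall flow are assumptions made for illustration rather than values or interfaces taken from the patent.

```python
import math


def erroneous_ground_points(pitch_rad, ranges_m, angles_rad, lidar_height_m=0.1):
    """Return the (x, y) points judged to be erroneous ground hits for one scan.

    The caller would drop these points from the scan before it is used for
    mapping, localisation, and path planning. Relies on the helper functions
    sketched in the earlier examples.
    """
    if not is_tilted(math.degrees(pitch_rad)):
        return []  # not tilted: nothing to filter
    l_dist = error_projection_distance(lidar_height_m, pitch_rad)
    candidates = points_in_forward_rect(ranges_m, angles_rad, l_dist)
    _kept, filtered = filter_error_points(candidates)
    return filtered
```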
In addition, referring to fig. 9, the present invention also provides a laser distortion removal apparatus, including:
a confirmation module 10 for confirming whether the robot is currently in a tilted state based on the inertial measurement unit;
A calculation module 20 for calculating a horizontal distance of an erroneous laser projection of the robot in a case where the robot is in a tilted state;
And a filtering module 30, configured to screen out the range of the error laser data as the error projection laser by using a least square method based on the horizontal distance of the error laser projection, and filter the range.
In addition, the invention also provides a laser distortion removal system, which comprises a memory and a processor, wherein the memory is used for storing a laser distortion removal program, and the processor runs the laser distortion removal program to enable the laser distortion removal system to execute the laser distortion removal method.
In addition, the present invention also provides a computer-readable storage medium having stored thereon a laser distortion removal program which, when executed by a processor, implements the laser distortion removal method as described above.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) as described above, comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present invention. The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (7)

1. A laser distortion removal method, comprising:
Acquiring current pose data of the robot based on an inertial measurement device; the current pose data comprise a pitch angle and the vertical height of the inertial measurement device from the ground; wherein the pitch angle is the pitch angle of the inertial measurement unit;
Judging whether the pitch angle is larger than a preset angle threshold value or not;
if yes, judging that the robot is in an inclined state;
under the condition that the robot is in an inclined state, calculating to obtain the horizontal distance of the error laser projection according to the vertical height and the pitch angle; the horizontal distance of the erroneous laser projection is calculated using the following formula:
L = H / tan(θ), wherein L is the horizontal distance of the error laser projection, H is the distance of the position of the laser radar in the robot in the vertical direction from the horizontal plane, and θ is the pitch angle;
And screening out the range of the error laser data by using a least square method based on the horizontal distance of the error laser projection, and filtering the range as error projection laser.
2. The laser distortion removal method of claim 1, wherein the screening out the range of erroneous laser data using a least squares method based on the horizontal distance of the erroneous laser projections comprises:
Acquiring a plurality of actual position points based on a polar coordinate system of the advancing direction of the robot based on the horizontal distance of the error laser projection;
and screening out the wrong point in the actual position points by using a least square method as the position of the wrong laser projection data.
3. The laser distortion removal method of claim 2, wherein the obtaining a plurality of actual position points based on a polar coordinate system of the robot forward direction based on the horizontal distance of the erroneous laser projection comprises:
and acquiring a plurality of actual position points in a rectangular area with the length L of the advancing direction of the robot based on the polar coordinate system.
4. The laser distortion removal method as set forth in claim 2, wherein the screening out the erroneous point among the actual position points as the position where the erroneous laser projection data is located by using a least square method includes:
calculating a linear equation by using a least square method to obtain a theoretical value;
bringing the actual position point into the linear equation to obtain an actual value;
Judging whether the difference value between the actual value and the theoretical value is smaller than a preset comparison threshold value or not;
If yes, judging that the actual position point corresponding to the actual value is the wrong point, and taking the wrong point as the position of the wrong laser projection data.
5. A laser distortion removal apparatus, comprising:
the confirmation module is used for acquiring current pose data of the robot based on the inertial measurement device; the current pose data comprise a pitch angle and the vertical height of the inertial measurement device from the ground; wherein the pitch angle is the pitch angle of the inertial measurement unit; judging whether the pitch angle is larger than a preset angle threshold value or not; if yes, judging that the robot is in an inclined state;
the calculation module is used for calculating the horizontal distance of the error laser projection according to the vertical height and the pitch angle under the condition that the robot is in an inclined state; the horizontal distance of the erroneous laser projection is calculated using the following formula:
L = H / tan(θ), wherein L is the horizontal distance of the error laser projection, H is the distance of the position of the laser radar in the robot in the vertical direction from the horizontal plane, and θ is the pitch angle;
and the filtering module is used for screening the range of the error laser data by using a least square method based on the horizontal distance of the error laser projection, and filtering the range as error projection laser.
6. A laser distortion removal system comprising a memory for storing a laser distortion removal program and a processor that runs the laser distortion removal program to cause the laser distortion removal system to perform the laser distortion removal method of any of claims 1-4.
7. A computer-readable storage medium, wherein a laser distortion removal program is stored on the computer-readable storage medium, which when executed by a processor, implements the laser distortion removal method according to any one of claims 1 to 4.
CN202210826873.2A 2022-07-13 2022-07-13 Laser distortion removal method, device and system and readable storage medium Active CN115164882B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210826873.2A CN115164882B (en) 2022-07-13 2022-07-13 Laser distortion removal method, device and system and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210826873.2A CN115164882B (en) 2022-07-13 2022-07-13 Laser distortion removal method, device and system and readable storage medium

Publications (2)

Publication Number Publication Date
CN115164882A CN115164882A (en) 2022-10-11
CN115164882B (en) 2024-10-22

Family

ID=83492629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210826873.2A Active CN115164882B (en) 2022-07-13 2022-07-13 Laser distortion removal method, device and system and readable storage medium

Country Status (1)

Country Link
CN (1) CN115164882B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110782465A (en) * 2019-12-30 2020-02-11 中智行科技有限公司 Ground segmentation method and device based on laser radar and storage medium
CN111435163A (en) * 2020-03-18 2020-07-21 深圳市镭神智能系统有限公司 Ground point cloud data filtering method and device, detection system and storage medium
CN117890878A (en) * 2023-12-13 2024-04-16 深圳市优必选科技股份有限公司 Filtering method and device for two-dimensional laser point cloud, electronic equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101547940B1 (en) * 2014-12-17 2015-08-28 가톨릭관동대학교산학협력단 An error correction system for data of terrestrial LiDAR on the same plane and the method thereof
CN107390679B (en) * 2017-06-13 2020-05-05 合肥中导机器人科技有限公司 Storage device and laser navigation forklift
CN108152831B (en) * 2017-12-06 2020-02-07 中国农业大学 Laser radar obstacle identification method and system
CN113093218A (en) * 2021-05-14 2021-07-09 汤恩智能科技(苏州)有限公司 Slope detection method, drive device, and storage medium
CN114488183A (en) * 2021-12-29 2022-05-13 深圳优地科技有限公司 Obstacle point cloud processing method, device and equipment and readable storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110782465A (en) * 2019-12-30 2020-02-11 中智行科技有限公司 Ground segmentation method and device based on laser radar and storage medium
CN111435163A (en) * 2020-03-18 2020-07-21 深圳市镭神智能系统有限公司 Ground point cloud data filtering method and device, detection system and storage medium
CN117890878A (en) * 2023-12-13 2024-04-16 深圳市优必选科技股份有限公司 Filtering method and device for two-dimensional laser point cloud, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN115164882A (en) 2022-10-11

Similar Documents

Publication Publication Date Title
CN108290294B (en) Mobile robot and control method thereof
CN111928811B (en) Ground detection method, device, equipment and storage medium
KR20230050396A (en) Obstacle detection method, device, autonomous walking robot and storage medium
JP7082545B2 (en) Information processing methods, information processing equipment and programs
US20220036574A1 (en) System and method for obstacle avoidance
US20200108499A1 (en) Localization and Mapping Using Physical Features
US8903589B2 (en) Method and apparatus for simultaneous localization and mapping of mobile robot environment
KR101739996B1 (en) Moving robot and simultaneous localization and map-buliding method thereof
US9625912B2 (en) Methods and systems for mobile-agent navigation
US7752008B2 (en) Method and apparatus for determining position and orientation
US20160245918A1 (en) Directed registration of three-dimensional scan measurements using a sensor unit
US8467612B2 (en) System and methods for navigation using corresponding line features
EP3155369B1 (en) System and method for measuring a displacement of a mobile platform
CN113175925B (en) Positioning and navigation system and method
CN110736456A (en) Two-dimensional laser real-time positioning method based on feature extraction in sparse environment
CN111990930A (en) Distance measuring method, device, robot and storage medium
CN115436955A (en) Indoor and outdoor environment positioning method
JP2021077003A (en) Travel-through propriety determination device and control device for moving body
CN115164882B (en) Laser distortion removal method, device and system and readable storage medium
JP4116116B2 (en) Ranging origin recognition device for moving objects
CN112731451A (en) Method and system for detecting ground obstacle based on laser radar
CN115718487A (en) Self-moving equipment pose determining method and device, self-moving equipment and storage medium
CN114911223A (en) Robot navigation method and device, robot and storage medium
CN111897337A (en) Obstacle avoidance control method and control system for robot walking along edge
JP2019148456A (en) Calculation device, self-location calculation method and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant