
CN112987728A - Robot environment map updating method, system, equipment and storage medium - Google Patents


Info

Publication number
CN112987728A
CN112987728A
Authority
CN
China
Prior art keywords
robot
environment
map
grid
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110175609.2A
Other languages
Chinese (zh)
Inventor
颜炳姜
朱小康
郑贵刚
曾志威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Smartguy Intelligent Equipment Co ltd
Conprofe Technology Group Co Ltd
Smartguy Intelligent Equipment Co Ltd Guangzhou Branch
Original Assignee
Smartguy Intelligent Equipment Co ltd
Conprofe Technology Group Co Ltd
Smartguy Intelligent Equipment Co Ltd Guangzhou Branch
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smartguy Intelligent Equipment Co ltd, Conprofe Technology Group Co Ltd, Smartguy Intelligent Equipment Co Ltd Guangzhou Branch filed Critical Smartguy Intelligent Equipment Co ltd
Priority to CN202110175609.2A priority Critical patent/CN112987728A/en
Publication of CN112987728A publication Critical patent/CN112987728A/en
Withdrawn legal-status Critical Current

Classifications

    • G: PHYSICS
        • G05: CONTROLLING; REGULATING
            • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
                • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
                    • G05D 1/02: Control of position or course in two dimensions
                        • G05D 1/021: … specially adapted to land vehicles
                            • G05D 1/0212: … with means for defining a desired trajectory
                                • G05D 1/0214: … in accordance with safety or protection criteria, e.g. avoiding hazardous areas
                                • G05D 1/0221: … involving a learning process
                            • G05D 1/0231: … using optical position detecting means
                                • G05D 1/0234: … using optical markers or beacons
                                    • G05D 1/0236: … using optical markers or beacons in combination with a laser
                                • G05D 1/0238: … using obstacle or wall sensors
                                    • G05D 1/024: … using obstacle or wall sensors in combination with a laser
                                • G05D 1/0246: … using a video camera in combination with image processing means
                                    • G05D 1/0251: … extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
                            • G05D 1/0255: … using acoustic signals, e.g. ultra-sonic signals
                            • G05D 1/0276: … using signals provided by a source external to the vehicle
                                • G05D 1/0278: … using satellite positioning signals, e.g. GPS

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a robot environment map updating method, system, equipment and storage medium. The method comprises the following steps: acquiring sensed environment measurement data; acquiring pose data of the robot; updating a pre-stored environment map according to the environment measurement data and the pose data; performing obstacle-passability detection within a set range centered on the robot according to the updated environment map to obtain an obstacle-passability detection result; and generating a local environment reachability map from the obstacle-passability detection result and the updated environment map, the local environment reachability map carrying obstacle-passable and obstacle-impassable marks. By locally updating the environment map from the environment measurement data and the robot's pose data, then performing obstacle-passability detection and marking passable and impassable obstacles in the map, the method effectively avoids losing terrain and obstacle detail while the robot senses its environment and improves the robot's navigation safety.

Description

Robot environment map updating method, system, equipment and storage medium
Technical Field
The present invention relates to the field of robot navigation technologies, and in particular, to a method, a system, a device, and a storage medium for updating an environment map of a robot.
Background
Autonomy has long been a pursued goal in the field of mobile robots. As the application scenarios of mobile robots grow broader and their working environments more complex, the demands on their autonomy keep rising. As the underlying support for autonomous behavior, a mobile robot must be able to effectively sense and represent a complex three-dimensional environment while still avoiding obstacles and operating safely.
At present, a mobile robot's representation of terrain and obstacles is mainly realized in flat, structured terrain by converting the obstacles in the sensed environment information into impassable points of a two-dimensional grid. A technical scheme based on a two-dimensional grid map can only model one section plane of the space, so it loses much of the detailed information of the terrain and the obstacles; if an obstacle does not lie in the map's sensing plane, the actual obstacle is lost altogether, and the mobile robot may collide with it. Alternatively, the robot senses the distance to an obstacle directly through a sensor and makes an emergency stop when a collision is imminent. With either scheme the mobile robot's perception of the local environment is limited, and it cannot handle tasks in unstructured environments with complex terrain.
Disclosure of Invention
In view of the above problems, an object of the present invention is to provide a method, a system, a device, and a storage medium for updating an environment map of a robot that avoid losing terrain and obstacle detail while the robot senses its environment and improve the robot's navigation safety.
In a first aspect, an embodiment of the present invention provides an environment map updating method for a robot, including:
acquiring sensed environmental measurement data;
acquiring pose data of the robot;
updating a pre-stored environment map according to the environment measurement data and the pose data;
carrying out passable detection on the obstacles in the set range taking the robot as the center according to the updated environment map to obtain a passable obstacle detection result;
generating a local environment reachability map according to the barrier passable detection result and the updated environment map; wherein the local environment accessibility map has an obstacle passable mark and an obstacle impassable mark.
As an improvement of the above solution, the updating a pre-stored environment map according to the environment measurement data and the pose data includes:
locally updating the environment map according to the environment measurement data, wherein the environment map is an elevation grid map;
and performing incremental updating on the environment map after local updating according to the pose data to obtain an extended elevation grid map.
As an improvement of the above solution, before the locally updating the environment map according to the environment measurement data, the method further includes:
performing point cloud conversion on the environment measurement data to obtain point cloud data;
filtering the point cloud data;
and performing down-sampling processing on the filtered point cloud data.
As an improvement of the above solution, the locally updating the environment map according to the environment measurement data includes:
acquiring the distance from the distance sensor to any measuring point in a real scene according to the environment measuring data; wherein the distance sensor is to sense the environmental measurement data;
performing coordinate conversion on the obtained distance to obtain a height measurement value corresponding to each measurement point in a first coordinate system; wherein the first coordinate system is a coordinate system centered on the distance sensor;
mapping the height measurement value to a corresponding height scalar measurement value in the environment map according to a preset projection matrix;
calculating a variance corresponding to the height measurement value according to the height measurement value and a preset coordinate transformation matrix;
fusing the height measurement value and the corresponding variance with a height scalar measurement value in the environment map and the corresponding variance to obtain a fused height measurement value and a fused variance;
calculating a mahalanobis distance according to the height measurement value and the corresponding height scalar measurement value;
determining a final height update value for each cell grid in the environment map based on the mahalanobis distance, the fused height measurement value, the height measurement value, and the height scalar measurement value;
and determining a final variance updating value of each unit grid in the environment map according to the Mahalanobis distance, the fusion variance, the variance corresponding to the height measurement value and the variance corresponding to the height scalar measurement value.
As an improvement of the above solution, the calculating a variance corresponding to the height measurement value according to the height measurement value and a preset coordinate transformation matrix includes:
respectively calculating the Jacobian of the distance sensor and the rotating Jacobian of a first coordinate system according to the height measurement value and a preset coordinate transformation matrix; wherein the first coordinate system and the second coordinate system are converted by the coordinate transformation matrix, and the second coordinate system is a coordinate system centered by the robot;
and calculating the variance corresponding to the height measurement value according to the Jacobian of the distance sensor and the rotating Jacobian of the first coordinate system.
As an improvement of the above scheme, the performing incremental updating on the locally updated environment map according to the pose data to obtain an extended elevation grid map includes:
under the condition that the robot moves, according to the pose data, the third coordinate system at the current moment is overlapped into the third coordinate system at the previous moment to obtain a map common reference coordinate system;
the pose data comprise relative translation vectors and rotation of the third coordinate system and the second coordinate system at the current moment, and translation vectors and rotation of the robot from the last moment to the current moment in the second coordinate system; the third coordinate system is a map coordinate system with the robot as a center;
acquiring a variance updating value of each grid in the environment map, and calculating a spatial covariance matrix of the corresponding grid at the previous moment according to the variance updating value;
based on the map common reference coordinate system, calculating a spatial covariance matrix of the corresponding grid at the current moment according to the spatial covariance matrix of each grid at the previous moment;
calculating Gaussian distribution among a spatial covariance matrix of each grid at the current moment, a second coordinate system corresponding to the previous moment and a second coordinate system corresponding to the current moment to obtain uncertain estimation of the robot motion;
and calculating an updated value of the covariance matrix of each grid according to the uncertain estimation, and determining a final updated value of the variance of each grid according to the updated value of the covariance matrix of each grid.
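The covariance update in the incremental steps above can be sketched as standard first-order propagation. The function name and shape conventions below are hypothetical; a full implementation would also perform the frame re-alignment into the map common reference coordinate system described earlier:

```python
import numpy as np

def propagate_cell_covariance(P_prev, R_motion, Q_motion):
    """First-order propagation of one grid cell's spatial covariance.

    P_prev   : 3x3 spatial covariance of the cell at the previous moment
    R_motion : 3x3 rotation of the robot between the two moments
    Q_motion : 3x3 covariance of the Gaussian motion-uncertainty estimate

    Returns the updated 3x3 covariance in the common map reference frame
    and the cell's final variance update value (its height component).
    """
    P_now = R_motion @ P_prev @ R_motion.T + Q_motion
    return P_now, P_now[2, 2]
```

Under pure translation (R_motion is the identity) the motion uncertainty simply inflates every cell's variance, matching the intuition that the map becomes less certain as the robot moves.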
As an improvement of the above, the obstacle passable detection result includes: a detection result that an inclined obstacle can pass through and a detection result that a vertical obstacle can pass through;
the method for detecting the barrier in the set range with the robot as the center in a passable manner according to the updated environment map to obtain a passable barrier detection result comprises the following steps:
acquiring a height measurement value within a set range centered on the robot;
calculating the slope value of the set range according to the height measurement value in the set range;
determining a passable detection result of the inclined barrier of the grid where the robot is located according to the slope value and a preset slope threshold;
calculating the maximum height difference between all grids in the footprint model coverage area and the central grid of the footprint model coverage area according to a preset footprint model of the robot;
and determining a detection result that the vertical barrier of the grid where the robot is located can pass through according to the maximum height difference value and a preset critical height value.
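The two quantities above (slope value and maximum height difference) can be roughly sketched for a square patch of the elevation grid centered on the robot. The function and its argument names are illustrative assumptions, and the central-cell comparison is only a stand-in for the full footprint-model coverage area:

```python
import numpy as np

def slope_and_step(height_patch, cell_size):
    """Slope (rad) and max height step for a square patch of an elevation map.

    height_patch : 2-D array of cell heights centred on the robot's cell
    cell_size    : edge length of one grid cell in metres
    """
    # The gradient of the height field gives the local surface slope.
    gy, gx = np.gradient(height_patch, cell_size)
    slope = np.arctan(np.sqrt(gx ** 2 + gy ** 2)).max()
    # Max height difference between any cell and the central cell
    # (standing in for the footprint-model comparison in the text).
    c = height_patch[height_patch.shape[0] // 2, height_patch.shape[1] // 2]
    max_step = np.abs(height_patch - c).max()
    return slope, max_step
```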
As an improvement of the above, the method further comprises:
comparing the slope value with a preset slope threshold value;
determining that a tilted obstacle of a grid on which the robot is located cannot pass through if the slope value is larger than the slope threshold value;
and determining that the inclined obstacle of the grid where the robot is located can pass through when the slope value is smaller than or equal to the slope threshold value.
As an improvement of the above, the method further comprises:
comparing the maximum height difference value with a preset critical height value;
determining that a vertical obstacle of a grid on which the robot is located cannot pass through if the maximum height difference value is larger than the critical height value;
and determining that the vertical obstacle of the grid where the robot is located can pass through if the maximum height difference value is smaller than or equal to the critical height value.
As an improvement of the above, the generating a local environment reachability map based on the obstacle passable detection result and the updated environment map includes:
weighting and summing the detection result of the inclined barrier passable of any grid and the detection result of the vertical barrier passable of any grid in the updated environment map to obtain the passable result of any grid;
and marking the passable result of any grid into the updated environment map, and generating the local environment reachability map.
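A minimal sketch of the weighted summation of the two per-grid detection results. The threshold values and equal weights are assumptions; the patent does not fix them:

```python
SLOPE_THRESHOLD = 0.5       # rad, assumed value
CRITICAL_HEIGHT = 0.15      # m, assumed value
W_SLOPE, W_STEP = 0.5, 0.5  # weights of the two results, assumed

def traversability(slope, max_step):
    """Weighted sum of the inclined- and vertical-obstacle detection results.

    Each check yields 1.0 (passable) or 0.0 (impassable); the weighted sum
    is the grid's passability result, to be marked into the local map.
    """
    slope_ok = 1.0 if slope <= SLOPE_THRESHOLD else 0.0
    step_ok = 1.0 if max_step <= CRITICAL_HEIGHT else 0.0
    return W_SLOPE * slope_ok + W_STEP * step_ok
```

A grid scoring 1.0 would receive the obstacle-passable mark, 0.0 the impassable mark, and intermediate values would be left to the navigation layer's policy.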
In a second aspect, an embodiment of the present invention provides an environment map updating system for a robot, including:
the environment measurement data acquisition module is used for acquiring the sensed environment measurement data;
the pose data acquisition module is used for acquiring pose data of the robot;
the local environment sensing module is used for updating a pre-stored environment map according to the environment measurement data and the pose data;
the obstacle detection module is used for carrying out passable detection on obstacles in a set range taking the robot as a center according to the updated environment map to obtain a passable obstacle detection result;
the reachability estimation module is used for generating a local environment reachability map according to the barrier passable detection result and the environment map; wherein the local environment accessibility map has an obstacle passable mark and an obstacle impassable mark.
In a third aspect, an embodiment of the present invention provides an environment map updating apparatus for a robot, including a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, where the processor implements the environment map updating method for a robot according to any one of the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where the computer-readable storage medium includes a stored computer program, where when the computer program runs, the apparatus in the computer-readable storage medium is controlled to execute the method for updating an environment map of a robot according to any one of the first aspect.
Compared with the prior art, the embodiment of the invention has the beneficial effects that: the robot environment map updating method comprises the following steps: acquiring sensed environmental measurement data; acquiring pose data of the robot; updating a pre-stored environment map according to the environment measurement data and the pose data; carrying out passable detection on the obstacles in the set range taking the robot as the center according to the updated environment map to obtain a passable obstacle detection result; generating a local environment reachability map according to the barrier passable detection result and the updated environment map; wherein the local environment accessibility map has an obstacle passable mark and an obstacle impassable mark. The method comprises the steps of performing local updating on an environment map within a set range with a robot as a center based on environment measurement data and pose data of the robot, performing barrier passable detection on the updated environment map, marking a barrier passable mark and a barrier impassable mark in the environment map, generating a local environment reachability map, effectively avoiding the problem that the robot loses terrain and barrier details in the process of sensing the environment, and improving the navigation safety of the robot.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of an environment map updating method for a robot according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating an environment map updating process according to a first embodiment of the present invention;
FIG. 3 is a schematic diagram of a pre-processing flow of environment measurement data according to a first embodiment of the present invention;
fig. 4 is a schematic diagram of an environment map updating system of a robot according to a second embodiment of the present invention;
fig. 5 is a schematic diagram of an environment map updating apparatus of a robot according to a third embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Please refer to fig. 1, which is a flowchart illustrating an environment map updating method for a robot according to a first embodiment of the present invention, the method includes:
s1: acquiring sensed environmental measurement data;
s2: acquiring pose data of the robot;
s3: updating a pre-stored environment map according to the environment measurement data and the pose data;
s4: carrying out passable detection on the obstacles in the set range taking the robot as the center according to the updated environment map to obtain a passable obstacle detection result;
s5: generating a local environment reachability map according to the barrier passable detection result and the updated environment map; wherein the local environment accessibility map has an obstacle passable mark and an obstacle impassable mark.
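The five steps can be sketched as one update pass over a toy dict-based elevation map. Every name here is hypothetical, and the real method uses the Kalman-filtered elevation grid and the slope check described later rather than this bare height-step test:

```python
def update_environment_map(measurements, pose, env_map, height_thr=0.15):
    """One pass of steps S1-S5 over a toy dict-of-cells elevation map.

    measurements : {cell: height} sensed this cycle (S1)
    pose         : robot cell index (S2); detection is centred on it
    env_map      : {cell: height} persistent map, updated in place (S3)
    Returns {cell: 'passable' | 'impassable'}, the reachability marks (S5).
    """
    env_map.update(measurements)          # S3: fold new measurements in
    centre_h = env_map.get(pose, 0.0)
    marks = {}
    for cell, h in env_map.items():       # S4: passability detection
        step = abs(h - centre_h)
        marks[cell] = "passable" if step <= height_thr else "impassable"
    return marks                          # S5: local reachability marks
```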
In the embodiment of the present invention, the environment map may adopt an elevation grid map. The environment measurement data includes data sensed by a distance sensor provided on the robot, such as distance information sensed by ultrasound, or point cloud information captured by a single laser radar, monocular vision or stereoscopic vision. The pose data includes mileage data detected by odometers provided on the robot, such as a wheel odometer, an inertial measurement unit or a visual/laser odometer, as well as position information detected by GPS and the like.
In the embodiment of the invention, the environment map is first locally updated within a set range centered on the robot, based on the environment measurement data and the pose data. Obstacle-passability detection is then performed on the updated map, and obstacle-passable and obstacle-impassable marks are written into it. By extending the environment map with both the measurement data and the pose data and generating a local environment reachability map, the loss of terrain and obstacle detail during environment sensing is effectively avoided, and the navigation safety of the robot is improved.
In an optional embodiment, the updating the pre-stored environment map according to the environment measurement data and the pose data includes:
locally updating the environment map according to the environment measurement data, wherein the environment map is an elevation grid map;
and performing incremental updating on the environment map after local updating according to the pose data to obtain an extended elevation grid map.
In the embodiment of the invention, as shown in fig. 2, the updating of the environment map includes two parts: the first part is the update based on environment measurement data, and the second part is the incremental update based on robot pose data. The updated environment map thus presents more comprehensive terrain and obstacle details, improving the robot's perception and representation of terrain and obstacles in a complex ground environment.
In an optional embodiment, before the locally updating the environment map according to the environment measurement data, the method further includes:
performing point cloud conversion on the environment measurement data to obtain point cloud data;
filtering the point cloud data;
and performing down-sampling processing on the filtered point cloud data.
The noise and overly dense point clouds produced by the distance sensor during measurement are redundant for sensing local terrain and obstacles and would increase the computational burden. Before the environment map is updated with the environment measurement data, the sensed data therefore needs to be preprocessed into suitable point cloud information, effectively reducing that burden. The environment measurement data preprocessing process comprises the following steps:
as shown in fig. 3, first, the environment measurement data obtained from the distance sensor is subjected to a preliminary point cloud conversion to obtain original point cloud data, and the original point cloud data is subjected to statistical filtering, that is, a neighborhood of each point in the original point cloud data is subjected to a statistical analysis, and some points which do not meet a preset standard range are trimmed off.
After noise points and outliers in the original point cloud data are removed through statistical filtering, the current point cloud data are subjected to down-sampling, the point cloud distribution density in a unit area is reduced, and the real-time processing efficiency of the subsequent local elevation map construction is guaranteed.
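The two preprocessing stages can be sketched in plain NumPy. This is a brute-force O(n²) neighbour search, so only a toy illustration; production code would use a KD-tree, and the parameter defaults are assumptions:

```python
import numpy as np

def statistical_filter(points, k=8, std_ratio=2.0):
    """Statistical filtering: drop points whose mean distance to their k
    nearest neighbours exceeds mean + std_ratio * std over the cloud."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    knn = np.sort(d, axis=1)[:, 1:k + 1]   # skip each point's self-distance
    mean_d = knn.mean(axis=1)
    keep = mean_d <= mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]

def voxel_downsample(points, voxel=0.05):
    """Down-sampling: keep one centroid per voxel to thin dense regions."""
    keys = np.floor(points / voxel).astype(int)
    cells = {}
    for key, p in zip(map(tuple, keys), points):
        cells.setdefault(key, []).append(p)
    return np.array([np.mean(v, axis=0) for v in cells.values()])
```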
In an optional embodiment, the locally updating the environment map according to the environment measurement data includes:
acquiring the distance from the distance sensor to any measuring point in a real scene according to the environment measuring data; wherein the distance sensor is to sense the environmental measurement data;
performing coordinate conversion on the obtained distance to obtain a height measurement value corresponding to each measurement point in a first coordinate system; wherein the first coordinate system is a coordinate system centered on the distance sensor;
mapping the height measurement value to a corresponding height scalar measurement value in the environment map according to a preset projection matrix;
calculating a variance corresponding to the height measurement value according to the height measurement value and a preset coordinate transformation matrix;
fusing the height measurement value and the corresponding variance with a height scalar measurement value in the environment map and the corresponding variance to obtain a fused height measurement value and a fused variance;
calculating a mahalanobis distance according to the height measurement value and the corresponding height scalar measurement value;
determining a final height update value for each cell grid in the environment map based on the mahalanobis distance, the fused height measurement value, the height measurement value, and the height scalar measurement value;
and determining a final variance updating value of each unit grid in the environment map according to the Mahalanobis distance, the fusion variance, the variance corresponding to the height measurement value and the variance corresponding to the height scalar measurement value.
If multiple height measurements at different heights fall within the same cell grid (for example, when a vertical obstacle is detected), the Mahalanobis distance is used to constrain the updates of the Kalman filter: if the Mahalanobis distance between the current estimate and a new measurement is below a threshold, the two observations are considered to lie at the same height. The Mahalanobis-distance rule fuses the height measurement at the highest altitude and discards height measurements more than a certain distance below the current estimate, so that a reliable and most representative height value is obtained from all the measurements in a grid area.
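A 1-D sketch of this gated fusion for a single cell. The gate threshold and function names are assumptions; the real filter works on the full elevation grid and also propagates the fused variance described above:

```python
import numpy as np

def fuse_height(h_est, var_est, h_meas, var_meas, maha_thr=2.0):
    """Mahalanobis-gated 1-D Kalman fusion of a height measurement.

    - within the gate: estimate and measurement are at the same height
      level, so fuse them (variance-weighted);
    - above the gate and higher: adopt the new, higher measurement;
    - above the gate and lower: discard the measurement.
    """
    maha = abs(h_meas - h_est) / np.sqrt(var_est + var_meas)
    if maha <= maha_thr:                    # same height level: fuse
        k = var_est / (var_est + var_meas)  # scalar Kalman gain
        return h_est + k * (h_meas - h_est), (1 - k) * var_est
    if h_meas > h_est:                      # keep the highest level
        return h_meas, var_meas
    return h_est, var_est                   # drop the lower measurement
```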
In an optional embodiment, the calculating, according to the height measurement value and a preset coordinate transformation matrix, a variance corresponding to the height measurement value includes:
respectively calculating the Jacobian of the distance sensor and the rotational Jacobian of the first coordinate system according to the height measurement value and a preset coordinate transformation matrix; wherein the first coordinate system and a second coordinate system are related by the coordinate transformation matrix, and the second coordinate system is a coordinate system centered on the robot;
and calculating the variance corresponding to the height measurement value according to the Jacobian of the distance sensor and the rotating Jacobian of the first coordinate system.
The measurements of the distance sensor and of the robot's own pose sensors (IMU or odometer) introduce noise under practical conditions, which distorts the measured values; the final result then deviates from the actual environment and degrades the accuracy of the subsequent reachability estimation. Therefore, the measurement uncertainty of the robot is estimated and updated by solving the Jacobian matrices and deriving the variance propagation process, addressing the uncertainty of environmental measurement and of self-motion estimation that commonly arises on a robot with proprioceptive state estimation. At the same time, because the embodiment of the invention maintains an elevation map in a coordinate system centered on the robot, the process of integrating new measurements into the map is affected only by range-sensor noise and by the uncertainty of the observable roll and pitch angles.
In the embodiment of the present invention, in order to describe the robot-centric environment sensing and map updating process more precisely, the following coordinate systems are introduced: a first coordinate system S, a second coordinate system B (a coordinate system centered on the robot), a third coordinate system M (a map coordinate system centered on the robot), and an inertial coordinate system I. The inertial coordinate system I is fixed in the environment, and the actual environment is assumed to be stationary with respect to it. The second coordinate system B and the first coordinate system S are fixed at the center position of the robot and at the center position of the distance sensor, respectively. A known transformation (γBS, ΦBS) is assumed between the second coordinate system B and the first coordinate system S, where γBS represents the translation vector of the first coordinate system S relative to the second coordinate system B in three-dimensional space, and ΦBS represents the rotation of the distance sensor coordinate system S relative to the body coordinate system B in three-dimensional space, with ΦBS ∈ SO(3), the three-dimensional rotation group.
Based on the introduced first coordinate system S, second coordinate system B, map coordinate system M and inertial coordinate system I, the environment measurement data are preprocessed and then mapped through coordinate transformation into a local environment map centered on the robot. Specifically, each height measurement is written into the corresponding (x, y) grid of the environment map, expanding the two-dimensional grid map into a 2.5-dimensional elevation grid map carrying height information. This yields an efficient and compact representation of three-dimensional complex structures, and addresses both the loss of terrain and obstacle detail that occurs when a two-dimensional grid map is used for a ground mobile robot, and the excessive redundancy of environment information and excessive computational complexity during traversal of a three-dimensional occupancy map. The map update process based on environmental measurement data, such as height measurements, is described in detail below:
In the map coordinate system M, a height measurement is approximated as a Gaussian probability distribution p̃ ~ N(p, σp²) with mean p and variance σp².
The distance from the distance sensor to a point P of the real scene, expressed in the first coordinate system as the translation vector SγSP, can be converted into the height measurement p by equation (1):

p = P(C(ΦSM)ᵀ · SγSP + MγSM)    (1)

where P = [0 0 1] is the projection matrix selecting the height component; MγSM represents the translation vector of the map coordinate system M relative to the first coordinate system S in three-dimensional space, and ΦSM represents the rotation of the map coordinate system M relative to the first coordinate system S, both taken in the transformation between the first coordinate system S and the map coordinate system M.
And the three-dimensional measurement point P is mapped, using the preset projection matrix P = [ 0 0 1 ], to the corresponding height scalar measurement value in the map coordinate system M, so that the height measurement is added to the environment map and the elevation grid map is constructed.
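The conversion from a sensor-frame point to a height scalar and a map grid cell can be illustrated as follows; the rotation convention, function names and grid layout are assumptions made for the sketch, not taken from the embodiment.

```python
import numpy as np

P = np.array([[0.0, 0.0, 1.0]])  # projection matrix: keeps only the z component

def height_measurement(r_SP, C_SM, r_MS):
    """Map a point measured in the sensor frame S to a height value in the
    map frame M: transform the point into M, then project with P.
    r_SP: point position in S; C_SM: rotation matrix taking S-frame vectors
    into M (illustrative convention); r_MS: sensor position in M."""
    p_M = C_SM @ r_SP + r_MS       # point expressed in the map frame
    return (P @ p_M).item()        # scalar height measurement

def cell_index(p_M_xy, origin_xy, resolution):
    """(x, y) grid cell the measurement falls into (assumed map layout)."""
    idx = np.floor((np.asarray(p_M_xy) - origin_xy) / resolution).astype(int)
    return tuple(idx)
```

For example, with an identity rotation and a sensor mounted 0.5 m above the map origin, a point measured 0.2 m below the sensor yields a height of 0.3 m.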
To obtain the variance of the height measurement, the Jacobian JS of the distance sensor and the rotational Jacobian JΦ of the first coordinate system S are further required, as in equation (2):

JS = P · C(ΦSM)ᵀ,    JΦ = −P · C(ΦIS)ᵀ · (SγSP)^×    (2)

where C(Φ) represents the rotation matrix corresponding to the rotation Φ, and the superscript × denotes the skew-symmetric cross-product matrix. In summary, the variance σp² can be expressed by equation (3):

σp² = JS ΣS JSᵀ + JΦ ΣΦ,SM JΦᵀ    (3)

where ΦIS represents the rotation of the first coordinate system S relative to the inertial coordinate system I in the transformation between the inertial coordinate system I and the first coordinate system S; ΣS is the covariance matrix of the distance sensor error model; and ΣΦ,SM is the covariance matrix representing the uncertainty of the rotation estimate of the distance sensor (a submatrix of the full rotation covariance ΣΦ). Because the map is defined with the map coordinate system M as its center, the position uncertainty of the distance sensor does not need to be considered here; owing to this definition and to the application of the projection matrix P, uncertainty of the yaw-angle measurement can likewise be excluded from the map representation. The height measurement and its variance (p̃, σp²) are fused with the height scalar measurement value and variance (ĥ, σh²) of the existing environment map by a one-dimensional Kalman filter, as shown in equation (4):

ĥ⁺ = (σp² ĥ⁻ + (σh⁻)² p̃) / (σp² + (σh⁻)²),    (σh⁺)² = (σh⁻)² σp² / ((σh⁻)² + σp²)    (4)
wherein the non-updated state is denoted by the superscript − and the updated state by the superscript +. If multiple height measurements at different heights are located within the same cell grid (e.g., when a vertical obstacle is detected), the Mahalanobis distance is used to constrain the Kalman filter updates: if the Mahalanobis distance between the estimate and the new measurement is below a threshold, the height measurements are considered to lie at the same height. Setting the threshold c = 1 means that a measurement is fused only when its distance from the estimate is less than one standard deviation, in which case it is incorporated into (ĥ⁺, (σh⁺)²). The final height update value and variance update value are given by equation (5):

(ĥ⁺, (σh⁺)²) = { Kalman fusion (4),  if M(p̃) < c;  (p̃, σp²),  if p̃ > ĥ⁻ and M(p̃) ≥ c;  (ĥ⁻, (σh⁻)²),  otherwise }    (5)

where M(p̃) represents the Mahalanobis distance function, which can be defined as equation (6):

M(p̃) = |p̃ − ĥ⁻| / σh⁻    (6)
The Mahalanobis-distance-based rule fuses the height measurements at the highest altitude and discards height measurements that lie more than a certain distance below the current estimate. This way of combining multiple height measurements applies equally when range measurements are taken continuously for the same area or when an area is re-traversed (e.g., when the robot rotates or moves backward).
In an optional embodiment, the incrementally updating, by the robot, the locally updated environment map according to the pose data to obtain an extended elevation grid map, including:
under the condition that the robot moves, superimposing, according to the pose data, the third coordinate system at the current moment onto the third coordinate system at the previous moment to obtain a map common reference coordinate system;
the pose data comprise relative translation vectors and rotation of the third coordinate system and the second coordinate system at the current moment, and translation vectors and rotation of the robot from the last moment to the current moment in the second coordinate system; the third coordinate system is a map coordinate system with the robot as a center;
acquiring a variance updating value of each grid in the environment map, and calculating a spatial covariance matrix of the corresponding grid at the previous moment according to the variance updating value;
based on the map common reference coordinate system, calculating a spatial covariance matrix of the corresponding grid at the current moment according to the spatial covariance matrix of each grid at the previous moment;
calculating Gaussian distribution among a spatial covariance matrix of each grid at the current moment, a second coordinate system corresponding to the previous moment and a second coordinate system corresponding to the current moment to obtain uncertain estimation of the robot motion;
and calculating an updated value of the covariance matrix of each grid according to the uncertain estimation, and determining a final updated value of the variance of each grid according to the updated value of the covariance matrix of each grid.
wherein the spatial covariance matrix of each grid propagates from the previous moment to the current moment according to a Gaussian distribution;
and the covariance matrix of each grid in the environment map is calculated to obtain the extended elevation grid map.
In an embodiment of the present invention, since the map coordinate system M is defined with respect to the pose of the distance sensor/robot, the elevation grid map needs to be updated each time the robot moves with respect to the inertial coordinate system I. Since the real environment is stationary relative to the inertial frame I, the map must be re-estimated while the map frame M moves; at that time the height update value ĥ and the variance update value σh² need to be updated according to the change in the pose of the robot.
Ideally, the mean and variance of each grid could be updated based on the uncertainty of the robot motion and the estimated values of the surrounding grids, but performing such an update on every grid cell in the map is computationally expensive. Therefore, the spatial covariance matrix ΣP,i of the real environment information mapped into each grid i is used to extend the structure of the elevation grid map.
In a map coordinate system M defined with respect to a known transformation of the second coordinate system B, the distance sensor mounted on the robot can accurately measure environmental information within its sensing range. Accordingly, when a grid i receives a measurement update, its spatial covariance matrix ΣP,i is set to

ΣP,i = diag(0, 0, σh,i²)    (7)

where the variance σh,i² of the height estimate is the variance update value obtained above. If grid i does not receive a new height measurement update, the spatial covariance matrix ΣP,i must instead be updated continuously according to the relative motion of the robot from its previous pose to its current pose.
The relative translation vector and rotation of the map coordinate system M and the second coordinate system B at time t are defined as (BγBM,t, ΦBM,t). Since the point P is stationary in the inertial frame, the translation vector M2γM2P of the point P in the map coordinate system M2 corresponding to time t2, relative to the map coordinate system M1 corresponding to time t1, can be obtained by chaining the vectors from M2 to B2, from B2 to B1, from B1 to M1 and from M1 to P:

M2γM2P = γM2B2 + γB2B1 + γB1M1 + γM1P    (8)

where γM2B2 denotes the translation vector from the map coordinate system M2 to the second coordinate system B2 corresponding to time t2; γB2B1 denotes the translation vector of the robot motion from the second coordinate system B2 corresponding to time t2 back to the second coordinate system B1 corresponding to time t1; γB1M1 denotes the translation vector from the second coordinate system B1 to the map coordinate system M1 corresponding to time t1; and γM1P denotes the translation vector of the point P in the map coordinate system corresponding to the previous time.

Equation (8) relates the position of the point P in the map coordinate system M2 corresponding to time t2 to its position in the map coordinate system M1 corresponding to time t1. Expressing each vector in a common frame by means of the rotations between the coordinate systems, equation (8) can be transformed into:

M2γM2P = M2γM2B2 + C(ΦM2B2)[B2γB2B1 + C(ΦB2B1)(B1γB1M1 + C(ΦB1M1) · M1γM1P)]    (9)

where ΦM2B2 denotes the rotation of the second coordinate system B2 relative to the map coordinate system M2 corresponding to time t2, and ΦB2B1 denotes the rotation of the robot from time t1 to time t2.

The pose of the map coordinate system M2 corresponding to time t2 is selected so that its rotation and translation relative to B2 equal those of the map coordinate system M1 corresponding to time t1 relative to B1:

ΦM2B2 = ΦM1B1,    M2γM2B2 = M1γM1B1    (10)

With this choice, the map coordinate system M1 corresponding to time t1 and the map coordinate system M2 corresponding to time t2 coincide and can be unified into a map common reference coordinate system M without moving any data in the map. The motion of the robot from moment 1 to moment 2 follows a Gaussian distribution, and the variance of the point position MγMP is propagated as follows:

ΣP,2 = JP ΣP,1 JPᵀ + Jγ Σγ Jγᵀ + JΦ ΣΦ JΦᵀ    (12)
At time t1, the covariance matrix ΣP,1 is either the initialized diagonal form of equation (7) or the result of a previous update. The covariance matrices Σγ and ΣΦ represent the uncertainty estimate of the motion between the second coordinate system B2 corresponding to time t2 and the second coordinate system B1 corresponding to time t1, expressed as equation (13):

B1γ̃B1B2 ~ N(B1γ̂B1B2, Σγ),    Φ̃B1B2 ~ N(Φ̂B1B2, ΣΦ)    (13)

The uncertainty estimate of equation (13) states that the relative translation vector B1γB1B2 and the rotation ΦB1B2 between the second coordinate system B1 at time t1 and the second coordinate system B2 at time t2 follow Gaussian distributions with means B1γ̂B1B2 and Φ̂B1B2 and variances Σγ and ΣΦ, respectively.
The Jacobian matrices take the form of equation (14):

JP = C(Φ̂B1B2)ᵀ,    Jγ = −C(Φ̂B1B2)ᵀ,    JΦ = −C(Φ̂B1B2)ᵀ (M1γM1P − B1γ̂B1B2)^×    (14)

where C represents the mapping of a rotation parameterized by Euler angles or a unit quaternion to its R^{3×3} matrix representation, and the superscript × denotes the skew-symmetric matrix of the Euclidean cross product. A Jacobian matrix is the matrix of first-order partial derivatives arranged in a fixed manner; its determinant is called the Jacobian determinant. Its significance is that it provides the optimal linear approximation of a differentiable mapping at a given point.
In order to propagate the pose uncertainty of the robot from time t to time t+1, the spatial covariance matrix of the point Pi of each grid i in the map, relative to the map coordinate system M, is evaluated as equation (15):

ΣP,i,t+1 = JP ΣP,i,t JPᵀ + Jγ Σγ Jγᵀ + JΦ ΣΦ JΦᵀ    (15)
when the robot is in motion, though the whole robot position covariance matrix ΣγAll of which propagate over the variance of the local elevation grid map, but in practice only from the sensor rotational covariance matrix ΣφThe variance of the yaw angle phi. The embodiment of the invention enables the position covariance of the distance sensor and the uncertainty of the yaw rotation of the distance sensor on the machine body to be excluded from the updating step in the measurement updating of the distance sensor by supplementing the measurement updating based on the distance sensor.
According to the above equations (7) and (15), the final variance update value is extracted from the spatial covariance matrix ΣP,i to update the variance of the corresponding grid of the environment map.
In an alternative embodiment, the obstacle passable detection result includes: a detection result that an inclined obstacle can pass through and a detection result that a vertical obstacle can pass through;
the method for detecting the barrier in the set range with the robot as the center in a passable manner according to the updated environment map to obtain a passable barrier detection result comprises the following steps:
acquiring a height measurement value within a set range centered on the robot;
calculating the slope value of the set range according to the height measurement value in the set range;
determining a passable detection result of the inclined barrier of the grid where the robot is located according to the slope value and a preset slope threshold;
calculating the maximum height difference between all grids in the footprint model coverage area and the central grid of the footprint model coverage area according to a preset footprint model of the robot;
and determining a detection result that the vertical barrier of the grid where the robot is located can pass through according to the maximum height difference value and a preset critical height value.
A vertical obstacle is generally an obstacle forming a particularly large angle with the passable ground. Such an obstacle is safe for the robot when its height is below the maximum obstacle-surmounting height of the mobile robot; however, when its height exceeds that maximum, the mobile robot will collide with it on contact. Therefore, in the embodiment of the invention, vertical obstacle detection is introduced so that, when the robot faces a complex terrain environment, passing efficiency is greatly improved on the premise of ensuring safe passage.
In an optional embodiment, the method further comprises:
comparing the slope value with a preset slope threshold value;
determining that a tilted obstacle of a grid on which the robot is located cannot pass through if the slope value is larger than the slope threshold value;
and determining that the inclined obstacle of the grid where the robot is located can pass through when the slope value is smaller than or equal to the slope threshold value.
In the embodiment of the invention, the extended elevation grid map obtained after updating based on the environment measurement data and the pose data is evaluated over a circular area centered on the robot with a set radius; the circular area fits the circumscribed circle that takes the robot center point as its center and covers the robot body, so that the diameter of the circular area corresponds to the diameter of the circumscribed circle of the robot. For each grid of the extended elevation grid map, a slope value K may be defined to describe and determine the inclined obstacles around the current grid. The slope value K of each grid in the local elevation map is calculated as follows:
extracting the height measurements ĥ of all grids within the circular area, and calculating from these height measurements the normal vector of the circular area, which is taken as the normal vector of the grid at which the circle center of the circular area is located;
calculating a corresponding normal vector for each grid in the elevation grid map in a convolution sliding window mode;
comparing the normal vector direction in each grid with a Z axis perpendicular to the motion plane of the robot in an inertial coordinate system I and calculating a slope value K of the grid where the robot is located; taking a grid where the center point of the robot is located as the grid where the robot is located;
calculating the normal vector of each grid takes the height conditions of a certain surrounding range into account, so that the true degree of inclination of the current grid is recovered;
wherein the slope value K ranges within [0, π/2], π being the circular constant. Because the obstacle-crossing performance of different robots differs greatly, the embodiment of the invention sets an allowable slope threshold Kthreshold: when the slope value K of a grid exceeds Kthreshold, the grid is impassable; otherwise it is passable. The inclined-obstacle passability detection result of the i-th grid is

Tslope,i = { 1, if Ki ≤ Kthreshold;  0, if Ki > Kthreshold }
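As an illustration of the slope computation, the sketch below fits a plane to the heights within the window by least squares and takes the angle between the plane normal and the vertical axis as the slope value K; the plane-fitting approach and all names are assumptions made for the sketch.

```python
import numpy as np

def slope_value(heights, xs, ys):
    """Estimate the slope K of the centre grid from the heights of all
    grids in the surrounding window: fit a plane z = a*x + b*y + c by
    least squares; the plane normal is (-a, -b, 1), and K is its angle
    to the vertical z axis, so K lies in [0, pi/2]."""
    A = np.column_stack([xs, ys, np.ones(len(xs))])
    (a, b, _), *_ = np.linalg.lstsq(A, heights, rcond=None)
    normal = np.array([-a, -b, 1.0])
    normal /= np.linalg.norm(normal)
    return float(np.arccos(normal[2]))

def slope_passable(K, K_threshold):
    """1 if the inclined obstacle is passable, 0 otherwise."""
    return 1 if K <= K_threshold else 0
```

On flat terrain the slope is zero, while heights rising one cell per cell in x give the expected 45-degree slope.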
In an optional embodiment, the method further comprises:
comparing the maximum height difference value with a preset critical height value;
determining that a vertical obstacle of a grid on which the robot is located cannot pass through if the maximum height difference value is larger than the critical height value;
and determining that the vertical obstacle of the grid where the robot is located can pass through if the maximum height difference value is smaller than or equal to the critical height value.
Vertical-obstacle passability detection based on the elevation grid map can be computed in two steps. First, the local height differences are calculated for all grids within the footprint model coverage area preset for the robot. Then it is checked whether the maximum height difference between any grid in the area and the central grid of the area is greater than the critical height value Vthreshold; in addition, the slope value K represented by two separated grids must be larger than the slope threshold Kthreshold for a grid to be counted as an effective grid.
In an embodiment of the invention, the maximum height difference value uses the temporary vertical obstacle height of the central grid within the footprint model area. The temporary vertical obstacle height V of the central grid of the footprint model area is calculated as

V = Vmax, if Nst ≥ Nthreshold; otherwise no vertical obstacle is recorded for the central grid

where Vmax represents the maximum obstacle height within the footprint model coverage area; Nst represents the number of grids in the footprint model coverage area whose temporary obstacle height is greater than the critical height value Vthreshold; and Nthreshold represents the required number of effective grids whose temporary height exceeds the critical value. When the maximum height difference V is greater than the critical height value Vthreshold, the central grid is deemed impassable; otherwise it is passable. A small steep slope can thus be detected as an obstacle, and the calculation is robust to missing terrain information. The vertical-obstacle passability detection result of the i-th grid is

Tvert,i = { 1, if Vi ≤ Vthreshold;  0, if Vi > Vthreshold }
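A minimal sketch of the vertical-obstacle check, assuming the maximum height difference is taken directly against the centre grid of the footprint area (function and parameter names are assumptions):

```python
import numpy as np

def vertical_passable(footprint_heights, center_height, v_threshold):
    """1 if the vertical obstacle within the footprint area is passable:
    the maximum height difference between any grid in the footprint and
    the centre grid is compared against the critical height value."""
    v_max = float(np.max(np.abs(np.asarray(footprint_heights) - center_height)))
    return 1 if v_max <= v_threshold else 0
```

A few centimetres of unevenness stay passable, while a step above the critical height marks the cell impassable.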
In an optional embodiment, the generating a local environment reachability map according to the obstacle passable detection result and the updated environment map includes:
weighting and summing the detection result of the inclined barrier passable of any grid and the detection result of the vertical barrier passable of any grid in the updated environment map to obtain the passable result of any grid;
and marking the passable result of any grid into the updated environment map, and generating the local environment reachability map.
In the embodiment of the invention, the calculated inclined-obstacle passability detection result Tslope,i of the i-th grid and its vertical-obstacle passability detection result Tvert,i are weighted and summed to obtain the passability result of the i-th grid:

Ri = λ · Tslope,i + (1 − λ) · Tvert,i
wherein λ is the weight assigned to the inclined obstacle, (1 − λ) is the weight assigned to the vertical obstacle, and λ ∈ [0, 1]. When λ is larger, inclined obstacles carry a greater proportion of the overall passability and the assessment becomes more aggressive, so slopes of certain angles have some probability of being marked as passable; when λ is smaller, vertical obstacles carry a greater proportion and the assessment becomes more conservative, marking a cell impassable only when the obstacle is higher than the passable height of the mobile robot.
In the embodiment of the invention, the passable result of each grid is marked in the environment map, the local environment reachability map is generated, the robot can perform passable obstacle navigation through the local environment reachability map, the reachability estimation strategy of the robot using the elevation grid map navigation can be realized, and the safety and the efficiency of the robot navigation can be well ensured.
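The weighted combination and marking step can be sketched as follows; the 0.5 marking threshold and the function names are assumptions for illustration, not values from the embodiment.

```python
def traversability(t_slope, t_vertical, lam):
    """Weighted passability of a grid: lam weights the inclined-obstacle
    result, (1 - lam) the vertical-obstacle result; lam in [0, 1]."""
    assert 0.0 <= lam <= 1.0
    return lam * t_slope + (1.0 - lam) * t_vertical

def mark(grid_result, passable_threshold=0.5):
    """Illustrative marking rule (the threshold is an assumption): label
    the grid passable when the weighted result exceeds the threshold."""
    return "passable" if grid_result > passable_threshold else "impassable"
```

With λ = 0.7, a grid that passes the slope check but fails the vertical check is still marked passable, showing the aggressive behaviour of a large λ described above.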
Compared with the prior art, the embodiment of the invention has the beneficial effects that:
1. in the aspect of environment expression, a height measurement value is introduced into an original data structure of a two-dimensional grid map and is expanded into a compact and efficient 2.5-dimensional elevation grid map with height information to represent a complex structure in a three-dimensional environment, so that the problems of terrain and obstacle detail loss caused by the fact that the two-dimensional grid map is used for a ground mobile robot and the problems of excessive redundancy of environment expression information and excessive calculation complexity in traversal caused by the fact that only a three-dimensional occupied map is used are solved.
2. The uncertainty of the grid height measurement value is represented by a covariance matrix by considering a definition mode taking the map coordinate as the center, and the error propagation process of each grid in the local elevation grid map is deduced, so that the uncertainty of the position of a distance sensor is not considered, and the uncertainty problem of the measurement of the sensor in a digital elevation mapping model is solved.
3. Compared with the traditional occupation judgment detection method, the embodiment of the invention additionally adds two detection schemes of the passable detection of the inclined barrier and the passable detection of the vertical barrier in the reachability estimation part, so that the safety and the efficiency of the robot navigation can be well ensured.
Referring to fig. 4, an environment map updating system for a robot includes:
the environment measurement data acquisition module 1 is used for acquiring the sensed environment measurement data;
the pose data acquisition module 2 is used for acquiring pose data of the robot;
the local environment sensing module 3 is used for updating a pre-stored environment map according to the environment measurement data and the pose data;
the obstacle detection module 4 is used for detecting the obstacles in the set range with the robot as the center in a passable manner according to the updated environment map to obtain a passable obstacle detection result;
a reachability estimation module 5, configured to generate a local environment reachability map according to the obstacle passable detection result and the environment map; wherein the local environment accessibility map has an obstacle passable mark and an obstacle impassable mark.
In an alternative embodiment, the local context awareness module 3 comprises:
the first updating module is used for locally updating the environment map according to the environment measurement data, wherein the environment map is an elevation grid map;
and the second updating module is used for performing incremental updating on the locally updated environment map according to the pose data to obtain an extended elevation grid map.
In an alternative embodiment, the system comprises:
the point cloud conversion module is used for carrying out point cloud conversion on the environment measurement data to obtain point cloud data;
the filtering module is used for filtering the point cloud data;
and the down-sampling module is used for performing down-sampling processing on the filtered point cloud data.
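The filtering and down-sampling modules can be illustrated by a minimal range filter followed by voxel-grid down-sampling; the parameter values and the function name are assumptions for the sketch.

```python
import numpy as np

def preprocess(points, max_range=5.0, voxel=0.05):
    """Filter and down-sample a point cloud given as an N x 3 array:
    1) drop points beyond the sensor's reliable range;
    2) voxel-grid down-sampling: keep one point per occupied voxel."""
    pts = np.asarray(points, dtype=float)
    pts = pts[np.linalg.norm(pts, axis=1) <= max_range]      # range filter
    keys = np.floor(pts / voxel).astype(int)                 # voxel index per point
    _, first = np.unique(keys, axis=0, return_index=True)    # one point per voxel
    return pts[np.sort(first)]
```

Two near-duplicate points collapse into one voxel and an out-of-range return is discarded, reducing the data volume before the map update.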
In an alternative embodiment, the first update module comprises:
the distance calculation unit is used for acquiring the distance from the distance sensor to any measuring point in a real scene according to the environment measuring data; wherein the distance sensor is to sense the environmental measurement data;
the height measurement value calculation unit is used for carrying out coordinate conversion on the acquired distance to obtain a height measurement value corresponding to each measurement point in a first coordinate system; wherein the first coordinate system is a coordinate system centered on the distance sensor;
the mapping unit is used for mapping the height measurement value to a corresponding height scalar measurement value in the environment map according to a preset projection matrix;
the first variance calculating unit is used for calculating the variance corresponding to the height measurement value according to the height measurement value and a preset coordinate transformation matrix;
the data fusion unit is used for fusing the height measurement value and the corresponding variance with the height scalar measurement value and the corresponding variance in the environment map to obtain a fusion height measurement value and a fusion variance;
the Mahalanobis distance calculating unit is used for calculating the Mahalanobis distance according to the height measurement value and the corresponding height scalar measurement value;
a height value updating unit, configured to determine a final height updating value of each cell grid in the environment map according to the mahalanobis distance, the fused height measurement value, the height measurement value, and the height scalar measurement value;
and the variance updating unit is used for determining a final variance updating value of each unit grid in the environment map according to the Mahalanobis distance, the fusion variance, the variance corresponding to the height measurement value and the variance corresponding to the height scalar measurement value.
In an alternative embodiment, the variance calculation unit includes:
the Jacobian calculating subunit is used for respectively calculating the Jacobian of the distance sensor and the rotational Jacobian of the first coordinate system according to the height measurement value and a preset coordinate transformation matrix; the first coordinate system and the second coordinate system are related by the coordinate transformation matrix, and the second coordinate system is a coordinate system centered on the robot;
and the variance calculating subunit is used for calculating the variance corresponding to the height measurement value according to the Jacobian of the distance sensor and the rotating Jacobian of the first coordinate system.
In an alternative embodiment, the second update module comprises:
the map coordinate system superimposing unit is used for superimposing the third coordinate system at the current moment onto the third coordinate system at the previous moment according to the pose data under the condition that the robot moves, to obtain a map common reference coordinate system;
the pose data comprise the relative translation vector and rotation between the third coordinate system and the second coordinate system at the current moment, and the translation vector and rotation of the robot from the previous moment to the current moment in the second coordinate system; the third coordinate system is a map coordinate system with the robot as a center;
the first covariance matrix calculation unit is used for acquiring a variance update value of each grid in the environment map and calculating a spatial covariance matrix of the corresponding grid at the previous moment according to the variance update value;
the second covariance matrix calculation unit is used for calculating a spatial covariance matrix of each grid at the current moment according to the spatial covariance matrix of each grid at the previous moment based on the map common reference coordinate system;
the uncertainty estimation unit is used for calculating the Gaussian distribution among the spatial covariance matrix of each grid at the current moment, the second coordinate system corresponding to the previous moment, and the second coordinate system corresponding to the current moment to obtain an uncertainty estimate of the robot motion;
and the second variance updating unit is used for calculating an updated value of the covariance matrix of each grid according to the uncertainty estimate and determining a final variance update value of each grid according to the updated value of the covariance matrix of each grid.
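The incremental update above can be read as rotating each cell's spatial covariance into the common reference frame and inflating it with the motion uncertainty estimate. A hedged sketch of one cell's update (the additive form of the motion noise is an assumption on our part):

```python
import numpy as np

def propagate_cell_cov(cell_cov, rot, motion_cov):
    """Propagate one grid cell's 3x3 spatial covariance through a motion.

    cell_cov   : covariance of the cell at the previous moment
    rot        : rotation taking the previous robot frame to the current one
    motion_cov : uncertainty estimate of the robot motion (assumed additive)
    """
    # Express the old covariance in the current frame, then add motion noise.
    return rot @ cell_cov @ rot.T + motion_cov
```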
In an alternative embodiment, the obstacle passability detection result includes: an inclined-obstacle passability detection result and a vertical-obstacle passability detection result;
the obstacle detection module 4 includes:
a height value acquisition unit for acquiring a height measurement value within a set range centered on the robot;
the slope calculation unit is used for calculating the slope value of the set range according to the height measurement value in the set range;
the detection result determining unit is used for determining the inclined-obstacle passability detection result of the grid where the robot is located according to the slope value and a preset slope threshold;
the height difference calculating unit is used for calculating the maximum height difference between all grids in the footprint model coverage area and the central grid of the footprint model coverage area according to a preset footprint model of the robot;
and the vertical-obstacle passability detection result determining unit is used for determining the vertical-obstacle passability detection result of the grid where the robot is located according to the maximum height difference value and a preset critical height value.
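A compact way to picture the two traversability tests is a function that, given the height patch under the robot's footprint, returns both the slope check and the step-height check. The finite-difference slope estimate, the square footprint, and all thresholds are illustrative assumptions:

```python
import numpy as np

def traversability(heights, cell_size, slope_max, step_max):
    """Classify a square height patch around the robot.

    heights   : (k x k) array of cell heights under the footprint model
    cell_size : grid resolution in metres
    slope_max : maximum traversable slope (rise over run)
    step_max  : maximum traversable step height in metres
    Returns (slope_ok, step_ok) booleans.
    """
    # Slope magnitude from finite differences across the patch
    gy, gx = np.gradient(heights, cell_size)
    slope = float(np.max(np.hypot(gx, gy)))
    # Largest height difference between any cell and the centre cell
    centre = heights[heights.shape[0] // 2, heights.shape[1] // 2]
    max_step = float(np.max(np.abs(heights - centre)))
    return slope <= slope_max, max_step <= step_max
```

A flat patch passes both tests; a ramp steeper than `slope_max` fails only the slope test.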
In an alternative embodiment, the system further comprises:
the first comparison module is used for comparing the slope value with a preset slope threshold; when the slope value is larger than the slope threshold, determining that the inclined obstacle of the grid where the robot is located is impassable; and when the slope value is smaller than or equal to the slope threshold, determining that the inclined obstacle of the grid where the robot is located is passable.
In an alternative embodiment, the system further comprises:
the second comparison module is used for comparing the maximum height difference value with a preset critical height value; when the maximum height difference value is larger than the critical height value, determining that the vertical obstacle of the grid where the robot is located is impassable; and when the maximum height difference value is smaller than or equal to the critical height value, determining that the vertical obstacle of the grid where the robot is located is passable.
In an alternative embodiment, the reachability estimation module 5 includes:
the weighted summation unit is used for carrying out weighted summation on the inclined-obstacle passability detection result and the vertical-obstacle passability detection result of any grid in the updated environment map to obtain the passability result of that grid;
and the marking unit is used for marking the passability result of each grid into the updated environment map to generate the local environment reachability map.
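The weighted summation and marking can be sketched as follows. Equal weights and a unit decision threshold are illustrative assumptions; with these values a grid receives the passable mark only when both tests passed:

```python
import numpy as np

def reachability_map(slope_ok, step_ok, w_slope=0.5, w_step=0.5, thresh=1.0):
    """Combine the two per-grid passability results into one marking.

    slope_ok, step_ok : boolean arrays, True where that test passed
    Weights and the decision threshold are illustrative assumptions.
    Returns an int array: 1 = obstacle-passable mark, 0 = impassable mark.
    """
    score = w_slope * slope_ok.astype(float) + w_step * step_ok.astype(float)
    return (score >= thresh).astype(int)
```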
It should be noted that the principle and technical effect of the robot environment map updating system according to the embodiment of the present invention are the same as those of the robot environment map updating method according to the first embodiment, and therefore detailed description thereof is omitted.
Referring to fig. 5, a diagram of an environment map updating apparatus for a robot according to a fifth embodiment of the present invention is shown. As shown in fig. 5, the environment map updating apparatus of the robot includes: at least one processor 11, such as a CPU, at least one network interface 14 or other user interface 13, a memory 15, and at least one communication bus 12, the communication bus 12 being used to enable connection and communication among these components. The user interface 13 may optionally include a USB interface and other standard wired interfaces. The network interface 14 may optionally include a Wi-Fi interface and other wireless interfaces. The memory 15 may comprise a high-speed RAM memory, and may also include a non-volatile memory, such as at least one disk memory. The memory 15 may optionally comprise at least one storage device located remotely from the aforementioned processor 11.
In some embodiments, memory 15 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof:
an operating system 151, which contains various system programs for implementing various basic services and for processing hardware-based tasks;
and a program 152.
Specifically, the processor 11 is configured to call the program 152 stored in the memory 15 to execute the robot environment map updating method according to the foregoing embodiment, for example, step S1 shown in fig. 1. Alternatively, the processor, when executing the computer program, implements the functions of the modules/units in the above-mentioned device embodiments, such as the environment measurement data acquiring module.
Illustratively, the computer program may be partitioned into one or more modules/units that are stored in the memory and executed by the processor to implement the invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program in the environment map updating apparatus of the robot.
The environment map updating device of the robot can be a desktop computer, a notebook, a palm computer, a cloud server and other computing devices. The environment map updating device of the robot can comprise, but is not limited to, a processor and a memory. It will be appreciated by those skilled in the art that the schematic diagram is merely an example of the environment map updating apparatus of the robot and does not constitute a limitation of the environment map updating apparatus of the robot, and may include more or less components than those shown, or combine some components, or different components.
The Processor 11 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or any conventional processor. The processor 11 is the control center of the environment map updating apparatus of the robot, and uses various interfaces and lines to connect the various parts of the entire apparatus.
The memory 15 may be used to store the computer programs and/or modules, and the processor 11 implements the various functions of the environment map updating apparatus of the robot by running or executing the computer programs and/or modules stored in the memory and calling data stored in the memory. The memory 15 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the device, and the like. Further, the memory 15 may include a high-speed random access memory, and may also include a non-volatile memory, such as a hard disk, an internal memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), at least one magnetic disk storage device, a Flash memory device, or other non-volatile solid-state storage device.
The integrated modules/units of the robot environment map updating apparatus, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and which, when executed by a processor, implements the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals, in accordance with legislation and patent practice.
A sixth embodiment of the present invention provides a computer-readable storage medium, which includes a stored computer program, where the computer program, when running, controls an apparatus in which the computer-readable storage medium is located to execute the method for updating an environment map of a robot according to any one of the first embodiments.
It should be noted that the above-described device embodiments are merely illustrative, where the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. In addition, in the drawings of the embodiment of the apparatus provided by the present invention, the connection relationship between the modules indicates that there is a communication connection between them, and may be specifically implemented as one or more communication buses or signal lines. One of ordinary skill in the art can understand and implement it without inventive effort.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention.

Claims (13)

1. An environment map updating method for a robot, comprising:
acquiring sensed environmental measurement data;
acquiring pose data of the robot;
updating a pre-stored environment map according to the environment measurement data and the pose data;
performing passability detection on obstacles in a set range centered on the robot according to the updated environment map to obtain an obstacle passability detection result;
and generating a local environment reachability map according to the obstacle passability detection result and the updated environment map; wherein the local environment reachability map has an obstacle-passable mark and an obstacle-impassable mark.
2. The robot environment map updating method according to claim 1, wherein updating a pre-stored environment map based on the environment measurement data and the pose data comprises:
locally updating the environment map according to the environment measurement data, wherein the environment map is an elevation grid map;
and performing incremental updating on the environment map after local updating according to the pose data to obtain an extended elevation grid map.
3. The robot environment map updating method according to claim 2, wherein before the local update of the environment map based on the environment measurement data, the method further comprises:
performing point cloud conversion on the environment measurement data to obtain point cloud data;
filtering the point cloud data;
and performing down-sampling processing on the filtered point cloud data.
4. The robot environment map updating method according to claim 2, wherein the locally updating the environment map based on the environment measurement data includes:
acquiring the distance from the distance sensor to any measuring point in a real scene according to the environment measuring data; wherein the distance sensor is to sense the environmental measurement data;
performing coordinate conversion on the obtained distance to obtain a height measurement value corresponding to each measurement point in a first coordinate system; wherein the first coordinate system is a coordinate system centered on the distance sensor;
mapping the height measurement value to a corresponding height scalar measurement value in the environment map according to a preset projection matrix;
calculating a variance corresponding to the height measurement value according to the height measurement value and a preset coordinate transformation matrix;
fusing the height measurement value and the corresponding variance with a height scalar measurement value in the environment map and the corresponding variance to obtain a fused height measurement value and a fused variance;
calculating a mahalanobis distance according to the height measurement value and the corresponding height scalar measurement value;
determining a final height update value for each cell grid in the environment map based on the Mahalanobis distance, the fused height measurement value, the height measurement value, and the height scalar measurement value;
and determining a final variance update value for each cell grid in the environment map according to the Mahalanobis distance, the fused variance, the variance corresponding to the height measurement value, and the variance corresponding to the height scalar measurement value.
5. The robot environment map updating method according to claim 4, wherein the calculating a variance corresponding to the height measurement value based on the height measurement value and a preset coordinate transformation matrix comprises:
respectively calculating the Jacobian of the distance sensor and the rotation Jacobian of the first coordinate system according to the height measurement value and a preset coordinate transformation matrix; wherein the first coordinate system and the second coordinate system are converted through the coordinate transformation matrix, and the second coordinate system is a coordinate system centered on the robot;
and calculating the variance corresponding to the height measurement value according to the Jacobian of the distance sensor and the rotation Jacobian of the first coordinate system.
6. The robot environment map updating method according to claim 4, wherein the performing incremental updating on the locally updated environment map according to the pose data to obtain an extended elevation grid map comprises:
under the condition that the robot moves, superimposing, according to the pose data, the third coordinate system at the current moment onto the third coordinate system at the previous moment to obtain a map common reference coordinate system;
wherein the pose data comprise the relative translation vector and rotation between the third coordinate system and the second coordinate system at the current moment, and the translation vector and rotation of the robot from the previous moment to the current moment in the second coordinate system; the third coordinate system is a map coordinate system with the robot as a center;
acquiring a variance updating value of each grid in the environment map, and calculating a spatial covariance matrix of the corresponding grid at the previous moment according to the variance updating value;
based on the map common reference coordinate system, calculating a spatial covariance matrix of the corresponding grid at the current moment according to the spatial covariance matrix of each grid at the previous moment;
calculating the Gaussian distribution among the spatial covariance matrix of each grid at the current moment, the second coordinate system corresponding to the previous moment, and the second coordinate system corresponding to the current moment to obtain an uncertainty estimate of the robot motion;
and calculating an updated value of the covariance matrix of each grid according to the uncertainty estimate, and determining a final variance update value of each grid according to the updated value of the covariance matrix of each grid.
7. The robot environment map updating method according to claim 4, wherein the obstacle passability detection result comprises: an inclined-obstacle passability detection result and a vertical-obstacle passability detection result;
and wherein the performing passability detection on the obstacles in the set range centered on the robot according to the updated environment map to obtain the obstacle passability detection result comprises:
acquiring a height measurement value within a set range centered on the robot;
calculating the slope value of the set range according to the height measurement value in the set range;
determining the inclined-obstacle passability detection result of the grid where the robot is located according to the slope value and a preset slope threshold;
calculating the maximum height difference between all grids in the footprint model coverage area and the central grid of the footprint model coverage area according to a preset footprint model of the robot;
and determining the vertical-obstacle passability detection result of the grid where the robot is located according to the maximum height difference value and a preset critical height value.
8. The robot environment map updating method according to claim 7, further comprising:
comparing the slope value with a preset slope threshold value;
determining that the inclined obstacle of the grid where the robot is located is impassable when the slope value is larger than the slope threshold;
and determining that the inclined obstacle of the grid where the robot is located is passable when the slope value is smaller than or equal to the slope threshold.
9. The robot environment map updating method according to claim 7, further comprising:
comparing the maximum height difference value with a preset critical height value;
determining that the vertical obstacle of the grid where the robot is located is impassable when the maximum height difference value is larger than the critical height value;
and determining that the vertical obstacle of the grid where the robot is located is passable when the maximum height difference value is smaller than or equal to the critical height value.
10. The robot environment map updating method according to claim 7, wherein generating a local environment reachability map based on the obstacle passability detection result and the updated environment map comprises:
performing weighted summation on the inclined-obstacle passability detection result and the vertical-obstacle passability detection result of any grid in the updated environment map to obtain a passability result of the grid;
and marking the passability result of the grid into the updated environment map to generate the local environment reachability map.
11. An environment map updating system for a robot, comprising:
the environment measurement data acquisition module is used for acquiring the sensed environment measurement data;
the pose data acquisition module is used for acquiring pose data of the robot;
the local environment sensing module is used for updating a pre-stored environment map according to the environment measurement data and the pose data;
the obstacle detection module is used for performing passability detection on obstacles in a set range centered on the robot according to the updated environment map to obtain an obstacle passability detection result;
and the reachability estimation module is used for generating a local environment reachability map according to the obstacle passability detection result and the environment map; wherein the local environment reachability map has an obstacle-passable mark and an obstacle-impassable mark.
12. An environment map updating apparatus of a robot, comprising a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, the processor implementing the environment map updating method of the robot according to any one of claims 1 to 10 when executing the computer program.
13. A computer-readable storage medium, comprising a stored computer program, wherein the computer program, when executed, controls an apparatus in which the computer-readable storage medium is located to perform the robot environment map updating method according to any one of claims 1 to 10.
CN202110175609.2A 2021-02-07 2021-02-07 Robot environment map updating method, system, equipment and storage medium Withdrawn CN112987728A (en)

Publications (1)

Publication Number: CN112987728A; Publication Date: 2021-06-18




Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
WW01: Invention patent application withdrawn after publication (application publication date: 2021-06-18)