
CN115507839A - Mapping method, device, equipment and storage medium - Google Patents


Info

Publication number
CN115507839A
CN115507839A
Authority
CN
China
Prior art keywords
point cloud
frame
local
subgraph
cloud frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211104704.4A
Other languages
Chinese (zh)
Inventor
傅志刚
陶永康
彭登
南志捷
董博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Huitian Aerospace Technology Co Ltd
Original Assignee
Guangdong Huitian Aerospace Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Huitian Aerospace Technology Co Ltd filed Critical Guangdong Huitian Aerospace Technology Co Ltd
Priority to CN202211104704.4A priority Critical patent/CN115507839A/en
Priority to PCT/CN2022/131360 priority patent/WO2024050961A1/en
Publication of CN115507839A publication Critical patent/CN115507839A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3833Creation or updating of map data characterised by the source of data
    • G01C21/3837Data obtained from a single source
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application provides a mapping method, apparatus, device and storage medium, applied to a movable device configured with an observation device that comprises at least a radar. The method includes: acquiring a first point cloud frame collected by the radar at the current moment; if the first point cloud frame and a second point cloud frame collected by the radar at the moment immediately before the current moment satisfy an under-constrained condition, taking the second point cloud frame as the end frame of a first local subgraph of the region to be mapped, and taking the first point cloud frame as the start frame of a second local subgraph of the region to be mapped, where different local subgraphs correspond to different sub-regions of the region to be mapped; and generating a point cloud map of the region to be mapped based on the first local subgraph and the second local subgraph. By dividing two point cloud frames that are adjacent in acquisition time and exhibit the under-constrained problem into two different local subgraphs, no under-constrained condition exists within any single local subgraph, which ensures consistency with the real environment of the region to be mapped.

Description

Mapping method, device, equipment and storage medium
Technical Field
The present application relates to the field of automatic driving technologies, and in particular, to a mapping method, apparatus, device, and storage medium.
Background
In the field of automatic driving, a high-precision map constructed with Simultaneous Localization and Mapping (SLAM) technology is indispensable for realizing automatic driving functions, especially for flying automobiles, one of the future directions of vehicle development. However, because the environment of the region to be mapped and the position of the movable device (such as an automobile or aircraft) are unknown, an under-constrained problem may occur in regions with few geometric features, such as lawns and lakes. The size, position and boundary of such a region then become uncertain and deviate from the real environment of the region to be mapped, so full-terrain mapping cannot be realized.
Disclosure of Invention
In order to overcome the problems in the related art, the application provides a method, a device, equipment and a storage medium for creating a map.
According to a first aspect of embodiments of the present application, there is provided a mapping method applied to a mobile device, where the mobile device is configured with an observation device, and the observation device at least includes a radar, the method including:
acquiring a first point cloud frame acquired by a radar at the current moment;
if the first point cloud frame and a second point cloud frame acquired by the radar at the moment immediately before the current moment satisfy an under-constrained condition, taking the second point cloud frame as an end frame of a first local subgraph of a region to be mapped, and taking the first point cloud frame as a start frame of a second local subgraph of the region to be mapped; wherein different local subgraphs correspond to different sub-regions in the region to be mapped;
and generating a point cloud map of the to-be-mapped area based on the first local subgraph and the second local subgraph.
According to a second aspect of the embodiments of the present application, there is provided a mapping apparatus, applied to a mobile device, where the mobile device is configured with an observation device, and the observation device at least includes a radar, including:
the acquisition module is used for acquiring a first point cloud frame acquired by the radar at the current moment;
the determining module is used for taking the second point cloud frame as an end frame of a first local subgraph of a region to be mapped and taking the first point cloud frame as a start frame of a second local subgraph of the region to be mapped, under the condition that the first point cloud frame and a second point cloud frame acquired by the radar at the moment immediately before the current moment satisfy an under-constrained condition; wherein different local subgraphs correspond to different sub-regions in the region to be mapped;
and the generating module is used for generating a point cloud map of the to-be-mapped area based on the first local subgraph and the second local subgraph.
According to a third aspect of embodiments of the present application, there is provided an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of the first aspect when executing the program.
According to a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium storing a computer program for instructing associated hardware to perform the method of the first aspect.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
the method comprises the steps of establishing a map in a map area to be established in a local subgraph dividing mode, detecting under-constrained conditions existing in the map establishing process in real time, and dividing a first point cloud frame and a second point cloud frame into two different local subgraphs when the under-constrained conditions of the first point cloud frame and the second point cloud frame which are adjacent at the acquisition time occur. Therefore, under-constraint conditions do not exist in each local sub-graph, and consistency with the real environment of the region to be mapped can be ensured for the region with less geometric features.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1A is a flow chart illustrating a method of creating a map according to an exemplary embodiment of the present application.
FIG. 1B is a schematic diagram of a factor graph optimization illustrated herein according to an exemplary embodiment.
FIG. 1C is another factor graph optimization schematic shown herein according to an exemplary embodiment.
Fig. 2 is a block diagram of a mapping apparatus according to an exemplary embodiment of the present application.
Fig. 3 is a hardware structure diagram of an electronic device in which a mapping apparatus according to an exemplary embodiment is located.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
In the mapping process, for regions with few geometric features, such as planes like lawns and lakes, no correspondence can be established between the data acquired by the observation device in such regions at two adjacent moments. For example, the lawn boundary points in two point cloud frames of a lawn acquired by the radar at two adjacent moments cannot be matched, so the size, boundary and position of the lawn are uncertain; this is the under-constrained problem, and it makes the constructed map inaccurate. To overcome the under-constrained problem in the related art, the mapping method of the present application constructs the map of the region to be mapped by dividing it into local subgraphs, detects possible under-constrained conditions in real time during mapping, and, when an under-constrained condition occurs between a first point cloud frame and a second point cloud frame adjacent in acquisition time, divides the two frames into two different local subgraphs. Therefore, no under-constrained condition exists within any local subgraph, and consistency with the real environment of the region to be mapped can be guaranteed even for regions with few geometric features. The method is applicable to movable devices with self-localization and navigation requirements, including but not limited to automobiles, unmanned aerial vehicles and sweeping robots; an observation device for acquiring environmental information of the region to be mapped is configured on the movable device and comprises at least a radar.
Next, examples of the present application will be described in detail.
As shown in fig. 1A, fig. 1A is a flowchart of a mapping method according to an exemplary embodiment of the present application, including the following steps:
102, acquiring a first point cloud frame acquired by a radar at the current moment;
step 104, if the first point cloud frame and a second point cloud frame acquired by the radar at the moment immediately before the current moment satisfy an under-constrained condition, taking the second point cloud frame as an end frame of a first local subgraph of the region to be mapped, and taking the first point cloud frame as a start frame of a second local subgraph of the region to be mapped; wherein different local subgraphs correspond to different sub-regions in the region to be mapped;
and 106, generating a point cloud map of the to-be-created map area based on the first local subgraph and the second local subgraph.
During mapping, on the one hand the region to be mapped is localized and mapped simultaneously based on the position of the movable device; on the other hand, the map already created guides further mapping, so the real-time performance and accuracy of mapping must be ensured for the map to serve navigation on the movable device. Under-constrained detection is performed in real time during mapping: whenever the radar acquires a point cloud frame, it is judged whether the first point cloud frame acquired at the current moment and the second point cloud frame acquired at the moment immediately before exhibit the under-constrained problem. If so, a local subgraph (the first local subgraph) is constructed based on the second point cloud frame and the point cloud frames acquired by the radar before it, and the first point cloud frame is used as the start frame of a new local subgraph (the second local subgraph). For example, if the point cloud frames acquired by the radar from the start time t1 to the current time t3 are p1, p2 and p3 respectively, and p3 and p2 exhibit the under-constrained problem, then p1 and p2 belong to local subgraph a1 and p3 belongs to local subgraph a2. Because two point cloud frames with the under-constrained problem belong to two different local subgraphs, within each local subgraph no under-constrained problem exists between any two adjacent point cloud frames, and the position, size and shape of each local subgraph remain consistent with the corresponding sub-region of the region to be mapped.
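The splitting rule in this example can be sketched as follows; `under_constrained` is a stand-in for whatever detector is used, and the frame names and function signature are illustrative, not from the patent:

```python
def split_into_subgraphs(frames, under_constrained):
    """Partition point cloud frames, in acquisition order, into local
    subgraphs so that no two adjacent frames inside one subgraph are
    under-constrained with respect to each other."""
    subgraphs = [[frames[0]]]
    for prev, curr in zip(frames, frames[1:]):
        if under_constrained(prev, curr):
            # prev stays as the end frame of the current subgraph;
            # curr starts a new local subgraph.
            subgraphs.append([curr])
        else:
            subgraphs[-1].append(curr)
    return subgraphs
```

With frames p1, p2, p3 and (p2, p3) flagged as under-constrained, this yields the subgraphs {p1, p2} and {p3} from the example.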
Whether an under-constraint problem exists between two adjacent frames of point cloud frames can be judged based on state variables of the movable equipment, wherein the state variables represent the motion state and the position state of the movable equipment and at least comprise the pose of the movable equipment. Furthermore, the state variables may also include at least the velocity of the movable device, zero offset of the angular velocity of a gyroscope on the movable device, zero offset of the acceleration of the movable device, or gravitational acceleration, etc. It should be noted that the state variables of the movable device are all vectors in real space.
The state variables can be provided by a navigation device arranged on the movable device; the navigation device provides pose information of the movable device, and the state variables of the movable device at least fuse the pose information provided by the navigation device. Because the navigation device provides global observation information, i.e., pose information in the world coordinate system, fusing the data acquired by the navigation device into the state variables of the movable device helps solve the under-constrained problem on the one hand and the drift problem on the other. At the same time, the navigation device also provides a covariance matrix characterizing the accuracy of the state variables. In an embodiment of the present application, determining whether the first point cloud frame acquired at the current moment and the second point cloud frame acquired at the moment immediately before satisfy the under-constrained condition, based on the covariance matrix corresponding to the state variable of the movable device at the current moment, includes the following steps: calculating the eigenvalues of the covariance matrix corresponding to the state variable of the movable device at the current moment; and inputting the calculated eigenvalues into a pre-trained classifier, and determining whether the first point cloud frame and the second point cloud frame satisfy the under-constrained condition based on the classification result output by the classifier. Other under-constraint detection methods may also be adopted, which the present application does not limit.
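A minimal sketch of this detection step, assuming the pre-trained classifier is given as a black-box callable over the eigenvalue vector (the function names and the simple threshold classifier in the test are illustrative only):

```python
import numpy as np

def is_under_constrained(covariance, classify):
    """Compute the eigenvalues of the state-variable covariance matrix
    and let a pre-trained classifier decide whether the current and
    previous point cloud frames satisfy the under-constrained condition."""
    # A covariance matrix is symmetric, so eigvalsh applies; large
    # eigenvalues correspond to poorly constrained directions.
    eigenvalues = np.linalg.eigvalsh(covariance)
    return classify(eigenvalues)
```

For instance, a crude stand-in classifier could flag any eigenvalue above a tuned threshold.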
Under the condition that the state variables of the movable equipment are fused with the pose information provided by the navigation equipment, the pose information acquired by the navigation equipment has a certain magnitude of error, and the magnitude of the error is generally larger for mapping, so that the error of the system state variables is gradually increased along with the increase of the moving path of the movable equipment, and the mapping precision is influenced. In order to alleviate errors caused by the position and pose information provided by the navigation equipment to the state variables of the movable equipment, a local sub-graph can be reconstructed when the errors are accumulated to a certain degree, and the state variables of the movable equipment at the moment are determined again based on the position and pose information provided by the navigation equipment at the collection moment of the initial frame of the new local sub-graph, so that the errors caused by the position and pose information provided by the navigation equipment can be controlled within a certain range.
That is, whether to start constructing a new local subgraph can be determined by setting an upper limit on the error caused by the pose information provided by the navigation device, and a new local subgraph is constructed when this upper limit is exceeded at the current moment. The error upper limit can be represented by the moving distance of the movable device, or by the deviation between the state variable of the movable device at the current moment and the pose information provided by the navigation device at the current moment. When the deviation between the state variable of the movable device at the current moment and the pose information provided by the navigation device at the current moment exceeds a preset threshold, and/or when the accumulated moving distance of the movable device at the current moment exceeds a preset distance threshold, construction of a new local subgraph (the second local subgraph) is started: the first point cloud frame acquired at the current moment is taken as the start frame of the second local subgraph, and the second point cloud frame acquired at the previous moment is taken as the end frame of the first local subgraph. The deviation threshold between the state variable and the pose information provided by the navigation device, and the preset threshold for the accumulated moving distance, may be set based on the model and usage experience of the navigation device. In addition, whether a new local subgraph needs to be constructed can be judged based on other conditions capable of representing this error upper limit, which the present application does not limit.
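The two triggers can be combined as below; the thresholds and the use of a Euclidean norm for the pose deviation are assumptions for illustration, since the text only says the thresholds come from the navigation device's model and usage experience:

```python
import numpy as np

def should_start_new_subgraph(state_pose, nav_pose, travelled,
                              deviation_threshold, distance_threshold):
    """Return True when either error-upper-bound trigger fires: the
    state variable drifts too far from the navigation pose, or the
    accumulated moving distance exceeds its threshold."""
    deviation = np.linalg.norm(np.asarray(state_pose) - np.asarray(nav_pose))
    return deviation > deviation_threshold or travelled >= distance_threshold
```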
It can be understood that, based on the foregoing, a plurality of trigger conditions for reconstructing a new local sub-graph can be set simultaneously, and a new local sub-graph is reconstructed regardless of whether an under-constraint problem occurs or whether an error exceeds an upper limit, so that an error caused by pose information provided by a navigation device can be reduced while ensuring that no under-constraint problem exists inside each local sub-graph. It should be further noted that the state variable for detecting the under-constrained condition and for determining whether the error caused by the pose information provided by the navigation device exceeds the upper limit is a state variable corresponding to the movable device at the time when the radar acquires the point cloud frame.
In addition to the navigation device, the inertial measurement unit can also provide state variables of the movable device, and changes in the velocity and pose of the movable device can be determined based on the angular velocity and acceleration acquired by the inertial measurement unit. In an embodiment of the application, the observation device further includes an inertial measurement unit, and the state variable of the movable device is determined by fusing data collected by the navigation device, the radar and the inertial measurement unit. The iterative updating is carried out on the basis of the initial value of the state variable of the movable equipment every time the radar or the navigation equipment collects data, and the initial value of the state variable can be provided by the navigation equipment or set in advance by a user. The specific process of fusion is as follows: assuming that the initial value of the state variable at the initial time t0 is x0, if data d1 is acquired by the radar or the navigation equipment at the time t1, fusing motion data of the movable equipment acquired by the inertial measurement unit from the time t0 to the time t1 and the data d1 to determine the state variable x1 at the time t1 on the basis of x 0; if the radar or the navigation equipment acquires the data d2 at the moment t2, determining a state variable x2 at the moment t2 by fusing the motion data of the movable equipment acquired by the inertial measurement unit from the moment t1 to the moment t2 and the data d2 on the basis of x1, and the like. The process of data fusion may be implemented by a kalman filter.
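The iterative fusion pattern above — each new radar or navigation datum updating the state propagated since the previous datum — can be illustrated in a simplified scalar form (a real implementation fuses full state vectors with a Kalman filter over IMU-propagated predictions; this only shows the update loop):

```python
def kalman_update(x, p, z, r):
    """Fuse a predicted scalar state (mean x, variance p) with a
    measurement z of variance r; returns the updated (x, p)."""
    k = p / (p + r)  # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

def fuse_sequence(x0, p0, data):
    """Starting from the initial state x0, apply one update per
    radar/navigation datum in acquisition order, as in the
    t0 -> t1 -> t2 example above."""
    x, p = x0, p0
    for z, r in data:
        x, p = kalman_update(x, p, z, r)
    return x, p
```

Each additional measurement shrinks the state variance, mirroring how fused global observations bound the error growth.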
In order to further reduce the influence of errors caused by pose information provided by the navigation equipment on the accuracy of the state variables of the movable equipment, each local sub-graph can be subjected to error optimization, for example, a factor graph optimization method can be adopted for optimization, and two constraint conditions of prior constraint and association constraint are introduced. Each local sub-graph corresponds to a plurality of frames of point cloud frames, each frame of point cloud frame corresponding to a state variable of the mobile device at the time the frame of point cloud frame was acquired. That is, each local subgraph corresponds to a plurality of state variables, and optimization of the local subgraph is realized through optimization of the state variables. The priori constraint is an expected value and a covariance matrix corresponding to pose information acquired by navigation equipment from the acquisition time of a start frame of a local sub-image to the acquisition time of an end frame of the local sub-image; the association constraint is composed of a plurality of association factors, and each association factor is obtained by calculation based on a state variable and a covariance matrix corresponding to each point cloud frame in any two adjacent point cloud frames at any two acquisition moments corresponding to the local subgraph. Then, calculating a first attitude matrix based on the state variable corresponding to the end frame before optimization, calculating a second attitude matrix based on the state variable corresponding to the end frame after optimization, and determining the product of the first attitude matrix and the second attitude matrix as an update matrix; and updating the state variable and the local subgraph corresponding to the optimized ending frame based on the updating matrix.
As shown in fig. 1B, a factor graph optimization diagram according to an exemplary embodiment of the present application is constructed based on state variables of a movable device corresponding to a local sub-graph to be optimized at different times (time), the local sub-graph is constructed based on point cloud frames acquired by a radar at times t1-t6, corresponding to state variables x1-x6 of the movable device at times t1-t6, and each state variable corresponds to a covariance matrix. The navigation equipment respectively collects pose information y1, y3 and y6 at the time t1, t3 and t6, and then an expected value and a covariance matrix corresponding to y1 are used as prior factors of a state variable at the time t1 to control the adjustment range of the state variable at the time t 1; the expected value and the covariance matrix corresponding to y3 are used as prior factors of the state variable at the time t3, and the adjustment range of the state variable at the time t3 is controlled; and taking the expected value and the covariance matrix corresponding to the y6 as prior factors of the state variable at the time t6, controlling the adjustment range of the state variable at the time t6, and taking all the prior factors as prior constraints. And calculating correlation factors of the x1 and the x2 based on the covariance matrix z1 corresponding to the state variable x1 at the time t1 and the covariance matrix z2 corresponding to the state variable x2 at the time t2, controlling a difference between the x1 and the x2, and so on to obtain correlation factors of other two adjacent state variables, wherein all the correlation factors are used as correlation constraints. And then, based on a probability maximization method, calculating to obtain the optimized state variables x1'-x6' at the moments of t1-t 6.
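The structure of this factor graph — prior factors anchoring some states to navigation measurements, association factors tying adjacent states together — can be illustrated with a toy one-dimensional weighted least-squares version (real systems optimize SE(3) poses with dedicated solvers; everything here, including the function names, is a simplification):

```python
import numpy as np

def optimize_chain(n, priors, associations):
    """Toy 1-D factor-graph optimization as weighted least squares.

    n: number of state variables x_0..x_{n-1}.
    priors: dict {i: (y, sigma)} anchoring x_i to expected value y
            with standard deviation sigma (prior constraint).
    associations: list of (i, j, delta, sigma) penalizing the residual
            (x_j - x_i - delta) with standard deviation sigma.
    """
    rows, rhs = [], []
    for i, (y, sigma) in priors.items():
        r = np.zeros(n)
        r[i] = 1.0 / sigma          # weight each factor by 1/sigma
        rows.append(r)
        rhs.append(y / sigma)
    for i, j, delta, sigma in associations:
        r = np.zeros(n)
        r[i], r[j] = -1.0 / sigma, 1.0 / sigma
        rows.append(r)
        rhs.append(delta / sigma)
    x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return x
```

Maximizing the probability under Gaussian factors is equivalent to this weighted least-squares minimization, which is why the text's "probability maximization method" reduces to solving such a system.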
It should be noted that, due to the inconsistency of the sampling frequencies of the observation devices, data may not be acquired by the navigation device in the time period corresponding to the local sub-graph (for example, the navigation device does not acquire pose information at any time from t1 to t 6), in this case, the pose information of the navigation device at each time from t1 to t6 may be determined by an interpolation method, and an expected value and a covariance matrix corresponding to the pose information at least one time are selected as a priori constraints.
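When the navigation device's sampling times do not coincide with t1-t6, the missing pose samples can be interpolated; linear interpolation of the position component is one simple instance of the interpolation the text mentions (attitude would need e.g. quaternion slerp, omitted here, and the function name is illustrative):

```python
def interpolate_position(t, t0, p0, t1, p1):
    """Linearly interpolate a position between two navigation samples
    (t0, p0) and (t1, p1) at a query time t with t0 <= t <= t1."""
    alpha = (t - t0) / (t1 - t0)
    return [a + alpha * (b - a) for a, b in zip(p0, p1)]
```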
Then, a first pose matrix T6 is calculated based on the pre-optimization state variable x6 corresponding to the end-frame acquisition time t6, a second pose matrix T6' is calculated based on the post-optimization state variable x6' corresponding to the end-frame acquisition time t6, and the product of T6' and the inverse of T6 is used as the update matrix. In general, the update matrix can be expressed as:

T_u = T_k' · T_k^{-1}

where T_k is the pose matrix before optimization at moment k and T_k' is the pose matrix after optimization at moment k. Writing T_u in block form, its rotation part R_u is a 3×3 matrix formed by the angular deviation of the pose variable corresponding to the end frame of the local subgraph before and after optimization; it represents the amount by which the attitude angle is to be adjusted and corrects the attitude-angle deviation. Its translation part t_u is a 3×1 vector formed by the position deviation of the pose variable corresponding to the end frame before and after optimization; it represents the amount by which the position is to be adjusted and corrects the position deviation.

The state variable may be represented as X = {r, t, V, ba, bg, g}, where r denotes the attitude angle of the movable device, t its position, V its velocity, ba the zero offset of its acceleration, bg the zero offset of the angular velocity of the gyroscope on the movable device, and g the gravitational acceleration. The state variable updated by the update matrix is:

X_u = {log(R_u · exp(r)), R_u · t + t_u, R_u · V, ba, bg, R_u · g}

The local subgraph is updated by adjusting, based on the update matrix, the coordinates of each point in the point cloud frames corresponding to the subgraph. A local subgraph can be expressed as χ = {p_i, i = 1, 2, ..., N}, where p_i denotes the coordinates of a point in a point cloud frame, and the updated local subgraph can be expressed as χ_u = {R_u · p_i + t_u, i = 1, 2, ..., N}.
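Applying the rotation part R_u and translation part t_u to the subgraph points and to the rotated/translated components of the state variable can be sketched as follows (the log/exp update of the attitude angle is omitted; only the parts with a direct matrix form are shown, and the function name is illustrative):

```python
import numpy as np

def apply_update(R_u, t_u, points, position, velocity, gravity):
    """Update a local subgraph and state-variable components with the
    update matrix's rotation R_u (3x3) and translation t_u (3,)."""
    points_u = points @ R_u.T + t_u      # chi_u = {R_u p_i + t_u}
    position_u = R_u @ position + t_u    # R_u t + t_u
    velocity_u = R_u @ velocity          # R_u V
    gravity_u = R_u @ gravity            # R_u g (zero offsets ba, bg unchanged)
    return points_u, position_u, velocity_u, gravity_u
```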
After each local subgraph has been optimized, global optimization can be performed over the multiple optimized local subgraphs to further eliminate errors in the state variables. The optimization is similar to that of a single local subgraph: a factor graph optimization method is adopted and the two constraint types, prior constraint and association constraint, are introduced, except that the state variables corresponding to a single local subgraph are replaced by the state variables corresponding to the multiple local subgraphs. The state variable of each local subgraph is the updated state variable of the movable device at the acquisition time of that subgraph's end frame. The difference is that during global optimization each local subgraph takes, as its prior constraint, the expected value and covariance matrix corresponding to the pose information acquired by the navigation device at the time corresponding to the subgraph's end frame. The association constraint is composed of a plurality of association factors, each calculated from the updated state variable and covariance matrix corresponding to the end frame of a first local subgraph and those corresponding to the end frame of an adjacent second local subgraph, where the acquisition time of the end frame of the first local subgraph is adjacent to the acquisition time of the start frame of the second local subgraph. Because the pose information acquired by the navigation device is introduced as the prior constraint during global optimization, the under-constrained problem that may exist between two adjacent local subgraphs can be eliminated.
As shown in fig. 1C, which is another factor graph optimization diagram shown in this application according to an exemplary embodiment, a factor graph is constructed based on the state variables at the end frame acquisition times of a plurality of constructed local subgraphs, and global optimization is performed on the 6 local subgraphs g1-g6. The state variables corresponding to the local subgraphs are x1-x6, and each state variable corresponds to one covariance matrix. The navigation device collects pose information y1-y6 at the times corresponding to the end frames of the respective local subgraphs; the expected value and covariance matrix corresponding to y1 are used as the prior factor of local subgraph g1, the expected value and covariance matrix corresponding to y2 are used as the prior factor of local subgraph g2, and so on for the other local subgraphs, and all the prior factors are used as the prior constraint. The association factor of x1 and x2 is calculated based on the state variable x1 of local subgraph g1 and its corresponding covariance matrix z1, together with the state variable x2 of local subgraph g2 and its corresponding covariance matrix z2, so as to control the difference between x1 and x2; the association factors of the other pairs of adjacent local subgraphs are obtained in the same way, and all the association factors are used as the association constraint. Then, the optimized state variables x1'-x6' of each local subgraph are calculated based on a probability maximization method.
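Under Gaussian noise assumptions, the probability maximization over the subgraph state variables reduces to a weighted least-squares problem. The following sketch builds prior factors from the navigation poses y1-yn and association factors between adjacent subgraphs, and solves the resulting normal equations in closed form; the linear-state simplification and all names are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def optimize_subgraph_states(y, prior_covs, rel, assoc_covs):
    """Globally optimize subgraph end-frame states x1..xn (each d-dimensional).

    Prior factors pull each x_i toward the navigation pose y_i, weighted by
    the inverse of its covariance; association factors pull each difference
    x_{i+1} - x_i toward the relative estimate rel_i between adjacent
    subgraphs. With Gaussian factors, probability maximization is equivalent
    to weighted least squares, solved here via the normal equations H x = b.
    """
    n, d = y.shape
    H = np.zeros((n * d, n * d))
    b = np.zeros(n * d)
    for i in range(n):                      # prior factors
        W = np.linalg.inv(prior_covs[i])
        H[i*d:(i+1)*d, i*d:(i+1)*d] += W
        b[i*d:(i+1)*d] += W @ y[i]
    for i in range(n - 1):                  # association factors
        V = np.linalg.inv(assoc_covs[i])
        # residual (x_{i+1} - x_i) - rel_i has Jacobian blocks [-I, +I]
        H[i*d:(i+1)*d, i*d:(i+1)*d] += V
        H[(i+1)*d:(i+2)*d, (i+1)*d:(i+2)*d] += V
        H[i*d:(i+1)*d, (i+1)*d:(i+2)*d] -= V
        H[(i+1)*d:(i+2)*d, i*d:(i+1)*d] -= V
        b[i*d:(i+1)*d] -= V @ rel[i]
        b[(i+1)*d:(i+2)*d] += V @ rel[i]
    return np.linalg.solve(H, b).reshape(n, d)
```

When the navigation priors and the relative estimates are mutually consistent, the optimized states coincide with the priors; when they conflict, each state settles at a covariance-weighted compromise, which is how the prior constraint removes the under-constraint between adjacent subgraphs.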
Corresponding to the foregoing method embodiments, the present application also provides embodiments of a mapping apparatus and of the terminal to which the apparatus is applied. As shown in fig. 2, fig. 2 is a block diagram of a mapping apparatus 200 shown in the present application according to an exemplary embodiment, and the apparatus includes:
an obtaining module 210, configured to obtain a first point cloud frame acquired by a radar at a current time;
a determining module 220, configured to, when the first point cloud frame and a second point cloud frame acquired by the radar at the moment before the current moment meet an under-constrained condition, take the second point cloud frame as an end frame of a first local subgraph of a to-be-mapped region, and take the first point cloud frame as a start frame of a second local subgraph of the to-be-mapped region; wherein different local subgraphs correspond to different sub-regions in the to-be-mapped region;
a generating module 230, configured to generate a point cloud map of the to-be-mapped region based on the first local sub-graph and the second local sub-graph.
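The cooperation of the three modules can be sketched as follows. This is a minimal illustration only: the frame representation and the under-constraint predicate are placeholders supplied by the caller, and the class and method names are assumptions, not defined by the application.

```python
class MappingApparatus:
    """Sketch of the mapping apparatus 200: obtaining, determining,
    and generating modules operating on a stream of point cloud frames."""

    def __init__(self, is_under_constrained):
        # Predicate is_under_constrained(first_frame, second_frame) -> bool,
        # supplied by the caller (e.g. the eigenvalue/classifier check).
        self.is_under_constrained = is_under_constrained
        self.subgraphs = [[]]  # list of local subgraphs, each a list of frames

    def obtain(self, radar):
        """Obtaining module: acquire the first point cloud frame at the
        current time from the radar (radar.next_frame is assumed)."""
        return radar.next_frame()

    def determine(self, first_frame):
        """Determining module: if the new frame and the previous frame meet
        the under-constraint condition, the previous frame stays as the end
        frame of the current local subgraph and the new frame becomes the
        start frame of the next one; otherwise the current subgraph grows."""
        current = self.subgraphs[-1]
        if current and self.is_under_constrained(first_frame, current[-1]):
            self.subgraphs.append([first_frame])  # start frame of new subgraph
        else:
            current.append(first_frame)

    def generate(self):
        """Generating module: merge all local subgraphs into one point cloud
        map (here simply the concatenation of all frames' points)."""
        return [p for subgraph in self.subgraphs
                for frame in subgraph for p in frame]
```

For example, with frames represented as point lists and a toy predicate that flags a large jump between consecutive frames, the stream is split into one local subgraph per sub-region before the subgraphs are merged into the map.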
For the implementation processes of the functions and effects of the modules in the above apparatus, reference may be made to the implementation processes of the corresponding steps in the above method, and details are not repeated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, wherein the modules described as separate parts may or may not be physically separate, and the parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement without inventive effort.
The embodiments of the mapping apparatus in this application can be applied to an electronic device. The apparatus embodiments may be implemented by software, or by hardware, or by a combination of hardware and software. Taking software implementation as an example, as a logical apparatus, it is formed by the processor of the electronic device reading the corresponding computer program instructions from the nonvolatile memory into the memory and running them. In terms of hardware, as shown in fig. 3, which is a hardware structure diagram of an electronic device 300 where the mapping apparatus is located in an embodiment of the present application, in addition to the processor 310, the memory 330, the network interface 320, and the nonvolatile memory 340 shown in fig. 3, the electronic device where the mapping apparatus 331 is located may also include other hardware according to its actual function, which will not be described again.
Accordingly, the present application also provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the method according to any of the foregoing method embodiments when executing the program. The present application further provides a computer-readable storage medium storing a computer program for instructing associated hardware to implement the method according to any of the foregoing method embodiments.
The foregoing description has been directed to specific embodiments of this application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (11)

1. A mapping method, applied to a mobile device equipped with observation devices including at least radar, the method comprising:
acquiring a first point cloud frame acquired by a radar at the current moment;
if the first point cloud frame and a second point cloud frame acquired by the radar at the previous moment of the current moment meet an under-constrained condition, taking the second point cloud frame as an end frame of a first local subgraph of a region to be mapped, and taking the first point cloud frame as a start frame of a second local subgraph of the region to be mapped; wherein, different local subgraphs correspond to different sub-regions in the region to be mapped;
and generating a point cloud map of the to-be-mapped area based on the first local subgraph and the second local subgraph.
2. The method according to claim 1, characterized in that the observation device further comprises at least a navigation device, the state variable of the movable device corresponding to a covariance matrix provided by the navigation device characterizing the accuracy of the state variable;
judging whether the first point cloud frame and the second point cloud frame meet an under-constrained condition based on the following steps:
calculating an eigenvalue of the covariance matrix according to the covariance matrix corresponding to the state variable of the movable equipment at the current moment;
inputting the characteristic values into a pre-trained classifier, and determining whether the first point cloud frame and the second point cloud frame meet under-constrained conditions or not based on a classification result output by the classifier.
3. The method of claim 2, wherein the state variables of the mobile device fuse at least pose information provided by the navigation device; the state variables include at least a pose, the method further comprising:
if the deviation between the state variable of the movable equipment at the current moment and the pose information provided by the navigation equipment at the current moment exceeds a preset threshold value, taking the second point cloud frame as an end frame of a first local subgraph of the area to be mapped, and taking the first point cloud frame as a start frame of a second local subgraph of the area to be mapped;
and/or,
and if the accumulated moving distance of the movable equipment at the current moment reaches a preset distance threshold, taking the second point cloud frame as an end frame of a first local subgraph of the region to be mapped, and taking the first point cloud frame as a start frame of a second local subgraph of the region to be mapped.
4. The method of claim 2, wherein the observation device further comprises an inertial measurement unit, and the state variable of the movable device is determined by fusing data collected by the navigation device, the radar, and the inertial measurement unit.
5. The method of claim 4, wherein the fusing of the data collected by the navigation device, the radar, and the inertial measurement unit is accomplished by a Kalman filter.
6. The method of claim 2, wherein each of the local subgraphs corresponds to a plurality of frames of point cloud, each frame of point cloud corresponding to the state variable of the mobile device at the time the point cloud frame was acquired, the method further comprising:
optimizing a plurality of state variables corresponding to the local subgraph based on prior constraints and association constraints;
the priori constraint is an expected value and a covariance matrix corresponding to pose information acquired by the navigation equipment from the acquisition time of a start frame of the local sub-image to the acquisition time of an end frame of the local sub-image;
the association constraint is composed of a plurality of association factors, and the association factors are obtained by calculation based on state variables and covariance matrixes corresponding to each point cloud frame in any two adjacent point cloud frames at any two acquisition moments corresponding to the local subgraph;
calculating a first attitude matrix based on the state variable corresponding to the end frame before optimization, calculating a second attitude matrix based on the state variable corresponding to the end frame after optimization, and determining the product of the first attitude matrix and the second attitude matrix as an update matrix;
and updating the state variable and the local subgraph corresponding to the optimized ending frame based on the updating matrix.
7. The method of claim 6, further comprising:
optimizing the point cloud map of the region to be mapped based on the prior constraint and the association constraint;
the priori constraint is an expected value and a covariance matrix corresponding to pose information acquired by the navigation equipment at the moment corresponding to the end frame of each local sub-graph;
the association constraint is composed of a plurality of association factors, the association factors are calculated based on the updated state variable and covariance matrix corresponding to the end frame of a first local subgraph and the updated state variable and covariance matrix corresponding to the end frame of an adjacent second local subgraph, and the acquisition time of the end frame of the first local subgraph is adjacent to the acquisition time of the start frame of the second local subgraph.
8. The method of claim 2, wherein the state variables further include at least one of:
velocity, zero bias of gyroscope angular velocity, zero bias of acceleration, or gravitational acceleration.
9. An apparatus for mapping, applied to a mobile device equipped with observation devices including at least a radar, the apparatus comprising:
the acquisition module is used for acquiring a first point cloud frame acquired by the radar at the current moment;
the determining module is used for, under the condition that the first point cloud frame and a second point cloud frame acquired by the radar at the previous moment of the current moment meet an under-constrained condition, taking the second point cloud frame as an end frame of a first local subgraph of a to-be-mapped region and taking the first point cloud frame as a start frame of a second local subgraph of the to-be-mapped region; wherein different local subgraphs correspond to different sub-regions in the to-be-mapped region;
and the generating module is used for generating a point cloud map of the to-be-mapped area based on the first local subgraph and the second local subgraph.
10. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method of any one of claims 1 to 8 when executing the program.
11. A computer readable storage medium having stored thereon a computer program for instructing associated hardware to perform the method of any one of claims 1 to 8.
CN202211104704.4A 2022-09-09 2022-09-09 Mapping method, device, equipment and storage medium Pending CN115507839A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211104704.4A CN115507839A (en) 2022-09-09 2022-09-09 Mapping method, device, equipment and storage medium
PCT/CN2022/131360 WO2024050961A1 (en) 2022-09-09 2022-11-11 Mapping method and apparatus, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211104704.4A CN115507839A (en) 2022-09-09 2022-09-09 Mapping method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115507839A true CN115507839A (en) 2022-12-23

Family

ID=84503495

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211104704.4A Pending CN115507839A (en) 2022-09-09 2022-09-09 Mapping method, device, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN115507839A (en)
WO (1) WO2024050961A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105677458A (en) * 2014-11-21 2016-06-15 国际商业机器公司 Method and device for obtaining constrains of events
CN108592919B (en) * 2018-04-27 2019-09-17 百度在线网络技术(北京)有限公司 Drawing and localization method, device, storage medium and terminal device
JP2022542289A (en) * 2020-04-30 2022-09-30 上海商▲湯▼▲臨▼港智能科技有限公司 Mapping method, mapping device, electronic device, storage medium and computer program product
CN113819914B (en) * 2020-06-19 2024-06-07 北京图森未来科技有限公司 Map construction method and device
CN114910905A (en) * 2022-04-22 2022-08-16 北京理工大学 GEO satellite-machine bistatic SAR moving target intelligent imaging method under similarity constraint

Also Published As

Publication number Publication date
WO2024050961A1 (en) 2024-03-14


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination