CN108871311B - Pose determination method and device - Google Patents

Pose determination method and device

Info

Publication number
CN108871311B
CN108871311B (application CN201810550880.8A)
Authority
CN
China
Prior art keywords
image, time, electronic equipment, real time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810550880.8A
Other languages
Chinese (zh)
Other versions
CN108871311A (en)
Inventor
郭亨凯
陈尧
淮静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority to CN201810550880.8A
Publication of CN108871311A
Application granted
Publication of CN108871311B
Legal status: Active

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10004 - Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a pose determination method and device. The method comprises the following steps: acquiring first hardware parameter information of electronic equipment; matching the first hardware parameter information with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information; and acquiring the real-time attitude estimation of the electronic equipment in real time by using the target positioning algorithm. The method can be used across platforms and automatically adapts to different device models.

Description

Pose determination method and device
Technical Field
The invention relates to the technical field of electronic equipment, in particular to a pose determination method and a pose determination device.
Background
With the advent of the information age, electronic devices have become indispensable tools in daily life, and the demand for Augmented Reality (AR) technology on electronic devices has increased. The core technology of AR is simultaneous localization and mapping (SLAM); that is, the real-time attitude estimation of the electronic device is realized by SLAM technology.
Currently, the SLAM solutions for electronic devices mainly include Apple's ARKit and Google's ARCore, but the positioning algorithm adopted by ARKit cannot be used on electronic devices running the Android system, and the positioning algorithm adopted by ARCore cannot be used on electronic devices running the iOS system; that is, ARKit and ARCore cannot be used across platforms.
Disclosure of Invention
Based on this, it is necessary to provide a pose determination method and apparatus for the problem that the ARKit and the ARCore cannot be used across platforms.
In a first aspect, an embodiment of the present invention provides a pose determination method, including:
acquiring first hardware parameter information of electronic equipment;
matching the first hardware parameter information with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information;
and acquiring real-time attitude estimation of the electronic equipment in real time by using a target positioning algorithm.
In one embodiment, the obtaining the real-time attitude estimation of the electronic equipment in real time by using the target positioning algorithm includes:
acquiring an image in real time through a visual sensor of the electronic equipment;
extracting the positions of key points in the image;
extracting descriptors of the key points by using a descriptor extraction mode matched with the first hardware parameter information;
and determining real-time attitude estimation of the electronic equipment according to the positions of the key points in the image and the descriptors of each key point.
In one embodiment, determining a real-time pose estimate for an electronic device from keypoint locations in an image and a descriptor of each keypoint comprises:
correlating image features of the image acquired in real time with image features of a previous frame of image, wherein the image features comprise positions of key points in the image and descriptors of each key point;
and at least taking the associated image characteristics as observed quantities, and minimizing observation errors through a target positioning algorithm to obtain real-time attitude estimation of the electronic equipment.
In one embodiment, the associating the image features of the image acquired in real time with the image features of the previous frame of image comprises:
and using an image feature association algorithm matched with the first hardware parameter information to associate the image features of the image acquired in real time with the image features of the image of the previous frame.
In one embodiment, determining a real-time pose estimate for an electronic device from keypoint locations in an image and a descriptor of each keypoint comprises:
correlating image features of the image acquired in real time with image features of a previous frame of image, wherein the image features comprise positions of key points in the image and descriptors of each key point;
at least taking the associated image features as observed quantities to obtain an observation error;
acquiring the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame of image;
obtaining a first estimated two-dimensional translation amount between the previous frame image and the image acquired in real time by projecting the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image;
estimating a second estimated two-dimensional translation amount between the image acquired in real time and the previous frame image through an image tracking algorithm;
determining a first error according to the first estimated two-dimensional translation amount and the second estimated two-dimensional translation amount;
and obtaining real-time attitude estimation of the electronic equipment through the first error and the observation error according to a target positioning algorithm.
In one embodiment, the method further comprises the following steps:
acquiring acceleration information and angular velocity information in real time through an inertial measurement unit of the electronic equipment, and carrying out time synchronization on the acceleration information and the angular velocity information and image characteristics of an image acquired by a visual sensor;
at least taking the associated image characteristics as observed quantities, minimizing observation errors through a target positioning algorithm, and obtaining real-time attitude estimation of the electronic equipment, wherein the method comprises the following steps:
and combining the image characteristics, the acceleration information and the angular velocity information after time synchronization as observed quantities, and minimizing an observation error through a target positioning algorithm to obtain the real-time attitude estimation of the electronic equipment.
In one embodiment, taking at least the associated image features as observed quantities and minimizing the observation error through the target positioning algorithm to obtain the real-time attitude estimation of the electronic equipment includes:
taking at least one of the offset and the noise amplitude in the data information of the inertial measurement unit, the dynamic delay time between the time stamp of the data information of the inertial measurement unit and the time stamp of the image, external parameters of the electronic equipment and second hardware parameter information as an estimator of a target positioning algorithm, minimizing an observation error and obtaining real-time attitude estimation of the electronic equipment; wherein the external parameters of the electronic equipment comprise relative postures of the vision sensor and the inertial measurement unit.
In one embodiment, minimizing the observation error through the target positioning algorithm to obtain the real-time attitude estimation of the electronic equipment includes:
taking internal parameters of a visual sensor of the electronic equipment as random variables of a target positioning algorithm, calculating the distribution of observation errors, and taking the posture corresponding to the minimum value in the distribution of the observation errors as the real-time posture estimation of the electronic equipment; wherein the internal parameters of the visual sensor of the electronic equipment comprise at least one of a focal length and an optical center position of the visual sensor of the electronic equipment; wherein the random variables obey a gaussian distribution.
In a second aspect, an embodiment of the present invention provides a pose determination apparatus, including:
the first acquisition module is used for acquiring first hardware parameter information of the electronic equipment;
the matching module is used for matching the first hardware parameter information with the positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information;
and the second acquisition module is used for acquiring the real-time attitude estimation of the electronic equipment in real time by utilizing a target positioning algorithm.
In a third aspect, an electronic device provided in an embodiment of the present invention includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the method in any of the above embodiments when executing the computer program.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the method in any of the above embodiments.
According to the pose determination method and device, the obtained first hardware parameter information of the electronic equipment is matched with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information, and the matched target positioning algorithm is then used to obtain the real-time attitude estimation of the electronic equipment in real time. Because the target positioning algorithm is obtained by matching against the first hardware parameter information of the electronic device, it is independent of whether the device runs the Android system or the iOS system; the pose determination method can therefore be used across platforms. Moreover, since the target positioning algorithm only needs to match the first hardware parameter information, the method automatically adapts to different device models and does not require fine calibration measurements for each model.
Drawings
Fig. 1 is a schematic flowchart of a pose determination method according to an embodiment;
FIG. 2 is a flowchart illustrating steps performed by an object location algorithm in the pose determination method according to an embodiment;
FIG. 3 is a schematic flow chart of determining a real-time pose estimate of an electronic device in a pose determination method according to an embodiment;
FIG. 4 is a schematic flow chart illustrating a method for determining a real-time pose estimate of an electronic device according to another embodiment;
fig. 5 is a schematic structural diagram of a pose determination apparatus according to an embodiment;
fig. 6 is a schematic structural diagram of a pose determination apparatus according to another embodiment;
FIG. 7 is a diagram illustrating an internal structure of an electronic device in one embodiment.
Detailed Description
With the continuous development of information technology, more and more application programs run on electronic devices; for example, Augmented Reality (AR) technology is applied on mobile phones. The core technology of AR is simultaneous localization and mapping (SLAM), that is, the real-time attitude estimation of the electronic device is realized by SLAM technology. SLAM acquires the location of an electronic device in the real world by using the inputs of a visual sensor and an Inertial Measurement Unit (IMU), and models the real world. Currently, the SLAM solutions for electronic devices mainly include Apple's ARKit and Google's ARCore, but the positioning algorithm adopted by ARKit cannot be used on electronic devices running the Android system, and the positioning algorithm adopted by ARCore cannot be used on electronic devices running the iOS system; that is, ARKit and ARCore cannot be used across platforms. In addition, ARCore requires fine calibration measurements for each adapted model and cannot automatically adapt to different models. The pose determination method and device of the present application aim to solve these technical problems of the conventional technology.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions in the embodiments of the present invention are further described in detail by the following embodiments in conjunction with the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a schematic flowchart of a pose determination method according to an embodiment. The embodiment relates to a specific process for acquiring real-time attitude estimation of electronic equipment in real time according to a target positioning algorithm matched with first hardware parameter information. As shown in fig. 1, the method includes:
s101, acquiring first hardware parameter information of the electronic equipment.
Optionally, the electronic device may be, but is not limited to, various smart phones, tablet computers, and portable wearable devices. Optionally, the first hardware parameter information of the electronic device may include the computing capability of the CPU of the electronic device and whether the electronic device includes an inertial measurement unit (e.g., an accelerometer and a gyroscope). A CPU computing-power score is preset in the electronic device, and the computing capability of the CPU is graded according to this score. If the electronic device includes an inertial measurement unit, the first hardware parameter information may further include the manufacturer of the inertial measurement unit, the hardware parameters of the accelerometer and the gyroscope in the inertial measurement unit, the brands of the accelerometer and the gyroscope, and the like.
Optionally, after the application program determined based on the pose is started, the first hardware parameter information of the electronic device may be acquired. Optionally, the pose determination-based application program is an AR application program.
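For illustration only, the first hardware parameter information could be represented as a small record like the following Python sketch; every field name here is an assumption, since the patent only lists the kinds of information involved.

```python
# Hypothetical container for the "first hardware parameter information";
# field names are illustrative, not from the patent.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HardwareInfo:
    cpu_score: int                        # preset CPU computing-power score
    has_imu: bool                         # whether an IMU is present
    imu_vendor: Optional[str] = None      # IMU manufacturer, if any
    accel_model: Optional[str] = None     # accelerometer brand/model
    gyro_model: Optional[str] = None      # gyroscope brand/model
```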
S102, matching the first hardware parameter information with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information.
Specifically, the positioning algorithms may include a nonlinear Kalman filtering algorithm (low computational cost but limited precision) and nonlinear optimization algorithms (multiple iterations are needed, giving better precision at a higher computational cost), such as the Gauss-Newton method and the Levenberg-Marquardt method. Optionally, the positioning algorithms may be preset in the electronic device, in the cloud, or in any other location where they can be matched against the first hardware parameter information; this embodiment does not limit where the positioning algorithms are preset. Generally, a plurality of positioning algorithms are preset in the electronic device, in the cloud, or elsewhere, so that the electronic device can select the positioning algorithm matched with the first hardware parameter information from among them.
The computing capability of the CPU carries the highest weight in the first hardware parameter information of the electronic device; that is, when the first hardware parameter information is matched against the plurality of preset algorithms, the match is made mainly according to the computing capability of the CPU. For example, according to the obtained first hardware parameter information of the electronic device, if the computing capability of the CPU is strong, i.e., the CPU's computing-capability grade is high, a nonlinear optimization algorithm such as the Gauss-Newton method is matched, in combination with the other first hardware parameter information.
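A minimal sketch of this matching step is given below. The patent does not fix concrete thresholds or a registry of algorithms, so the score threshold and return labels are placeholders.

```python
# Hedged sketch of S102: pick the heavier nonlinear-optimization back end on
# strong CPUs, fall back to a nonlinear Kalman filter otherwise. The threshold
# value of 80 is an assumption for illustration.
def match_positioning_algorithm(cpu_score: int) -> str:
    if cpu_score >= 80:                   # strong CPU: iterative, more precise
        return "nonlinear_optimization"   # e.g. Gauss-Newton or Levenberg-Marquardt
    return "nonlinear_kalman_filter"      # cheaper, lower precision
```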
S103, acquiring real-time attitude estimation of the electronic equipment in real time by using a target positioning algorithm.
Specifically, the real-time attitude estimation of the electronic device includes a real-time translation amount and a real-time rotation amount of the electronic device. And calculating the real-time translation amount and the real-time rotation amount of the electronic equipment in real time according to the matched target positioning algorithm.
According to the pose determination method provided by this embodiment, the obtained first hardware parameter information of the electronic equipment is matched with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information, and the matched target positioning algorithm is then used to obtain the real-time attitude estimation of the electronic equipment in real time. Because the target positioning algorithm is obtained by matching against the first hardware parameter information of the electronic device, it is independent of whether the device runs the Android system or the iOS system; the pose determination method can therefore be used across platforms. Moreover, since the target positioning algorithm only needs to match the first hardware parameter information, the method automatically adapts to different device models and does not require fine calibration measurements for each model.
Fig. 2 is a flowchart illustrating steps executed by an object location algorithm in the pose determination method according to an embodiment. The embodiment relates to a specific process of how to obtain real-time attitude estimation of an electronic device in real time by using a target positioning algorithm. On the basis of the foregoing embodiment, optionally, the foregoing S103 may include:
s201, acquiring an image in real time through a visual sensor of the electronic equipment.
Specifically, the visual sensor may be a front-facing camera or a rear-facing camera. It should be noted that this embodiment does not limit the type of the visual sensor, as long as images can be acquired in real time.
S202, extracting the positions of key points in the image.
Optionally, the positions of the key points in the image acquired in real time may be extracted by using the Features from Accelerated Segment Test (FAST) algorithm, the Difference of Gaussians (DoG) algorithm, the Adaptive and Generic Accelerated Segment Test (AGAST) algorithm, the Binary Robust Invariant Scalable Keypoints (BRISK) algorithm, or other algorithms. The following description takes extracting the positions of key points in the image acquired in real time with the FAST algorithm as an example.
Optionally, according to the first hardware parameter information (mainly the computing capability of the CPU), the parameters of the functions in the FAST algorithm may be changed accordingly; of course, the parameters may also take fixed values regardless of the first hardware parameter information. For example, when the computing capability of the CPU is strong, relatively more key point positions can be extracted from the image; conversely, relatively fewer key point positions are extracted.
When the function parameters in the FAST algorithm change with the first hardware parameter information, the number of key point positions extracted from the image acquired in real time differs; the more key point positions are extracted, the more accurate the pose determination.
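As an illustration of S202, a FAST detector whose threshold depends on the CPU grade might look like the sketch below (using OpenCV); the specific threshold values are assumptions, not values from the patent.

```python
# Sketch of S202 with OpenCV's FAST detector: a lower detection threshold
# yields more keypoints (more accurate pose, more computation).
import cv2

def extract_keypoint_positions(gray_image, cpu_score: int):
    threshold = 10 if cpu_score >= 80 else 30   # hypothetical CPU-dependent values
    fast = cv2.FastFeatureDetector_create(threshold=threshold)
    keypoints = fast.detect(gray_image, None)   # list of cv2.KeyPoint
    return [kp.pt for kp in keypoints]          # (x, y) positions
```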
And S203, extracting the descriptors of the key points by using a descriptor extraction mode matched with the first hardware parameter information.
Specifically, according to the first hardware parameter information (mainly the computing capability of the CPU), a descriptor extraction manner matching the first hardware parameter information may be selected. Optionally, the descriptor extraction manner may be to extract an ORB descriptor with the ORB algorithm (slower) and use the ORB descriptor as the descriptor of the key point; the descriptor extraction manner may also be to take the original pixel values within a preset range around the key point position (faster, but less effective) as the descriptor of the key point, which requires no computation. For example, when the computing capability of the CPU is weak, the original pixel values within a preset range around the extracted key point position are selected as the descriptor of the key point; when the computing capability of the CPU is strong, the ORB algorithm is selected to extract the ORB descriptor as the descriptor of the key point. It should be noted that this embodiment does not limit the extraction manner of the descriptor.
In this way, different descriptor extraction manners are selected according to different first hardware parameter information, making the extraction of key point descriptors more flexible across devices; a sketch of the selection follows.
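The choice between the two extraction manners could be sketched as follows; the patch half-width and the score threshold are assumptions.

```python
# Sketch of S203: ORB descriptors on strong CPUs, raw pixel patches around
# each keypoint on weak ones (no descriptor computation needed).
import cv2
import numpy as np

def extract_descriptors(gray_image, keypoints, cpu_score: int):
    if cpu_score >= 80:                          # strong CPU: ORB descriptors
        orb = cv2.ORB_create()
        _, descriptors = orb.compute(gray_image, keypoints)
        return descriptors
    r = 4                                        # hypothetical patch half-width
    padded = np.pad(gray_image, r, mode="edge")  # handle keypoints near borders
    patches = []
    for kp in keypoints:
        x, y = int(kp.pt[0]) + r, int(kp.pt[1]) + r
        patches.append(padded[y - r:y + r + 1, x - r:x + r + 1].ravel())
    return np.array(patches)                     # raw pixel values as descriptors
```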
And S204, determining the real-time attitude estimation of the electronic equipment according to the positions of the key points in the image and the descriptors of each key point.
S204 can be implemented in several possible ways:
a first possible real-time approach: referring to fig. 3, the possible real-time approach includes the following steps:
s301, correlating image features of the image acquired in real time with image features of the image of the previous frame, wherein the image features comprise positions of key points in the image and descriptors of each key point.
Specifically, the image acquired in real time is displaced and rotated relative to the previous frame image; that is, the position of a key point in the image acquired in real time differs from the position of the same key point in the previous frame image, but the positions of the same key point in images at different times are correlated.
On the basis of the foregoing embodiment, optionally, the foregoing S301 specifically includes:
and using an image feature association algorithm matched with the first hardware parameter information to associate the image features of the image acquired in real time with the image features of the image of the previous frame.
Specifically, according to the first hardware parameter information of the electronic device (mainly the computing capability of the CPU), an image feature association algorithm matching the first hardware parameter information may be selected. Optionally, the image feature association algorithm may obtain the positions of the corresponding key points in the real-time image directly from the positions of the key points in the previous frame image based on an optical flow tracking algorithm (relatively fast), or it may obtain the key points in the previous frame image matched with the key points in the real-time image based on a local matching algorithm (relatively slow but more accurate). With the former, the position of any key point in the previous frame image and the image acquired in real time are used as the input of the optical flow tracking algorithm, which computes the position of the corresponding key point in the image acquired in real time, thereby associating the image features of the image acquired in real time with those of the previous frame image. With the latter, all image features of the previous frame image, the position of a key point of the image acquired in real time, and the descriptor corresponding to that key point are used as the input of the local matching algorithm, which outputs the key point in the previous frame image matched with that key point; if no matched key point is output, that key point in the image acquired in real time is discarded. For example, when the computing capability of the CPU is weak, the optical flow tracking algorithm may be selected to associate the image features of the image acquired in real time with those of the previous frame image; when the computing capability of the CPU is strong, the local matching algorithm may be selected; and when the computing capability of the CPU is stronger still, both algorithms may be selected simultaneously.
By selecting different image feature association algorithms according to the first hardware parameter information, the key points of the image acquired in real time are matched with the key points of the previous frame image. When the computing capability of the CPU is strong enough, the two image feature association algorithms run simultaneously, so that more key points are matched between the two images and the association between the image features of the image acquired in real time and those of the previous frame image is better.
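A sketch of the optical-flow variant of the association step is given below; the local-matching variant would instead compare descriptors within a search window. Function parameters are left at OpenCV defaults.

```python
# Sketch of the optical-flow association: track keypoints from the previous
# frame into the current frame and keep only those tracked successfully.
import cv2

def associate_by_optical_flow(prev_gray, cur_gray, prev_pts):
    # prev_pts: N x 1 x 2 float32 array of keypoint positions in the previous frame
    cur_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray,
                                                     prev_pts, None)
    tracked = status.ravel() == 1                # drop keypoints that were lost
    return prev_pts[tracked], cur_pts[tracked]
```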
S302, at least taking the associated image features as observed quantities, and minimizing observation errors through a target positioning algorithm to obtain real-time attitude estimation of the electronic equipment.
Specifically, an observation is the interpretation of a measurement.
The observation error is the error introduced when the associated image features are used as the observed quantity in the observation process.
In the embodiment, the associated image features are used as observed quantities, and the observation errors are minimized through a target positioning algorithm matched with the first hardware parameter information of the electronic equipment, so that the pose determination method provided by the embodiment can be used across platforms.
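The patent does not commit to a specific cost function, but for a nonlinear-optimization target algorithm, S302 could look roughly like the following reprojection-error sketch; `least_squares` stands in for the Gauss-Newton-style solver, and the 3-D landmark positions are assumed to be available from earlier frames.

```python
# Hedged sketch of S302: treat associated keypoints as observations and choose
# the pose (3 rotation + 3 translation parameters) minimizing the total
# observation (reprojection) error.
import cv2
from scipy.optimize import least_squares

def observation_residuals(pose, pts3d, pts2d, K):
    rvec, tvec = pose[:3], pose[3:]              # axis-angle rotation, translation
    proj, _ = cv2.projectPoints(pts3d, rvec, tvec, K, None)
    return (proj.reshape(-1, 2) - pts2d).ravel() # per-keypoint observation error

def estimate_pose(pose0, pts3d, pts2d, K):
    result = least_squares(observation_residuals, pose0, args=(pts3d, pts2d, K))
    return result.x                              # real-time pose estimate
```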
A second possible implementation: referring to fig. 4, this implementation includes the following steps:
s401, correlating image features of the image acquired in real time with image features of the previous frame of image, wherein the image features comprise positions of key points in the image and descriptors of each key point.
The specific process of how to associate the image features of the image acquired in real time with the image features of the image of the previous frame refers to the above statement of S301, and is not described herein again.
S402, taking at least the associated image features as observed quantities to obtain the observation error.
Specifically, the statement of the observation error is referred to the above S302, and is not described herein again.
And S403, acquiring the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image.
Optionally, the Pose from Orthography and Scaling with Iterations (POSIT) algorithm may be used to obtain the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image; of course, other algorithms may also be used. It should be noted that this embodiment does not limit which algorithm is used to acquire the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image.
S404, obtaining a first estimated two-dimensional translation amount between the previous frame image and the image acquired in real time by projecting the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image.
Optionally, the manner of obtaining the first estimated two-dimensional translation amount includes but is not limited to: projecting the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image to estimate the two-dimensional translation amount between the previous frame image and the image acquired in real time.
S405, estimating a second estimated two-dimensional translation amount between the image acquired in real time and the previous frame image through an image tracking algorithm.
Optionally, the manner of obtaining the second estimated two-dimensional translation amount includes but is not limited to: estimating the two-dimensional translation amount between the image acquired in real time and the previous frame image through an image tracking algorithm.
S406, determining a first error according to the first estimated two-dimensional translation amount and the second estimated two-dimensional translation amount.
Specifically, the squared difference of the first estimated two-dimensional translation amount and the second estimated two-dimensional translation amount may be taken as the first error.
And S407, obtaining real-time attitude estimation of the electronic equipment through the first error and the observation error according to a target positioning algorithm.
Specifically, the electronic device may minimize a sum of the first error and the observation error according to a target positioning algorithm, and the attitude estimation corresponding to the minimum sum of the first error and the observation error is the obtained real-time attitude estimation of the electronic device.
In the pose determination method provided by this embodiment, a first estimated two-dimensional translation amount is obtained by projecting the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image, a second estimated two-dimensional translation amount is estimated by an image tracking algorithm, and the squared difference between the two is taken as the first error; the attitude estimate that minimizes the sum of the first error and the observation error is the real-time attitude estimation of the electronic device. In this embodiment, an error term is added to the observation error, i.e., it serves as a constraint term on the observation error, so that the obtained real-time attitude estimation of the electronic device is more accurate and the stability of real-time positioning is improved.
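In code, the first error and the combined cost of S406-S407 could be sketched as below; how the two translation estimates are produced is covered by S404 and S405, so they enter here as plain inputs.

```python
# Sketch of S406-S407: squared difference of the two estimated 2-D translations
# as the first error, added to the observation error as a constraint term.
import numpy as np

def first_error(t2d_projected, t2d_tracked):
    d = np.asarray(t2d_projected) - np.asarray(t2d_tracked)
    return float(d @ d)                 # squared difference (S406)

def total_cost(observation_error, t2d_projected, t2d_tracked):
    # the pose minimizing this sum is the real-time attitude estimate (S407)
    return observation_error + first_error(t2d_projected, t2d_tracked)
```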
On the basis of the above embodiment, the method may further acquire acceleration information and angular velocity information in real time through an inertial measurement unit of the electronic device, and time-synchronize the acceleration information and the angular velocity information with the image features of the image acquired by the visual sensor. Correspondingly, the specific implementation of taking at least the associated image features as the observed quantity and minimizing the observation error through the target positioning algorithm may be: combining the time-synchronized image features, acceleration information, and angular velocity information as the observed quantity, and minimizing the observation error through the target positioning algorithm to obtain the real-time attitude estimation of the electronic device.
Specifically, an Inertial Measurement Unit (IMU) may be a device for measuring three-axis attitude angles (or angular velocities) and acceleration of an object. Wherein, the IMU may comprise three single-axis accelerometers and three single-axis gyroscopes, the accelerometers detecting acceleration information of the object in three independent axes of the carrier coordinate system, and the gyroscopes detecting angular velocity information of the carrier relative to the navigation coordinate system, i.e. measuring angular velocity and acceleration of the object in three-dimensional space.
Time-synchronizing the acceleration information, the angular velocity information, and the image features specifically means collecting them in real time and packing together those collected within the same time period. For example, if image features of 30 frames are extracted per second while 60 acceleration and angular velocity samples are collected per second, then the image features of one frame are packed with the two acceleration/angular-velocity samples from the same time period. Combining the time-synchronized image features, acceleration information, and angular velocity information as the observed quantity means using the packed image features, acceleration information, and angular velocity information of the same time period as the observed quantity. In this embodiment, the observation error includes an image feature error and an IMU error. The image feature error includes the distance between the position of a key point in the image obtained at at least one historical time and the actual position of the corresponding key point in the current image, where a historical time is any time before the current time. The IMU error includes the difference between the real-time translation amount of the electronic device and the translation amount obtained from the IMU, and the difference between the real-time rotation amount of the electronic device and the rotation amount obtained from the IMU.
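A possible packing routine for this synchronization step is sketched below, assuming both streams carry timestamps and are sorted; the tuple layout is an assumption.

```python
# Sketch of the time synchronization: bucket IMU samples between consecutive
# image timestamps so each frame's features are packed with the acceleration /
# angular-velocity samples of the same time period (e.g. 1 frame + 2 samples).
def pack_synchronized(frames, imu_samples):
    # frames: sorted list of (t, image_features)
    # imu_samples: sorted list of (t, accel_xyz, gyro_xyz)
    packets, i = [], 0
    for (t_prev, _), (t_cur, feats) in zip(frames, frames[1:]):
        bucket = []
        while i < len(imu_samples) and imu_samples[i][0] <= t_cur:
            if imu_samples[i][0] > t_prev:
                bucket.append(imu_samples[i])
            i += 1
        packets.append((t_cur, feats, bucket))
    return packets
```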
According to the pose determining method provided by the embodiment, the acceleration information and the angular velocity information of the IMU are synchronized with the image characteristics and then combined to be used as the observed quantity, and then the observation error is minimized through a target positioning algorithm, so that the real-time pose estimation of the electronic equipment is obtained. In the embodiment, the acceleration information and the angular velocity information of the IMU are combined with the image characteristics after synchronization to serve as observed quantities, and the IMU error and the image characteristic error are adopted to obtain the real-time attitude estimation of the electronic equipment, so that the obtained real-time attitude estimation of the electronic equipment is more accurate, and the stability of real-time positioning is improved.
On the basis of the foregoing embodiment, optionally, taking at least the associated image features as observed quantities and minimizing the observation error through the target positioning algorithm to obtain the real-time attitude estimation of the electronic equipment includes:
taking at least one of the offset and the noise amplitude in the data information of the inertial measurement unit, the dynamic delay time between the time stamp of the data information of the inertial measurement unit and the time stamp of the image, external parameters of the electronic equipment and second hardware parameter information as an estimator of a target positioning algorithm, minimizing an observation error and obtaining real-time attitude estimation of the electronic equipment; wherein the external parameters of the electronic equipment comprise relative postures of the vision sensor and the inertial measurement unit.
Specifically, the estimator of the target location algorithm is a dependent variable of an objective function in the target location algorithm.
Specifically, for electronic devices of different models, the offset and noise amplitude in the IMU data information differ, which affects the positioning effect of the target positioning algorithm. Therefore, during initialization the target positioning algorithm requires the electronic device to remain static for a preset duration (e.g., 1 s) in order to estimate the initial values of the IMU's offset and noise amplitude, so that when the offset and noise amplitude of the IMU are used as estimators of the target positioning algorithm, the real-time attitude estimation of the electronic device can be obtained more accurately.
Specifically, due to the fact that the data of the electronic device is acquired through hardware, dynamic delay time exists between a timestamp of IMU data information of the electronic device (the timestamp of accelerometer data information and the timestamp of gyroscope data information of the IMU are the same) and a timestamp of an image, and when the dynamic delay time between the timestamp of inertial measurement unit data information and the timestamp of the image is used as an estimator of a target positioning algorithm, real-time attitude estimation of the electronic device can be obtained more accurately.
Optionally, the external parameters of the electronic device may include the relative pose of the visual sensor of the electronic device and the IMU. When the relative attitude of the vision sensor of the electronic equipment and the IMU is used as the estimator of the target positioning algorithm, the real-time attitude estimation of the electronic equipment can be obtained more accurately.
Specifically, the second hardware parameter information may include hardware parameter information whose value is unknown. Optionally, the second hardware parameter information may include the rolling shutter time. When the rolling shutter time is used as an estimator of the target positioning algorithm, the real-time attitude estimation of the electronic equipment can be obtained more accurately.
In the pose determination method provided by this embodiment, at least one of the offset and the noise amplitude in the data information of the inertial measurement unit, the dynamic delay time between the time stamp of the data information of the inertial measurement unit and the time stamp of the image, the external parameters of the electronic device, and the second hardware parameter information is used as an estimator of the target positioning algorithm. Each estimator makes the obtained real-time attitude estimation of the electronic device more accurate, and combining several estimators further improves its accuracy.
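The static initialization mentioned above (holding the device still for about 1 s to seed the offset and noise-amplitude estimators) might be sketched as follows; assuming gravity is aligned with the z axis here is a simplification for illustration.

```python
# Sketch of the static IMU initialization: at rest, the mean gyroscope reading
# is its offset, the mean accelerometer reading minus gravity is its offset,
# and the sample standard deviations seed the noise amplitudes.
import numpy as np

def init_imu_offset_noise(gyro_samples, accel_samples, gravity=9.81):
    gyro = np.asarray(gyro_samples)      # N x 3 readings, device at rest
    accel = np.asarray(accel_samples)    # N x 3 readings, device at rest
    gyro_offset = gyro.mean(axis=0)      # true angular rate is zero at rest
    accel_offset = accel.mean(axis=0) - np.array([0.0, 0.0, gravity])
    noise = {"gyro": gyro.std(axis=0), "accel": accel.std(axis=0)}
    return gyro_offset, accel_offset, noise
```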
On the basis of the foregoing embodiment, optionally, minimizing the observation error through the target positioning algorithm to obtain the real-time attitude estimation of the electronic equipment includes:
taking internal parameters of a visual sensor of the electronic equipment as random variables of a target positioning algorithm, calculating the distribution of observation errors, and taking the posture corresponding to the minimum value in the distribution of the observation errors as the real-time posture estimation of the electronic equipment; wherein the internal parameters of the visual sensor of the electronic equipment comprise at least one of a focal length and an optical center position of the visual sensor of the electronic equipment; wherein the random variables obey a gaussian distribution.
Specifically, the internal parameters of the visual sensor of the electronic device are used as random variables of the target positioning algorithm, and these random variables obey a Gaussian distribution. The distribution of the observation errors is calculated from the Gaussian distribution obeyed by the internal parameters of the visual sensor, the minimum value is selected from the resulting distribution, and the attitude corresponding to that minimum value is taken as the real-time attitude estimation of the electronic device.
In the pose determining method provided by this embodiment, the distribution of the observation errors is obtained through the distribution of the internal parameters of the visual sensor of the electronic device, and the minimum value is selected from the distribution of the observation errors, where the pose corresponding to the minimum value is the real-time pose estimation of the electronic device. The real-time attitude estimation of the electronic equipment is more accurate according to the distribution of the internal parameters of the vision sensor of the electronic equipment.
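One way to realize "internal parameters as Gaussian random variables" is simple sampling, sketched below; the solver callback, means, and standard deviations are assumptions, since the patent does not specify how the distribution of observation errors is computed.

```python
# Sketch: sample focal length and optical center from Gaussian distributions,
# evaluate the observation error for each sample, and keep the pose attached
# to the smallest error in the resulting distribution.
import numpy as np

def pose_from_intrinsic_distribution(solve_fn, f_mean, f_sigma,
                                     c_mean, c_sigma, n_samples=100, seed=0):
    # solve_fn(f, cx, cy) -> (observation_error, pose): hypothetical callback
    rng = np.random.default_rng(seed)
    best_err, best_pose = np.inf, None
    for _ in range(n_samples):
        f = rng.normal(f_mean, f_sigma)           # focal length ~ Gaussian
        cx, cy = rng.normal(c_mean, c_sigma, 2)   # optical center ~ Gaussian
        err, pose = solve_fn(f, cx, cy)
        if err < best_err:
            best_err, best_pose = err, pose
    return best_pose
```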
It should be understood that although the various steps in the flowcharts of figs. 1-4 are shown in an order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least some of the steps in figs. 1-4 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed sequentially; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 5 is a schematic structural diagram of a pose determination apparatus according to an embodiment. As shown in fig. 5, the apparatus includes: a first obtaining module 11, a matching module 12 and a second obtaining module 13.
Specifically, the first obtaining module 11 is configured to obtain first hardware parameter information of the electronic device;
the matching module 12 is configured to match the first hardware parameter information with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information;
and the second obtaining module 13 is configured to obtain a real-time attitude estimate of the electronic device in real time by using a target positioning algorithm.
The pose determination apparatus provided by this embodiment may implement the method embodiments described above, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 6 is a schematic structural diagram of a pose determination apparatus according to another embodiment. On the basis of the embodiment shown in fig. 5, as shown in fig. 6, the second obtaining module 13 includes: an acquisition unit 131, a first extraction unit 132, a second extraction unit 133, and a determination unit 134.
Specifically, the acquiring unit 131 is configured to acquire an image in real time through a visual sensor of the electronic device;
a first extraction unit 132 for extracting the positions of key points in the image;
a second extracting unit 133, configured to extract descriptors of the key points by using a descriptor extracting manner matched with the first hardware parameter information;
a determining unit 134 for determining a real-time pose estimate of the electronic device according to the locations of the keypoints in the image and the descriptor of each keypoint.
The pose determination apparatus provided by this embodiment may implement the method embodiments described above, and the implementation principle and the technical effect are similar, which are not described herein again.
On the basis of the foregoing embodiment, the determining unit 134 is further configured to associate image features of an image acquired in real time with image features of an image of a previous frame, where the image features include positions of key points in the image and descriptors of each key point; and at least taking the associated image characteristics as observed quantities, and minimizing observation errors through a target positioning algorithm to obtain real-time attitude estimation of the electronic equipment.
The pose determination apparatus provided by this embodiment may implement the method embodiments described above, and the implementation principle and the technical effect are similar, which are not described herein again.
Optionally, the determining unit 134 is further configured to associate the image feature of the image acquired in real time with the image feature of the previous frame of image by using an image feature association algorithm matched with the first hardware parameter information.
The pose determination apparatus provided by this embodiment may implement the method embodiments described above, and the implementation principle and the technical effect are similar, which are not described herein again.
On the basis of the foregoing embodiment, the determining unit 134 is further configured to associate image features of an image acquired in real time with image features of the previous frame image, where the image features include the positions of key points in the image and the descriptor of each key point; take at least the associated image features as observed quantities to obtain an observation error; acquire the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image; obtain a first estimated two-dimensional translation amount between the previous frame image and the image acquired in real time by projecting the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image; estimate a second estimated two-dimensional translation amount between the image acquired in real time and the previous frame image through an image tracking algorithm; determine a first error according to the first estimated two-dimensional translation amount and the second estimated two-dimensional translation amount; and obtain the real-time attitude estimation of the electronic equipment through the first error and the observation error according to the target positioning algorithm.
The pose determination apparatus provided by this embodiment may implement the method embodiments described above, and the implementation principle and the technical effect are similar, which are not described herein again.
On the basis of the above embodiment, the apparatus further includes: and a synchronization module.
Specifically, the synchronization module is configured to acquire acceleration information and angular velocity information in real time through an inertial measurement unit of the electronic device, and perform time synchronization on the acceleration information and the angular velocity information and image features of an image acquired by the vision sensor;
correspondingly, the determining unit 134 is further configured to combine the image features, the acceleration information, and the angular velocity information after time synchronization as an observed quantity, and minimize an observation error through a target location algorithm, so as to obtain a real-time attitude estimation of the electronic device.
The pose determination apparatus provided by this embodiment may implement the method embodiments described above, and the implementation principle and the technical effect are similar, which are not described herein again.
Optionally, the determining unit 134 is further configured to use at least one of a bias and a noise amplitude in the data information of the inertial measurement unit, a dynamic delay time between a timestamp of the inertial measurement unit and a timestamp of the image, an external parameter of the electronic device, and second hardware parameter information as an estimator of a target positioning algorithm, so as to minimize an observation error and obtain a real-time attitude estimate of the electronic device; wherein the external parameters of the electronic equipment comprise relative postures of the vision sensor and the inertial measurement unit.
The pose determination apparatus provided by this embodiment may implement the method embodiments described above, and the implementation principle and the technical effect are similar, which are not described herein again.
Optionally, the determining unit 134 is further configured to calculate a distribution of the observation errors by using an internal parameter of a visual sensor of the electronic device as a random variable of a target positioning algorithm, and use an attitude corresponding to a minimum value in the distribution of the observation errors as a real-time attitude estimation of the electronic device; wherein the internal parameters of the visual sensor of the electronic equipment comprise at least one of a focal length and an optical center position of the visual sensor of the electronic equipment; wherein the random variables obey a gaussian distribution.
The pose determination apparatus provided by this embodiment may implement the method embodiments described above, and the implementation principle and the technical effect are similar, which are not described herein again.
For specific definition of the pose determination device, reference may be made to the definition of the pose determination method above, and details are not repeated here. The respective modules in the above pose determination apparatus may be implemented in whole or in part by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent of a processor in the electronic device, or can be stored in a memory in the electronic device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, an electronic device is provided, the internal structure of which may be as shown in FIG. 7. The electronic device comprises a processor, a memory, a network interface, a display screen, and an input device connected through a system bus. The processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic device comprises a nonvolatile storage medium and an internal memory. The nonvolatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the nonvolatile storage medium. The network interface of the electronic device is used to connect and communicate with external electronic devices through a network. The computer program is executed by the processor to implement a pose determination method. The display screen of the electronic device may be a liquid crystal display screen or an electronic ink display screen, and the input device may be a touch layer covering the display screen, a key, a trackball or a touchpad arranged on the housing of the electronic device, or an external keyboard, touchpad, mouse, or the like.
Those skilled in the art will appreciate that the structure shown in FIG. 7 is a block diagram of only part of the structure related to the present application and does not limit the electronic devices to which the present application may be applied; a particular electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, an electronic device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
acquiring first hardware parameter information of the electronic device; matching the first hardware parameter information with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information; and obtaining the real-time attitude estimate of the electronic device in real time by using the target positioning algorithm.
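One plausible reading of the matching step is a dispatch keyed on hardware capability. The sketch below is an assumption for illustration; the application does not prescribe this mapping, the `HardwareInfo` fields, or the algorithm names:

```python
# A sketch of matching hardware parameter information to a positioning
# algorithm. Table contents and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class HardwareInfo:
    has_imu: bool
    cpu_tier: str  # e.g. "low", "mid", "high" -- illustrative tiers

def select_positioning_algorithm(hw: HardwareInfo) -> str:
    """Return the target positioning algorithm matched to the hardware."""
    if hw.has_imu and hw.cpu_tier == "high":
        return "tightly_coupled_vio"   # fuse images and IMU in one optimizer
    if hw.has_imu:
        return "loosely_coupled_vio"   # cheaper fusion for mid-range devices
    return "vision_only_odometry"      # no IMU available

print(select_positioning_algorithm(HardwareInfo(has_imu=True, cpu_tier="high")))
```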
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring an image in real time through a visual sensor of the electronic device; extracting the positions of key points in the image; extracting descriptors of the key points by using a descriptor extraction mode matched with the first hardware parameter information; and determining the real-time attitude estimate of the electronic device according to the positions of the key points in the image and the descriptor of each key point.
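A minimal sketch of these two extraction steps, assuming OpenCV is available: keypoint positions are detected first, and the descriptor extractor is then chosen to match the device's capability. The specific FAST/ORB/SIFT choices are illustrative, not mandated by this application:

```python
# A sketch of extracting keypoint positions, then computing descriptors
# with an extractor chosen per hardware capability. Assumes OpenCV >= 4.4.
import cv2

def extract_features(image_gray, cpu_tier: str):
    detector = cv2.FastFeatureDetector_create()   # keypoint positions only
    keypoints = detector.detect(image_gray, None)
    # Pick a descriptor extraction mode matched to the device's capability.
    if cpu_tier == "low":
        extractor = cv2.ORB_create()              # fast binary descriptors
    else:
        extractor = cv2.SIFT_create()             # richer, costlier descriptors
    keypoints, descriptors = extractor.compute(image_gray, keypoints)
    return keypoints, descriptors
```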
In one embodiment, the processor, when executing the computer program, further performs the steps of:
associating the image features of the image acquired in real time with the image features of the previous frame image, wherein the image features comprise the positions of the key points in the image and the descriptor of each key point; and taking at least the associated image features as observed quantities, and minimizing the observation error through the target positioning algorithm to obtain the real-time attitude estimate of the electronic device.
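Under the assumption that binary descriptors (e.g. ORB) and OpenCV are used, the association step can be illustrated with a brute-force matcher and Lowe's ratio test; the matcher itself is one of the components the method may select per hardware:

```python
# A sketch of associating features between the real-time image and the
# previous frame image. Assumes OpenCV and binary descriptors.
import cv2

def associate_features(desc_prev, desc_curr, ratio=0.75):
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    pairs = matcher.knnMatch(desc_prev, desc_curr, k=2)
    associated = []
    for pair in pairs:
        # The ratio test keeps only unambiguous associations.
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            associated.append(pair[0])
    return associated
```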
In one embodiment, the processor, when executing the computer program, further performs the steps of:
selecting an image feature association algorithm matched with the first hardware parameter information, and using the selected image feature association algorithm to associate the image features of the image acquired in real time with the image features of the previous frame image.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
associating the image features of the image acquired in real time with the image features of the previous frame image, wherein the image features comprise the positions of the key points in the image and the descriptor of each key point; taking at least the associated image features as observed quantities to obtain an observation error; acquiring the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image; obtaining a first estimated two-dimensional translation amount between the previous frame image and the image acquired in real time by projecting the estimated three-dimensional translation amount and three-dimensional rotation amount; obtaining a second estimated two-dimensional translation amount between the image acquired in real time and the previous frame image through an image tracking algorithm; determining a first error according to the first estimated two-dimensional translation amount and the second estimated two-dimensional translation amount; and obtaining the real-time attitude estimate of the electronic device through the first error and the observation error according to the target positioning algorithm.
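The two translation estimates and the error between them can be sketched as follows, assuming OpenCV and NumPy; pyramidal Lucas-Kanade optical flow stands in for "an image tracking algorithm" as one possible choice, and the pinhole projection is simplified for brevity:

```python
# A sketch of the two 2D-translation estimates and the first error.
import cv2
import numpy as np

def predicted_2d_translation(translation_3d, rotation_3d, points_3d, focal):
    """First estimate: project 3D points before and after the estimated motion."""
    rot, _ = cv2.Rodrigues(rotation_3d)           # rotation vector -> matrix
    moved = points_3d @ rot.T + translation_3d
    before = focal * points_3d[:, :2] / points_3d[:, 2:3]
    after = focal * moved[:, :2] / moved[:, 2:3]
    return np.mean(after - before, axis=0)

def tracked_2d_translation(prev_gray, curr_gray, prev_pts):
    """Second estimate: track keypoints with pyramidal Lucas-Kanade flow.
    prev_pts must be a float32 array of shape (N, 1, 2), as OpenCV expects."""
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, prev_pts, None)
    good = status.ravel() == 1
    diffs = (curr_pts[good] - prev_pts[good]).reshape(-1, 2)
    return diffs.mean(axis=0)

def first_error(predicted, tracked):
    """Difference between the two estimates, fed to the optimizer as an error term."""
    return float(np.linalg.norm(predicted - tracked))
```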
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring acceleration information and angular velocity information in real time through an inertial measurement unit of the electronic device, and time-synchronizing the acceleration information and the angular velocity information with the image features of the image acquired by the visual sensor; and combining the time-synchronized image features, acceleration information, and angular velocity information as observed quantities, and minimizing the observation error through the target positioning algorithm to obtain the real-time attitude estimate of the electronic device.
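Time synchronization can be illustrated by interpolating the IMU samples at each image timestamp, optionally shifted by the estimated clock offset; a minimal NumPy sketch with illustrative names follows:

```python
# A sketch of synchronizing IMU samples to an image timestamp by linear
# interpolation; time_offset is the estimated IMU-to-image clock delay.
import numpy as np

def imu_at_image_time(imu_times, imu_values, image_time, time_offset=0.0):
    """Interpolate per-axis IMU samples (accelerometer or gyroscope) at the image time.
    imu_times: (N,) ascending timestamps; imu_values: (N, 3) samples."""
    t = image_time + time_offset
    return np.array([np.interp(t, imu_times, imu_values[:, axis]) for axis in range(3)])

# Usage (hypothetical arrays): accel_sync = imu_at_image_time(accel_t, accel_xyz, frame_t)
```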
In one embodiment, the processor, when executing the computer program, further performs the steps of:
taking at least one of a bias and a noise amplitude in the data information of the inertial measurement unit, a dynamic delay time between the timestamps of the data information of the inertial measurement unit and the timestamps of the image, external parameters of the electronic device, and second hardware parameter information as estimated quantities of the target positioning algorithm, and minimizing the observation error to obtain the real-time attitude estimate of the electronic device; wherein the external parameters of the electronic device comprise the relative pose of the vision sensor and the inertial measurement unit.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
treating the internal parameters of the visual sensor of the electronic device as random variables of the target positioning algorithm, calculating the distribution of the observation error, and taking the attitude corresponding to the minimum of that distribution as the real-time attitude estimate of the electronic device; wherein the internal parameters of the visual sensor comprise at least one of the focal length and the optical center position of the visual sensor, and the random variables follow a Gaussian distribution.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored; the computer program, when executed by a processor, performs the steps of:
acquiring first hardware parameter information of the electronic device; matching the first hardware parameter information with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information; and obtaining the real-time attitude estimate of the electronic device in real time by using the target positioning algorithm.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring an image in real time through a visual sensor of the electronic device; extracting the positions of key points in the image; extracting descriptors of the key points by using a descriptor extraction mode matched with the first hardware parameter information; and determining the real-time attitude estimate of the electronic device according to the positions of the key points in the image and the descriptor of each key point.
In one embodiment, the computer program when executed by the processor further performs the steps of:
associating the image features of the image acquired in real time with the image features of the previous frame image, wherein the image features comprise the positions of the key points in the image and the descriptor of each key point; and taking at least the associated image features as observed quantities, and minimizing the observation error through the target positioning algorithm to obtain the real-time attitude estimate of the electronic device.
In one embodiment, the computer program when executed by the processor further performs the steps of:
selecting an image feature association algorithm matched with the first hardware parameter information, and using the selected image feature association algorithm to associate the image features of the image acquired in real time with the image features of the previous frame image.
In one embodiment, the computer program when executed by the processor further performs the steps of:
associating the image features of the image acquired in real time with the image features of the previous frame image, wherein the image features comprise the positions of the key points in the image and the descriptor of each key point; taking at least the associated image features as observed quantities to obtain an observation error; acquiring the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image; obtaining a first estimated two-dimensional translation amount between the previous frame image and the image acquired in real time by projecting the estimated three-dimensional translation amount and three-dimensional rotation amount; obtaining a second estimated two-dimensional translation amount between the image acquired in real time and the previous frame image through an image tracking algorithm; determining a first error according to the first estimated two-dimensional translation amount and the second estimated two-dimensional translation amount; and obtaining the real-time attitude estimate of the electronic device through the first error and the observation error according to the target positioning algorithm.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring acceleration information and angular velocity information in real time through an inertial measurement unit of the electronic device, and time-synchronizing the acceleration information and the angular velocity information with the image features of the image acquired by the visual sensor; and combining the time-synchronized image features, acceleration information, and angular velocity information as observed quantities, and minimizing the observation error through the target positioning algorithm to obtain the real-time attitude estimate of the electronic device.
In one embodiment, the computer program when executed by the processor further performs the steps of:
taking at least one of a bias and a noise amplitude in the data information of the inertial measurement unit, a dynamic delay time between the timestamps of the data information of the inertial measurement unit and the timestamps of the image, external parameters of the electronic device, and second hardware parameter information as estimated quantities of the target positioning algorithm, and minimizing the observation error to obtain the real-time attitude estimate of the electronic device; wherein the external parameters of the electronic device comprise the relative pose of the vision sensor and the inertial measurement unit.
In one embodiment, the computer program when executed by the processor further performs the steps of:
treating the internal parameters of the visual sensor of the electronic device as random variables of the target positioning algorithm, calculating the distribution of the observation error, and taking the attitude corresponding to the minimum of that distribution as the real-time attitude estimate of the electronic device; wherein the internal parameters of the visual sensor comprise at least one of the focal length and the optical center position of the visual sensor, and the random variables follow a Gaussian distribution.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments can be implemented by a computer program instructing the relevant hardware; the program can be stored in a nonvolatile computer-readable storage medium and, when executed, can include the processes of the above method embodiments. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include nonvolatile and/or volatile memory. Nonvolatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention patent. It should be noted that those skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (7)

1. A pose determination method, comprising:
acquiring first hardware parameter information of an electronic device;
matching the first hardware parameter information with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information;
acquiring an image in real time through a visual sensor of the electronic device;
extracting the positions of key points in the image;
extracting descriptors of the key points by using a descriptor extraction mode matched with the first hardware parameter information;
associating image features of an image acquired in real time with image features of a previous frame image, wherein the image features comprise positions of key points in the image and a descriptor of each key point;
taking the associated image features as observed quantities to obtain an observation error;
acquiring the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image;
obtaining a first estimated two-dimensional translation amount between the previous frame image and the image acquired in real time by projecting the estimated three-dimensional translation amount and three-dimensional rotation amount;
obtaining a second estimated two-dimensional translation amount between the image acquired in real time and the previous frame image through an image tracking algorithm;
determining a calculation error according to the first estimated two-dimensional translation amount and the second estimated two-dimensional translation amount;
and obtaining the real-time attitude estimate of the electronic device through the calculation error and the observation error according to the target positioning algorithm.
2. The method according to claim 1, wherein the associating the image features of the image acquired in real time with the image features of the previous frame image comprises:
using an image feature association algorithm matched with the first hardware parameter information to associate the image features of the image acquired in real time with the image features of the previous frame image.
3. The method of claim 1, further comprising:
acquiring acceleration information and angular velocity information in real time through an inertial measurement unit of the electronic device, and time-synchronizing the acceleration information and the angular velocity information with the image features of the image acquired by the vision sensor;
the taking the associated image features as observed quantities comprises:
combining the time-synchronized image features, the acceleration information, and the angular velocity information as the observed quantities.
4. The method of claim 3, wherein the taking the associated image features as observed quantities comprises:
taking at least one of a bias and a noise amplitude in the inertial measurement unit data information, a dynamic delay time between the timestamps of the inertial measurement unit data information and the timestamps of the images acquired in real time by the vision sensor, external parameters of the electronic device, and second hardware parameter information as estimated quantities of the target positioning algorithm; wherein the external parameters of the electronic device comprise the relative pose of the vision sensor and the inertial measurement unit, and the second hardware parameter information is a rolling shutter time.
5. A pose determination apparatus, comprising:
a first acquisition module, configured to acquire first hardware parameter information of an electronic device;
a matching module, configured to match the first hardware parameter information with a positioning algorithm to obtain a target positioning algorithm matched with the first hardware parameter information;
an acquisition unit, configured to acquire an image in real time through a visual sensor of the electronic device;
a first extraction unit, configured to extract positions of key points in the image;
a second extraction unit, configured to extract descriptors of the key points by using a descriptor extraction mode matched with the first hardware parameter information;
a determining unit, configured to: associate image features of an image acquired in real time with image features of a previous frame image, wherein the image features comprise positions of key points in the image and a descriptor of each key point; take the associated image features as observed quantities to obtain an observation error; acquire the three-dimensional translation amount and the three-dimensional rotation amount of the previous frame image; obtain a first estimated two-dimensional translation amount between the previous frame image and the image acquired in real time by projecting the estimated three-dimensional translation amount and three-dimensional rotation amount; obtain a second estimated two-dimensional translation amount between the image acquired in real time and the previous frame image through an image tracking algorithm; determine a calculation error according to the first estimated two-dimensional translation amount and the second estimated two-dimensional translation amount; and obtain the real-time attitude estimate of the electronic device through the calculation error and the observation error according to the target positioning algorithm.
6. An electronic device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 4 when executing the computer program.
7. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 4.