CN108170297B - Real-time six-degree-of-freedom VR/AR/MR device positioning method
- Publication number
- CN108170297B (application CN201711399552.4A)
- Authority
- CN
- China
- Prior art keywords
- pose
- freedom
- degree
- imu
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
Abstract
The invention provides a six-degree-of-freedom positioning scheme that calculates the pose Pose(t) of a camera at time t; Pose(t) is applied directly to any frame that needs rendering at that moment. Because the six-degree-of-freedom computation is complex, the interval between the next pose Pose(t+1) and time t is long; when an intermediate frame needs rendering, Pose(t) is fused once with the IMU rotation matrix R_IMU and the result is applied directly to rendering, meeting the real-time rendering requirement. An IMU is added on top of a conventional six-degree-of-freedom scheme: the IMU module is cheap and light, the fusion algorithm consumes little CPU performance, and it is easy to add to an existing six-degree-of-freedom pipeline. The result is a six-degree-of-freedom positioning scheme with real-time performance.
Description
Technical Field
The invention relates to the field of image processing, in particular to a real-time six-degree-of-freedom VR/AR/MR device positioning method.
Background
Three-degree-of-freedom positioning schemes: techniques that use IMU data to compute the rotation of the device about three axes. One of the simplest methods is to integrate the gyroscope data in the IMU over time and apply the result to the VR device.
Six-degree-of-freedom positioning schemes: existing solutions generally use a camera and image-processing techniques to obtain both the rotation of the device about three axes and its displacement along three directions.
Existing VR devices support only three-degree-of-freedom positioning: the displayed picture rotates in three directions, but the user's displacement in space is not reflected in the picture, so the sense of immersion is insufficient. To achieve six-degree-of-freedom positioning, AR/MR devices generally compute the user's displacement by camera-based image processing; owing to the computational load and camera-hardware limits, a single pose computation usually takes far longer than 16 milliseconds, which fails the roughly 16.7 ms per-frame budget (1000 ms / 60 frames) that smooth 60 FPS rendering requires, so real-time positioning is impossible.
Disclosure of Invention
The invention aims to provide a six-degree-of-freedom positioning scheme that computes the user's pose accurately while meeting the computation speed required for rendering at 60 FPS or more.
The technical scheme is as follows:
a real-time six-degree-of-freedom VR/AR/MR device positioning method comprises the steps of firstly calculating a six-degree-of-freedom positioning scheme, calculating the pose (Pose) (t) of a camera at the moment t by the six-degree-of-freedom positioning scheme, and directly applying the Pose (t) when a picture has a rendering requirement; due to the complexity of a six-degree-of-freedom positioning scheme, the time interval between the next Pose (t +1) and the time t is longer, and when an intermediate picture has rendering requirements, the Pose (t) and the IMU rotation matrix R are usedIMUPerforming primary fusion, and directly applying the obtained result to rendering so as to meet the requirement of real-time rendering;
wherein, the fusion result of Pose (t) and IMU is PoseIMU(t)TCalculated according to the following formula:
PoseIMU(t)T=RIMU*Pose(t)T
in the formula, RIMUThe homogeneous matrix, 4 x 4, is the rotation matrix calculated by the IMU.
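As a concrete illustration, the fusion step is a single matrix product. The sketch below is a minimal Python/numpy rendering of it, assuming 4 × 4 homogeneous matrices stored in the transposed (^T) convention of the formula above; the function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def render_pose(pose_t_T: np.ndarray, r_imu: np.ndarray, slam_is_fresh: bool) -> np.ndarray:
    """Pose to use for the current frame.

    pose_t_T: Pose(t)^T, the transposed 4x4 camera pose from the 6-DoF pipeline.
    r_imu:    R_IMU, the 4x4 homogeneous rotation accumulated by the IMU since time t.
    """
    if slam_is_fresh:
        return pose_t_T          # a fresh Pose(t) is applied directly
    return r_imu @ pose_t_T      # Pose_IMU(t)^T = R_IMU * Pose(t)^T
```

Until Pose(t+1) arrives, each render tick calls this with the latest integrated IMU rotation, so intermediate frames follow head rotation at IMU rate.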
Specifically, the six-degree-of-freedom positioning scheme is calculated by using a SLAM algorithm or a PTAM algorithm.
Preferably, the calculation steps of the SLAM algorithm are as follows:
S101: camera calibration, aimed at obtaining the intrinsic (internal parameter) matrix of the camera:

K = [ fx  s   x0 ]
    [ 0   fy  y0 ]
    [ 0   0   1  ]

where fx and fy are the focal lengths of the camera, (x0, y0) are the principal-point coordinates, and s is the axis-skew parameter;
S102: positioning; after the camera is calibrated, the camera pose is solved from the pictures the camera captures;
S103: map reconstruction, converting the pictures captured by the camera into point clouds and re-representing the surrounding environment as point clouds through point-cloud stitching;
S104: loop detection; because the positioning result drifts from the true camera pose, the deviation is small over a short time but accumulates over time until the calculated pose differs greatly from the true pose; loop detection corrects the camera pose whenever the camera captures a view it has seen before;
The resulting camera pose matrix Pose(t) is the 4 × 4 homogeneous matrix

Pose(t) = [ R_3×3  T_3×1 ]
          [ 0      1     ]

where R_3×3 is a rotation matrix belonging to the SO(3) group, representing the three rotational quantities among the six degrees of freedom, and T_3×1 is a column vector whose elements are the camera's displacements along the three axes.
Specifically, the camera calibration in step S101 includes: linear calibration, nonlinear optimization calibration and two-step calibration.
Specifically, the method for solving the camera pose in step S102 is bundle adjustment.
Preferably, the IMU rotation matrix R_IMU is calculated by

R_IMU = α·R_gyro + β·R_acc + γ·R_mag

where α + β + γ = 1, and R_gyro, R_acc, R_mag are the attitude matrices calculated from the gyroscope, the accelerometer, and the magnetometer respectively.
Specifically, the values of α, β, γ are calculated by an EKF algorithm, a linear KF algorithm, or a Mahony filter.
The gyroscope is calibrated by averaging over multiple samples, or by first collecting the gyroscope's offset at rest and removing it when data are reported.
The accelerometer eliminates interference using the direction of gravitational acceleration at rest, as follows:

A = (a_x, a_y, a_z)
G = (g, 0, 0)
Δ = A − G
A'_t = A_t − Δ

where A is the accelerometer data at rest, G is the gravitational-acceleration vector, Δ is the accelerometer offset, A_t is the raw accelerometer data, and A'_t is the calibrated accelerometer data at time t.
Magnetometer calibration must consider whether an additional local magnetic field is present. If not, the magnetometer is calibrated with a planar calibration method or a figure-eight calibration method; in an environment containing an additional local magnetic field, calibration proceeds as follows:
S201: rotate the device horizontally through 360 degrees;
S202: find the minimum outputs X_min, Y_min and the maximum outputs X_max, Y_max of the magnetometer's horizontal-axis data;
S203: set the first scale factor X_s = 1;
S204: calculate the other scale factor Y_s;
S205: calculate the offset compensation
X_b = X_s[1/2(X_max − X_min) − X_max]
Y_b = Y_s[1/2(Y_max − Y_min) − Y_max]
S206: obtain the output
X_out = X_in · X_s + X_b
Y_out = Y_in · Y_s + Y_b
where X_out, Y_out are the calibrated magnetometer data, X_in, Y_in are the raw magnetometer data, and the Z axis does not need calibration when this method is used.
Advantages of the invention
The invention adds an IMU to a generic six-degree-of-freedom scheme, realizing a six-degree-of-freedom positioning scheme with real-time performance. The IMU module is cheap and light, so the cost is low; the fusion algorithm consumes little CPU performance; and the modification is easy to add to an existing six-degree-of-freedom scheme.
Drawings
FIG. 1 is a block diagram of a conventional SLAM algorithm scheme for six-degree-of-freedom positioning
FIG. 2 is a block diagram of a conventional scheme for calculating pose of a camera through IMU
FIG. 3 is a block diagram of a six degree-of-freedom positioning scheme of the present invention
Detailed Description
The invention will be further described in the following preferred embodiments with reference to the attached drawings, but the scope of protection of the invention is not limited thereto:
With reference to FIG. 3, a real-time six-degree-of-freedom VR/AR/MR device positioning method first runs a SLAM-based six-degree-of-freedom positioning scheme that calculates the pose Pose(t) of the camera at time t; Pose(t) is applied directly when a frame needs rendering. Because the six-degree-of-freedom computation is complex, the interval between the next pose Pose(t+1) and time t is long; when an intermediate frame needs rendering, the Pose(t) obtained from the SLAM calculation is fused once with the IMU rotation matrix R_IMU and the result is applied directly to rendering, meeting the real-time rendering requirement.

The fusion result of Pose(t) and the IMU, Pose_IMU(t)^T, is calculated according to the following formula:

Pose_IMU(t)^T = R_IMU * Pose(t)^T

where R_IMU, the rotation matrix calculated by the IMU, is a 4 × 4 homogeneous matrix.
(In other embodiments, the six-degree-of-freedom positioning scheme may be calculated based on a PTAM algorithm.)
With reference to FIG. 1, the camera configurations usable by the SLAM algorithm mainly include an RGBD camera, a binocular camera, or a monocular camera. The SLAM algorithm's calculation steps are:
S101: camera calibration (optionally linear calibration, nonlinear optimization calibration, or two-step calibration), aimed at obtaining the intrinsic matrix of the camera:

K = [ fx  s   x0 ]
    [ 0   fy  y0 ]
    [ 0   0   1  ]

where fx and fy are the focal lengths of the camera, (x0, y0) are the principal-point coordinates, and s is the axis-skew parameter;
S102: positioning; after the camera is calibrated, the camera pose is solved by bundle adjustment from the pictures the camera captures;
S103: map reconstruction, converting the pictures captured by the camera into point clouds and re-representing the surrounding environment as point clouds through point-cloud stitching;
S104: loop detection; because the positioning result drifts from the true camera pose, the deviation is small over a short time but accumulates over time until the calculated pose differs greatly from the true pose; loop detection corrects the camera pose whenever the camera captures a view it has seen before;
The resulting camera pose matrix Pose(t) is the 4 × 4 homogeneous matrix

Pose(t) = [ R_3×3  T_3×1 ]
          [ 0      1     ]

where R_3×3 is a rotation matrix belonging to the SO(3) group, representing the three rotational quantities among the six degrees of freedom, and T_3×1 is a column vector whose elements are the camera's displacements along the three axes. This pose matrix can be applied directly to rendering in graphics libraries and engines such as Direct3D, OpenGL, and Unity.
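The two matrices above are mechanical to assemble. The following Python sketch shows one plausible construction under the stated definitions; the function names are illustrative, and the handedness and row/column-vector conventions of the target graphics API still have to be matched.

```python
import numpy as np

def intrinsic_matrix(fx: float, fy: float, x0: float, y0: float, s: float = 0.0) -> np.ndarray:
    """Intrinsic matrix from S101: focal lengths fx, fy, principal point (x0, y0), skew s."""
    return np.array([[fx, s,  x0],
                     [0., fy, y0],
                     [0., 0., 1.]])

def pose_matrix(R: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Assemble Pose(t) = [[R, T], [0, 1]] from a 3x3 rotation R and a 3x1 translation T."""
    P = np.eye(4)
    P[:3, :3] = R
    P[:3, 3] = np.asarray(T).ravel()
    return P
```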
With reference to FIG. 2, the IMU rotation matrix R_IMU is calculated by

R_IMU = α·R_gyro + β·R_acc + γ·R_mag

where α + β + γ = 1 (the values of α, β, γ are calculated by an EKF algorithm, a linear KF algorithm, or a Mahony filter), and R_gyro, R_acc, R_mag are the attitude matrices calculated from the gyroscope, the accelerometer, and the magnetometer respectively.
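A minimal sketch of this weighted fusion follows. A weighted sum of rotation matrices is generally not itself a rotation, so the sketch projects the sum back onto SO(3) via an SVD; that projection is an assumption added here, not something the patent specifies.

```python
import numpy as np

def fuse_imu_rotation(R_gyro, R_acc, R_mag, alpha, beta, gamma):
    """R_IMU = alpha*R_gyro + beta*R_acc + gamma*R_mag, for 3x3 attitude matrices."""
    assert abs(alpha + beta + gamma - 1.0) < 1e-9   # the weights must sum to 1
    R = alpha * R_gyro + beta * R_acc + gamma * R_mag
    U, _, Vt = np.linalg.svd(R)                     # nearest rotation in Frobenius norm
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # enforce det = +1
    return U @ D @ Vt

def to_homogeneous(R3: np.ndarray) -> np.ndarray:
    """Embed the 3x3 result in the 4x4 homogeneous form used by the fusion formula."""
    R4 = np.eye(4)
    R4[:3, :3] = R3
    return R4
```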
In practice, R_gyro, R_acc, and R_mag require calibration, specifically:
the calibration mode of the gyroscope is as follows: taking an average value in multiple sampling; or firstly collecting the offset of the gyroscope in a static state and removing the offset when reporting data.
The accelerometer eliminates interference using the direction of gravitational acceleration at rest, as follows:

A = (a_x, a_y, a_z)
G = (g, 0, 0)
Δ = A − G
A'_t = A_t − Δ

where A is the accelerometer data at rest, G is the gravitational-acceleration vector, Δ is the accelerometer offset, A_t is the raw accelerometer data, and A'_t is the calibrated accelerometer data at time t.
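A minimal sketch of this calibration, assuming the (g, 0, 0) gravity convention above, an assumed g of 9.81 m/s², and averaging over a batch of samples captured at rest; the names are illustrative.

```python
import numpy as np

G = np.array([9.81, 0.0, 0.0])   # gravity vector in the (g, 0, 0) convention above

def estimate_offset(samples_at_rest: np.ndarray) -> np.ndarray:
    """Delta = A - G, with A averaged over rest samples to suppress noise."""
    A = samples_at_rest.mean(axis=0)
    return A - G

def calibrate(a_t: np.ndarray, delta: np.ndarray) -> np.ndarray:
    """Calibrated accelerometer sample at time t: A'_t = A_t - Delta."""
    return a_t - delta
```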
Magnetometer calibration must consider whether an additional local magnetic field is present. If not, the magnetometer is calibrated with a planar calibration method or a figure-eight calibration method; in an environment containing an additional local magnetic field, calibration proceeds as follows:
S201: rotate the device horizontally through 360 degrees;
S202: find the minimum outputs X_min, Y_min and the maximum outputs X_max, Y_max of the magnetometer's horizontal-axis data;
S203: set the first scale factor X_s = 1;
S204: calculate the other scale factor Y_s;
S205: calculate the offset compensation
X_b = X_s[1/2(X_max − X_min) − X_max]
Y_b = Y_s[1/2(Y_max − Y_min) − Y_max]
S206: obtain the output
X_out = X_in · X_s + X_b
Y_out = Y_in · Y_s + Y_b
where X_out, Y_out are the calibrated magnetometer data, X_in, Y_in are the raw magnetometer data, and the Z axis does not need calibration when this method is used.
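Steps S201-S206 condense into a short routine. In the sketch below, the formula for Y_s is not legible in the source, so the standard min/max form Y_s = (X_max − X_min)/(Y_max − Y_min) is assumed; treat that line as an assumption rather than the patent's own expression.

```python
def magnetometer_calibration(x_min, x_max, y_min, y_max):
    """Build the calibration from the extrema recorded in S201-S202."""
    xs = 1.0                                   # S203: first scale factor
    ys = (x_max - x_min) / (y_max - y_min)     # S204: assumed standard min/max form
    xb = xs * (0.5 * (x_max - x_min) - x_max)  # S205: offset compensation
    yb = ys * (0.5 * (y_max - y_min) - y_max)
    def apply(x_in, y_in):                     # S206: calibrated output
        return x_in * xs + xb, y_in * ys + yb
    return apply
```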
In this embodiment, an IMU fusion algorithm is added on top of SLAM, so that when SLAM has not yet produced a result, the IMU supplies a result close to the true attitude; the user's pose is thus computed accurately while meeting the computation speed required for rendering at 60 FPS or more.
Explanation of related technical terms
IMU: inertial measurement unit, a device that measures the three-axis attitude angles (or angular velocities) and accelerations of an object. An IMU generally contains three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the acceleration signals along three independent axes of the carrier coordinate system, the gyroscopes detect the angular-velocity signals of the carrier relative to the navigation coordinate system, and from the measured angular velocities and accelerations in three-dimensional space the attitude of the object is solved.
Three-degree-of-freedom positioning: a positioning mode that describes an object's attitude by three rotation angles; it cannot fully reproduce the object's state in space.
Six-degree-of-freedom positioning: builds on the three degrees of freedom by adding the object's displacement along three directions, so the object's state in space can be fully reproduced.
SLAM: simultaneous localization and mapping. The SLAM problem can be described as a robot moving from an unknown position in an unknown environment, localizing itself during motion from position estimates and a map while building an incremental map from that self-localization, thereby achieving autonomous localization and navigation.
EKF: extended Kalman filter. The Kalman filter is an efficient recursive filter (autoregressive filter) that estimates the state of a dynamic system from a series of incomplete and noisy measurements. The extended Kalman filter introduces Jacobian matrices on top of the Kalman filter to estimate nonlinear models.
VR: Virtual Reality, a computer simulation system that can create, and let the user experience, a virtual world; it uses a computer to generate a simulated environment, an interactive three-dimensional dynamic view with physical behavior fused from multi-source information, in which the user is immersed.
AR: Augmented Reality, a technology that computes the position and angle of the camera image in real time and overlays the corresponding imagery; its goal is to fit a virtual world over the real world on screen and allow interaction.
MR: Mixed Reality, a further development of virtual reality technology that introduces real-scene information into the virtual environment, building an interactive feedback loop among the virtual world, the real world, and the user to enhance the realism of the user experience.
SO3 group: the three-dimensional rotation group; its members are 3 × 3 matrices, each representing one rotation of three-dimensional Euclidean space, and the product of two members also belongs to the SO(3) group.
RGBD camera: comprises an RGB camera and a depth camera. The RGB camera is an ordinary camera (RGB meaning the image carries the three colors red, green, and blue, i.e., an ordinary color image); the depth camera returns depth information, the value of each pixel being the distance between the camera and the object at that pixel.
Binocular camera: consists of two RGB cameras fixed in advance at a known angle and distance; through image processing, distance information can be reconstructed from this known geometry, achieving the same effect as an RGBD camera.
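For a rectified binocular pair this reconstruction follows the standard triangulation relation Z = f·B/d, where f is the focal length, B is the baseline between the two cameras, and d is the disparity of a matched point between the two images; this is a textbook relation added here for context rather than taken from the patent.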
Monocular camera: has only one RGB camera; it can localize to some extent, but lacking scale information it can only give relative results.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.
Claims (10)
1. A real-time six-degree-of-freedom VR/AR/MR device positioning method, characterized in that a six-degree-of-freedom positioning scheme first calculates the pose Pose(t) of a camera at time t, and Pose(t) is applied directly when a frame needs rendering; because the six-degree-of-freedom computation is complex, the interval between the pose Pose(t+1) at the next time t+1 and time t is long, and when an intermediate frame needs rendering, Pose(t) is fused once with the IMU rotation matrix R_IMU and the result is applied directly to rendering, meeting the real-time rendering requirement;

wherein the fusion result of Pose(t) and the IMU, Pose_IMU(t)^T, is calculated according to the following formula:

Pose_IMU(t)^T = R_IMU * Pose(t)^T

where R_IMU is the rotation matrix calculated by the IMU.
2. The real-time six-degree-of-freedom VR/AR/MR device positioning method of claim 1, wherein the six-degree-of-freedom positioning scheme is calculated using a SLAM algorithm or a PTAM algorithm.
3. The real-time six-degree-of-freedom VR/AR/MR device positioning method of claim 2, wherein the calculation steps of the SLAM algorithm are as follows:
S101: camera calibration, aimed at obtaining the intrinsic matrix of the camera:

K = [ fx  s   x0 ]
    [ 0   fy  y0 ]
    [ 0   0   1  ]

where fx and fy are the focal lengths of the camera, (x0, y0) are the principal-point coordinates, and s is the axis-skew parameter;
S102: positioning; after the camera is calibrated, the camera pose is solved from the pictures the camera captures;
S103: map reconstruction, converting the pictures captured by the camera into point clouds and re-representing the surrounding environment as point clouds through point-cloud stitching;
S104: loop detection is carried out, and the camera pose is corrected;
The resulting camera pose matrix Pose(t) is as follows:

Pose(t) = [ R_3×3  T_3×1 ]
          [ 0      1     ]

where R_3×3 is a rotation matrix belonging to the SO(3) group, representing the three rotational quantities among the six degrees of freedom, and T_3×1 is a column vector whose elements are the camera's displacements along the three axes.
4. The real-time six-degree-of-freedom VR/AR/MR device positioning method of claim 3, wherein the camera calibration in step S101 comprises: linear calibration, nonlinear optimization calibration, or two-step calibration.
5. The real-time six-degree-of-freedom VR/AR/MR device positioning method of claim 3, wherein the method for solving the camera pose in step S102 is bundle adjustment.
6. The real-time six-degree-of-freedom VR/AR/MR device positioning method of claim 1, wherein the IMU rotation matrix R_IMU is calculated by

R_IMU = α·R_gyro + β·R_acc + γ·R_mag

where α + β + γ = 1, and R_gyro, R_acc, R_mag are the attitude matrices calculated from the gyroscope, the accelerometer, and the magnetometer respectively.
7. The real-time six-degree-of-freedom VR/AR/MR device positioning method of claim 6, wherein the values of α, β, γ are calculated by an EKF algorithm, a linear KF algorithm, or a Mahony filter.
8. The real-time six-degree-of-freedom VR/AR/MR device positioning method of claim 6, wherein the gyroscope is calibrated by: averaging over multiple samples; or first collecting the gyroscope's offset at rest and removing it when data are reported.
9. The real-time six-degree-of-freedom VR/AR/MR device positioning method of claim 6, wherein the accelerometer uses the direction of gravitational acceleration at rest to eliminate interference, as follows:

A = (a_x, a_y, a_z)
G = (g, 0, 0)
Δ = A − G
10. The real-time six-degree-of-freedom VR/AR/MR device positioning method of claim 6, wherein the magnetometer is calibrated using a planar calibration method or a figure-eight calibration method; in an environment containing an additional local magnetic field, calibration is performed as follows:
S201: rotate the device horizontally through 360 degrees;
S202: find the minimum outputs X_min, Y_min and the maximum outputs X_max, Y_max of the magnetometer's horizontal-axis data;
S203: set the first scale factor X_s = 1;
S204: calculate the other scale factor Y_s;
S205: calculate the offset compensation
X_b = X_s[1/2(X_max − X_min) − X_max]
Y_b = Y_s[1/2(Y_max − Y_min) − Y_max]
S206: obtain the output
X_out = X_in · X_s + X_b
Y_out = Y_in · Y_s + Y_b
where X_out, Y_out are the calibrated magnetometer data, X_in, Y_in are the raw magnetometer data, and the Z axis does not need calibration when this method is used.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710812753.6 | 2017-09-11 | ||
CN201710812753 | 2017-09-11 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108170297A CN108170297A (en) | 2018-06-15 |
CN108170297B true CN108170297B (en) | 2021-11-16 |
Family
ID=62523406
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711399552.4A Active CN108170297B (en) | 2017-09-11 | 2017-12-22 | Real-time six-degree-of-freedom VR/AR/MR device positioning method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108170297B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109613983A (en) * | 2018-12-26 | 2019-04-12 | 青岛小鸟看看科技有限公司 | It wears the localization method of handle in display system, device and wears display system |
CN109920000B (en) * | 2019-03-04 | 2020-11-03 | 杭州师范大学 | Multi-camera cooperation-based dead-corner-free augmented reality method |
CN110276774B (en) * | 2019-06-26 | 2021-07-23 | Oppo广东移动通信有限公司 | Object drawing method, device, terminal and computer-readable storage medium |
CN110672212A (en) * | 2019-09-19 | 2020-01-10 | 珠海格力电器股份有限公司 | Spatial three-dimensional temperature field detection method, computer readable storage medium and intelligent household appliance |
CN111580650A (en) * | 2020-04-29 | 2020-08-25 | 南京睿悦信息技术有限公司 | Low-delay pose fusion method for mobile terminal sensor and external positioning system |
CN111935396A (en) * | 2020-07-01 | 2020-11-13 | 青岛小鸟看看科技有限公司 | 6DoF data processing method and device of VR (virtual reality) all-in-one machine |
CN113029053B (en) * | 2021-04-06 | 2022-05-13 | 中国科学技术大学 | Universal CT countershaft method |
CN113961068B (en) * | 2021-09-29 | 2023-01-06 | 北京理工大学 | Close-range real object eye movement interaction method based on augmented reality helmet |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105607737A (en) * | 2015-12-18 | 2016-05-25 | 北京诺亦腾科技有限公司 | Positioning method for physical prop of virtual reality system, and virtual reality system |
CN106095113A (en) * | 2016-06-27 | 2016-11-09 | 南京睿悦信息技术有限公司 | The measuring and calculating of user's attitude and the virtual reality follow-up method that a kind of nine axle sensors merge |
CN107025668A (en) * | 2017-03-30 | 2017-08-08 | 华南理工大学 | A kind of design method of the visual odometry based on depth camera |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9402070B2 (en) * | 2014-06-12 | 2016-07-26 | Faro Technologies, Inc. | Coordinate measuring device with a six degree-of-freedom handheld probe and integrated camera for augmented reality |
- 2017-12-22: CN application CN201711399552.4A filed; granted as CN108170297B (status: active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105607737A (en) * | 2015-12-18 | 2016-05-25 | 北京诺亦腾科技有限公司 | Positioning method for physical prop of virtual reality system, and virtual reality system |
CN106095113A (en) * | 2016-06-27 | 2016-11-09 | 南京睿悦信息技术有限公司 | The measuring and calculating of user's attitude and the virtual reality follow-up method that a kind of nine axle sensors merge |
CN107025668A (en) * | 2017-03-30 | 2017-08-08 | 华南理工大学 | A kind of design method of the visual odometry based on depth camera |
Also Published As
Publication number | Publication date |
---|---|
CN108170297A (en) | 2018-06-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108170297B (en) | Real-time six-degree-of-freedom VR/AR/MR device positioning method | |
CN111156998B (en) | Mobile robot positioning method based on RGB-D camera and IMU information fusion | |
CN107747941B (en) | Binocular vision positioning method, device and system | |
US10026233B2 (en) | Efficient orientation estimation system using magnetic, angular rate, and gravity sensors | |
Weiss et al. | Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments | |
CN109540126A (en) | A kind of inertia visual combination air navigation aid based on optical flow method | |
CN103278177B (en) | Calibration method of inertial measurement unit based on camera network measurement | |
CN104658012A (en) | Motion capture method based on inertia and optical measurement fusion | |
CN110533719B (en) | Augmented reality positioning method and device based on environment visual feature point identification technology | |
CN108846857A (en) | The measurement method and visual odometry of visual odometry | |
JP2002259992A (en) | Image processor and its method as well as program code and storage medium | |
Oskiper et al. | Augmented reality binoculars | |
Gomez-Jauregui et al. | Quantitative evaluation of overlaying discrepancies in mobile augmented reality applications for AEC/FM | |
CN112116651B (en) | Ground target positioning method and system based on monocular vision of unmanned aerial vehicle | |
JP4195382B2 (en) | Tracking system expandable with automatic line calibration | |
CN110388919B (en) | Three-dimensional model positioning method based on feature map and inertial measurement in augmented reality | |
CN109186596B (en) | IMU measurement data generation method, system, computer device and readable storage medium | |
CN110455301A (en) | A kind of dynamic scene SLAM method based on Inertial Measurement Unit | |
JP2010020729A (en) | Vehicle traveling locus observation system, vehicle traveling locus observation method and program | |
US8509522B2 (en) | Camera translation using rotation from device | |
CN113034571B (en) | Object three-dimensional size measuring method based on vision-inertia | |
Karam et al. | Integrating a low-cost mems imu into a laser-based slam for indoor mobile mapping | |
Satoh et al. | A head tracking method using bird's-eye view camera and gyroscope | |
Choi et al. | Position-based augmented reality platform for aiding construction and inspection of offshore plants | |
CN110503684A (en) | Camera position and orientation estimation method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |