
CN114384515A - Positioning and orientation method based on multi-aperture imaging - Google Patents

Positioning and orientation method based on multi-aperture imaging Download PDF

Info

Publication number
CN114384515A
Authority
CN
China
Prior art keywords
aperture imaging
camera
imaging system
point
positioning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210026337.4A
Other languages
Chinese (zh)
Inventor
于起峰
尚洋
李彬
关棒磊
李璋
梁顺坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology filed Critical National University of Defense Technology
Priority to CN202210026337.4A priority Critical patent/CN114384515A/en
Publication of CN114384515A publication Critical patent/CN114384515A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S13/9021SAR image post-processing techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a positioning and orientation method based on multi-aperture imaging, in which at least three cameras form a multi-aperture imaging system, the cameras have no common imaging area, and the system is equivalent to large-field-of-view, high-resolution joint imaging. A 2D-3D correspondence set is established; based on pre-calibrated camera intrinsic and extrinsic parameters, equivalent monocular pose estimation is performed and the PnP problem is solved to obtain initial position and attitude values of the multi-aperture imaging system; an observation model is then constructed based on the multi-aperture imaging relationship, and the initial position and attitude values are optimized to obtain a high-precision position and attitude of the multi-aperture imaging system. The invention belongs to the technical fields of photogrammetry and computer vision. It uses multi-aperture imaging as the equivalent of ultra-high-resolution imaging over an ultra-large field of view and, by establishing an observation model based on the multi-aperture imaging relationship and optimizing to obtain a high-precision position and attitude, breaks through hardware limitations and achieves extremely high-precision positioning and orientation.

Description

Positioning and orientation method based on multi-aperture imaging
Technical Field
The invention relates to the technical field of photogrammetry and computer vision, in particular to a positioning and orienting method based on multi-aperture imaging.
Background
At present, positioning and orientation technologies mainly include inertial navigation, visual navigation, and the like. Although inertial navigation offers good autonomy and does not depend on external information, high-precision inertial navigation systems are heavy, bulky, and expensive, and their errors diverge over time owing to the limitations of the mechanism. Visual navigation suffers neither time drift nor error accumulation and has therefore attracted considerable attention as an alternative. Existing visual navigation methods are varied; generally, a monocular, binocular, or depth camera is used to solve the perspective-n-point (PnP) problem to achieve positioning and orientation, but the measurement accuracy still needs to be improved. High-precision positioning and orientation thus remains a difficult problem awaiting an urgent solution.
High-precision positioning and orientation require an imaging system with both ultra-high object-plane resolution and an ultra-large field of view, and under existing hardware conditions a single camera cannot meet both requirements simultaneously.
Disclosure of Invention
To address the above deficiencies of the prior art, the invention provides a positioning and orientation method based on multi-aperture imaging, which uses multi-aperture imaging as the equivalent of ultra-high-resolution imaging over an ultra-large field of view, thereby breaking through hardware limitations and achieving extremely high-precision positioning and orientation.
To achieve the above object, the present invention provides a positioning and orientation method based on multi-aperture imaging, in which at least three cameras form a multi-aperture imaging system, the cameras have no common imaging area, and the system is equivalent to large-field-of-view, high-resolution joint imaging. The positioning and orientation method specifically comprises the following steps:
Step 1, obtaining accurate image coordinates and reference-space three-dimensional coordinates of feature points based on scene matching, and establishing a 2D-3D correspondence set;
Step 2, computing normalized image coordinates on the joint imaging based on pre-calibrated camera intrinsic and extrinsic parameters, performing equivalent monocular pose estimation by solving the PnP problem using the normalized image coordinates and the spatial three-dimensional coordinates of the feature points, and obtaining initial position and attitude values of the multi-aperture imaging system;
Step 3, constructing an observation model based on the multi-aperture imaging relationship, and optimizing the initial position and attitude values of the multi-aperture imaging system to obtain a high-precision position and attitude of the multi-aperture imaging system.
Further, in step 3, the observation model is:
$$\lambda_j^i \begin{bmatrix} \mathbf{x}_j^i \\ 1 \end{bmatrix} = K_i \begin{bmatrix} R_i & t_i \end{bmatrix} \begin{bmatrix} R & t \\ \mathbf{0}^\top & 1 \end{bmatrix} \begin{bmatrix} \mathbf{X}_j^i \\ 1 \end{bmatrix}$$

where $\mathbf{x}_j^i$ is the image coordinate of the jth feature point on the ith camera, $\mathbf{X}_j^i$ is the reference-space three-dimensional coordinate of the jth feature point on the ith camera, $\lambda_j^i$ is the corresponding projective depth, $K_i$ is the intrinsic matrix of the ith camera, $R_i$ is the extrinsic rotation matrix of the ith camera, $t_i$ is the extrinsic translation vector of the ith camera, and $R$, $t$ are the position and attitude of the multi-aperture imaging system.
Further, in step 3, the optimization of the initial position and attitude values of the multi-aperture imaging system specifically comprises:
optimizing the initial position and attitude values of the multi-aperture imaging system by taking as the objective function the minimization of the deviation between the reprojection points of the space points on the image and the actual image points.
Further, the objective function is:
$$\min_{R,\,t} \sum_i \sum_j \left\| \mathbf{x}_j^i - \hat{\mathbf{x}}_j^i \right\|^2$$

where $\mathbf{x}_j^i$ is the actual image point of the jth feature point on the ith camera and $\hat{\mathbf{x}}_j^i$ is the reprojection point of the jth feature point on the ith camera.
Further, in step 2, the normalized image coordinates of the j-th feature point on the i-th camera on the joint imaging are:
$$\mathbf{x}'^{\,i}_j \simeq R_i^{\top} K_i^{-1}\, \mathbf{x}_j^i$$

where $\mathbf{x}'^{\,i}_j$ is the normalized coordinate of the jth feature point of the ith camera, scaled so that its third component equals 1.
The positioning and orientation method based on multi-aperture imaging provided by the invention uses multi-aperture imaging as the equivalent of ultra-high-resolution imaging over an ultra-large field of view, obtains a high-precision position and attitude by establishing and optimizing an observation model based on the multi-aperture imaging relationship, breaks through hardware limitations, and achieves extremely high-precision positioning and orientation.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions of the prior art more clearly, the drawings required for the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of large field of view, high resolution joint imaging in an embodiment of the invention;
FIG. 2 is a schematic diagram of a multi-aperture imaging system in an embodiment of the invention;
fig. 3 is a flowchart of a positioning and orientation method based on multi-aperture imaging according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that all directional indicators (such as up, down, left, right, front, and rear) in the embodiments of the present invention are only used to explain the relative positional relationships, movements, and the like between components in a particular posture (as shown in the drawings); if that particular posture changes, the directional indicators change accordingly.
In addition, the descriptions related to "first", "second", etc. in the present invention are only for descriptive purposes and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "connected," "secured," and the like are to be construed broadly; for example, "secured" may mean a fixed connection, a removable connection, or an integral formation; the connection may be mechanical, electrical, physical, or a wireless communication connection; it may be direct, or indirect through an intervening medium, or it may be an internal communication between two elements or an interaction between two elements, unless expressly stated otherwise. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
In addition, the technical solutions of the embodiments of the present invention may be combined with each other, provided that such combinations can be realized by those skilled in the art; when technical solutions are contradictory or cannot be realized, the combination shall be deemed not to exist and falls outside the protection scope of the present invention.
The embodiment discloses a positioning and orientation method based on multi-aperture imaging, in which at least three cameras, splayed apart from one another at large angles, form a multi-aperture imaging system; the cameras have no common imaging area, which is equivalent to the large-field-of-view, high-resolution joint imaging shown in fig. 1. Referring to fig. 2, in the present embodiment the multi-aperture imaging system is formed by five cameras: the field of view of camera No. 1 points vertically downward, and the optical axes of cameras No. 2, 3, 4, and 5 each make an angle α with the optical axis of camera No. 1. It should be noted that the multi-aperture camera system does not necessarily require one camera to be oriented vertically downward; the five-camera arrangement of this embodiment is merely one relatively complete configuration, and pointing camera No. 1 straight down is only one possible choice.
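By way of illustration, a minimal sketch of such a rig geometry in Python, assuming (hypothetically) a tilt angle α of 30° and that cameras No. 2-5 are tilted about the two horizontal axes of the system frame; the actual angle and mounting directions are design choices not fixed by this embodiment:

```python
import numpy as np
from scipy.spatial.transform import Rotation

alpha = np.deg2rad(30.0)  # assumed tilt angle between the side cameras and camera No. 1

# Rotation from the multi-aperture system frame to each camera frame.
# Camera No. 1 looks straight down (identity); cameras No. 2-5 are tilted by alpha.
R_rig_to_cam = [
    np.eye(3),                                              # camera 1
    Rotation.from_rotvec([ alpha, 0.0, 0.0]).as_matrix(),   # camera 2 (assumed axis)
    Rotation.from_rotvec([-alpha, 0.0, 0.0]).as_matrix(),   # camera 3 (assumed axis)
    Rotation.from_rotvec([0.0,  alpha, 0.0]).as_matrix(),   # camera 4 (assumed axis)
    Rotation.from_rotvec([0.0, -alpha, 0.0]).as_matrix(),   # camera 5 (assumed axis)
]
```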
Referring to fig. 3, the positioning and orienting method in this embodiment specifically includes the following steps:
Step 1, obtaining accurate image coordinates and reference-space three-dimensional coordinates of feature points based on scene matching, and establishing a 2D-3D correspondence set;
Step 2 is based on the pre-calibrated camera intrinsic parameters K1, K2, K3, K4, K5 and the inter-camera extrinsic parameters R1, t1, R2, t2, R3, t3, R4, t4, R5, t5, where $K_i$ is the intrinsic matrix of the ith camera, $R_i$ is the extrinsic rotation matrix of the ith camera, and $t_i$ is the extrinsic translation vector of the ith camera, with

$$K_i = \begin{bmatrix} F_x & 0 & C_x \\ 0 & F_y & C_y \\ 0 & 0 & 1 \end{bmatrix}$$

where $C_x$, $C_y$ are the coordinates of the image principal point and $F_x$, $F_y$ are the equivalent focal lengths in the horizontal and vertical directions. The normalized image coordinates of the feature points on the joint imaging are then computed from the calibrated camera intrinsic and extrinsic parameters, the equivalent monocular pose is estimated from the normalized image coordinates and the spatial three-dimensional coordinates of the feature points, and the PnP problem is solved to obtain the initial position and attitude values of the multi-aperture imaging system. Solving the PnP problem is a conventional technique in the art and is therefore not described in detail in this embodiment.
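A minimal sketch of this step, assuming OpenCV for the PnP solution; the small inter-camera translations $t_i$ are neglected when forming the equivalent monocular rays, which is an assumption made here only for the initial value (it is refined in step 3), and the normalization follows the derivation given later below:

```python
import cv2
import numpy as np

def initial_pose(K_list, R_list, matches):
    """matches: list of (camera index i, observed pixel [u, v], reference-space 3-D point X)."""
    rays, points = [], []
    for i, uv, X in matches:
        x = np.array([uv[0], uv[1], 1.0])
        # Normalized image coordinate on the joint imaging (see the derivation below);
        # the inter-camera translation t_i is neglected for this initial estimate.
        ray = R_list[i].T @ np.linalg.inv(K_list[i]) @ x
        rays.append(ray[:2] / ray[2])
        points.append(X)
    # Solve PnP with an identity camera matrix, since the rays are already normalized.
    ok, rvec, tvec = cv2.solvePnP(np.asarray(points, dtype=np.float64),
                                  np.asarray(rays, dtype=np.float64),
                                  np.eye(3), None, flags=cv2.SOLVEPNP_EPNP)
    R0, _ = cv2.Rodrigues(rvec)
    return R0, tvec.ravel()   # initial position and attitude of the multi-aperture system
```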
Step 3, constructing an observation model based on the multi-aperture imaging relationship, and optimizing the initial position and attitude values of the multi-aperture imaging system to obtain a high-precision position and attitude of the multi-aperture imaging system. The observation model is:

$$\lambda_j^i \begin{bmatrix} \mathbf{x}_j^i \\ 1 \end{bmatrix} = K_i \begin{bmatrix} R_i & t_i \end{bmatrix} \begin{bmatrix} R & t \\ \mathbf{0}^\top & 1 \end{bmatrix} \begin{bmatrix} \mathbf{X}_j^i \\ 1 \end{bmatrix}$$

where $\mathbf{x}_j^i$ is the image coordinate of the jth feature point on the ith camera, $\mathbf{X}_j^i$ is the reference-space three-dimensional coordinate of the jth feature point on the ith camera, $\lambda_j^i$ is the corresponding projective depth, and $R$, $t$ are the position and attitude of the multi-aperture imaging system. In the specific implementation, the optimization of the initial position and attitude values of the multi-aperture imaging system proceeds as follows:
and optimizing the position and attitude initial values of the multi-aperture imaging system by taking the deviation between the reprojection point of the minimized space point on the joint imaging and the actual image point as a target function. Wherein the objective function is:
Figure BDA0003464837690000045
in the formula (I), the compound is shown in the specification,
Figure BDA0003464837690000046
the actual image point of the jth characteristic point on the ith camera is taken as the image point.
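A minimal sketch of this refinement, assuming SciPy's least-squares optimizer and the convention used above that $R_i$, $t_i$ map system-frame coordinates into camera-i coordinates (an assumption about the calibration convention); the residual simply evaluates the observation model and accumulates the reprojection errors:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(K_i, R_i, t_i, R, t, X):
    """Reproject a reference-space point X through the system pose (R, t) and camera i."""
    Xc = R_i @ (R @ X + t) + t_i       # point expressed in camera-i coordinates
    x = K_i @ Xc
    return x[:2] / x[2]                # predicted pixel (u, v)

def refine_pose(K_list, R_list, t_list, matches, R0, t0):
    """matches: list of (camera index i, observed pixel [u, v], reference-space 3-D point X)."""
    def residuals(p):
        R = Rotation.from_rotvec(p[:3]).as_matrix()
        t = p[3:]
        res = []
        for i, uv, X in matches:
            res.extend(project(K_list[i], R_list[i], t_list[i], R, t, X) - np.asarray(uv))
        return np.asarray(res)

    p0 = np.hstack([Rotation.from_matrix(R0).as_rotvec(), np.asarray(t0, dtype=np.float64)])
    sol = least_squares(residuals, p0)  # minimizes the summed squared reprojection error
    return Rotation.from_rotvec(sol.x[:3]).as_matrix(), sol.x[3:]
```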
In this embodiment, the normalized image coordinates of the jth feature point on the ith camera on the joint imaging are obtained from the pre-calibrated camera intrinsic and extrinsic parameters as follows.
Let $\mathbf{x} = [u\ \ v\ \ 1]^\top$ be the image coordinate of a feature point on the camera image and $\mathbf{x}' = [u'\ \ v'\ \ 1]^\top$ be its normalized image coordinate on the joint imaging. Then

$$\lambda\, \mathbf{x}' = R_i^{\top} K_i^{-1}\, \mathbf{x}$$

where $\lambda$ is a scale factor, and thus

$$\mathbf{x}'^{\,i}_j \simeq R_i^{\top} K_i^{-1}\, \mathbf{x}_j^i$$

where $\mathbf{x}'^{\,i}_j$ is the normalized coordinate of the jth feature point of the ith camera, scaled so that its third component equals 1.
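For concreteness, a small numeric illustration of this normalization under hypothetical calibration values (the intrinsics, tilt angle, and pixel below are chosen purely for the example):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical calibration of camera i (illustrative values only).
K_i = np.array([[1500.0,    0.0, 960.0],
                [   0.0, 1500.0, 540.0],
                [   0.0,    0.0,   1.0]])
R_i = Rotation.from_rotvec([np.deg2rad(30.0), 0.0, 0.0]).as_matrix()  # assumed 30-degree tilt

x = np.array([1200.0, 700.0, 1.0])      # pixel on camera i, homogeneous
ray = R_i.T @ np.linalg.inv(K_i) @ x    # back-project and rotate into the system frame
x_prime = ray / ray[2]                  # normalized coordinate [u', v', 1] on the joint imaging
print(x_prime)
```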
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention, and all modifications and equivalents of the present invention, which are made by the contents of the present specification and the accompanying drawings, or directly/indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (5)

1. A positioning and orientation method based on multi-aperture imaging, characterized in that at least three cameras form a multi-aperture imaging system, the cameras have no common imaging area, and the system is equivalent to large-field-of-view, high-resolution joint imaging, the positioning and orientation method specifically comprising the following steps:
step 1, obtaining accurate image coordinates and reference-space three-dimensional coordinates of feature points based on scene matching, and establishing a 2D-3D correspondence set;
step 2, computing normalized image coordinates on the joint imaging based on pre-calibrated camera intrinsic and extrinsic parameters, performing equivalent monocular pose estimation by solving the PnP problem using the normalized image coordinates and the spatial three-dimensional coordinates of the feature points, and obtaining initial position and attitude values of the multi-aperture imaging system; and
step 3, constructing an observation model based on the multi-aperture imaging relationship, and optimizing the initial position and attitude values of the multi-aperture imaging system to obtain a high-precision position and attitude of the multi-aperture imaging system.
2. The multi-aperture imaging based positioning and orientation method according to claim 1, wherein in step 3, the observation model is:
$$\lambda_j^i \begin{bmatrix} \mathbf{x}_j^i \\ 1 \end{bmatrix} = K_i \begin{bmatrix} R_i & t_i \end{bmatrix} \begin{bmatrix} R & t \\ \mathbf{0}^\top & 1 \end{bmatrix} \begin{bmatrix} \mathbf{X}_j^i \\ 1 \end{bmatrix}$$

where $\mathbf{x}_j^i$ is the image coordinate of the jth feature point on the ith camera, $\mathbf{X}_j^i$ is the reference-space three-dimensional coordinate of the jth feature point on the ith camera, $\lambda_j^i$ is the corresponding projective depth, $K_i$ is the intrinsic matrix of the ith camera, $R_i$ is the extrinsic rotation matrix of the ith camera, $t_i$ is the extrinsic translation vector of the ith camera, and $R$, $t$ are the position and attitude of the multi-aperture imaging system.
3. The multi-aperture imaging-based positioning and orientation method according to claim 2, wherein in step 3, the optimization of the initial position and attitude values of the multi-aperture imaging system specifically comprises:
optimizing the initial position and attitude values of the multi-aperture imaging system by taking as the objective function the minimization of the deviation between the reprojection points of the space points on the image and the actual image points.
4. The multi-aperture imaging based localization and orientation method according to claim 3, wherein the objective function is:
$$\min_{R,\,t} \sum_i \sum_j \left\| \mathbf{x}_j^i - \hat{\mathbf{x}}_j^i \right\|^2$$

where $\mathbf{x}_j^i$ is the actual image point of the jth feature point on the ith camera and $\hat{\mathbf{x}}_j^i$ is the reprojection point of the jth feature point on the ith camera.
5. The multi-aperture imaging based positioning and orientation method according to claim 1, wherein in step 2, the normalized image coordinates of the jth feature point on the ith camera on the joint imaging are:
$$\mathbf{x}'^{\,i}_j \simeq R_i^{\top} K_i^{-1}\, \mathbf{x}_j^i$$

where $\mathbf{x}'^{\,i}_j$ is the normalized coordinate of the jth feature point of the ith camera, scaled so that its third component equals 1.
CN202210026337.4A 2022-01-11 2022-01-11 Positioning and orientation method based on multi-aperture imaging Pending CN114384515A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210026337.4A CN114384515A (en) 2022-01-11 2022-01-11 Positioning and orientation method based on multi-aperture imaging

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210026337.4A CN114384515A (en) 2022-01-11 2022-01-11 Positioning and orientation method based on multi-aperture imaging

Publications (1)

Publication Number Publication Date
CN114384515A true CN114384515A (en) 2022-04-22

Family

ID=81202683

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210026337.4A Pending CN114384515A (en) 2022-01-11 2022-01-11 Positioning and orientation method based on multi-aperture imaging

Country Status (1)

Country Link
CN (1) CN114384515A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118393592A (en) * 2024-06-27 2024-07-26 中国人民解放军国防科技大学 Multi-aperture detection target joint positioning method, device, equipment and medium
CN118393592B (en) * 2024-06-27 2024-08-20 中国人民解放军国防科技大学 Multi-aperture detection target joint positioning method, device, equipment and medium

Similar Documents

Publication Publication Date Title
JP6573419B1 (en) Positioning method, robot and computer storage medium
CN108012325A (en) A kind of navigation locating method based on UWB and binocular vision
JPH10221072A (en) System and method for photogrammetry
CN110411457B (en) Positioning method, system, terminal and storage medium based on stroke perception and vision fusion
WO2020133172A1 (en) Image processing method, apparatus, and computer readable storage medium
CN107481288A (en) The inside and outside ginseng of binocular camera determines method and apparatus
CN106157322B (en) A kind of camera installation site scaling method based on plane mirror
CN111123242A (en) Combined calibration method based on laser radar and camera and computer readable storage medium
CN111207688B (en) Method and device for measuring distance of target object in vehicle and vehicle
WO2020181409A1 (en) Capture device parameter calibration method, apparatus, and storage medium
JP2006234703A (en) Image processing device, three-dimensional measuring device, and program for image processing device
CN111462241B (en) Target positioning method based on monocular vision
CN113947638A (en) Image orthorectification method for fisheye camera
CN114384515A (en) Positioning and orientation method based on multi-aperture imaging
CN112857328B (en) Calibration-free photogrammetry method
CN114926538A (en) External parameter calibration method and device for monocular laser speckle projection system
JPH11514434A (en) Method and apparatus for determining camera position and orientation using image data
CN113295159A (en) Positioning method and device for end cloud integration and computer readable storage medium
CN113034565A (en) Monocular structured light depth calculation method and system
CN109712200B (en) Binocular positioning method and system based on least square principle and side length reckoning
CN111563936A (en) Camera external parameter automatic calibration method and automobile data recorder
WO2020024150A1 (en) Map processing method, apparatus, and computer readable storage medium
CN216116064U (en) Pose calibration system of heading machine
CN113324538B (en) Cooperative target remote high-precision six-degree-of-freedom pose measurement method
CN113405532B (en) Forward intersection measuring method and system based on structural parameters of vision system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination