
CN118365713B - Dynamic calibration method for random moving multi-camera - Google Patents

Dynamic calibration method for random moving multi-camera

Info

Publication number
CN118365713B
CN118365713B (application CN202410510536.1A)
Authority
CN
China
Prior art keywords
calibration
camera
determining
video stream
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410510536.1A
Other languages
Chinese (zh)
Other versions
CN118365713A (en)
Inventor
艾得闻
圣洁
陈松灵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Anji Wuzhi Technology Co ltd
Original Assignee
Zhejiang Anji Wuzhi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Anji Wuzhi Technology Co ltd filed Critical Zhejiang Anji Wuzhi Technology Co ltd
Priority to CN202410510536.1A
Publication of CN118365713A
Application granted
Publication of CN118365713B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a dynamic calibration method for randomly moving multiple cameras, relating to the technical field of camera calibration. The method comprises: determining the spatial distribution characteristics of a camera array and a calibration plate array, capturing the calibration plates from multiple spatial angles, and determining video stream data; verifying and preprocessing the video stream data to determine an effective video stream; identifying calibration plate IDs, performing homography matrix analysis and distortion compensation processing on the effective video stream, and determining a homography matrix; analyzing the homography relation based on the homography matrix, determining a target calibration result in combination with the relative characteristics of the camera array, integrating a calibration thinking chain, and presenting the result visually on an interactive interface. The invention solves the technical problem that conventional calibration methods require a specific environment, a fixed calibration plate and the like, which limits the flexibility, accuracy and efficiency of the calibration process, and achieves the technical effect of improving that flexibility, accuracy and efficiency through dynamic calibration.

Description

Dynamic calibration method for random moving multi-camera
Technical Field
The invention relates to the technical field of camera calibration, and in particular to a dynamic calibration method for randomly moving multiple cameras.
Background
With the development of computer vision technology, multi-camera systems have been widely applied in various settings because they can provide rich three-dimensional information. To ensure the accuracy of such a system, the cameras need to be calibrated to determine their internal parameters and their geometric relationship to each other. Conventional calibration methods typically require a specific calibration environment, use a fixed calibration plate, and require the cameras to remain stationary, which greatly limits the flexibility and application scenarios of the calibration process.
Disclosure of Invention
The application provides a dynamic calibration method for randomly moving multiple cameras, which is used to solve the technical problem that conventional calibration methods require a specific environment, a fixed calibration plate and the like, limiting the flexibility, accuracy and efficiency of the calibration process.
In a first aspect of the present application, there is provided a dynamic calibration method for a randomly moving multi-camera system, the method comprising: determining spatial distribution characteristics of a camera array and a calibration plate array, wherein the spatial distribution characteristics are determined based on a world coordinate system, the cameras in the array are relatively independent and move randomly in space, and the calibration plates are fixed in position and marked with unique IDs; starting a real-time calibration mode of the camera system and, in combination with the spatial distribution characteristics, capturing the calibration plates from multiple spatial angles based on the camera array to determine video stream data; verifying and preprocessing the video stream data to determine an effective video stream, wherein the effective video stream is marked with the principal points of the calibration plates; identifying calibration plate IDs and, in combination with an image processing module, performing homography matrix analysis and distortion compensation processing on the effective video stream to determine a homography matrix; analyzing the homography relation based on the homography matrix in combination with a calibration algorithm, and determining a target calibration result in combination with the relative characteristics of the camera array, wherein the relative characteristics at least comprise relative position and relative attitude characteristics; and integrating a calibration thinking chain and, in combination with the target calibration result, presenting the result visually on an interactive interface.
One or more technical solutions provided by the application have at least the following technical effects or advantages:
The application provides a dynamic calibration method for randomly moving multiple cameras, relating to the technical field of camera calibration. By determining the spatial distribution characteristics of the camera array and the calibration plate array, capturing the calibration plates from multiple spatial angles, determining an effective video stream, performing homography matrix analysis and distortion compensation processing on the effective video stream, analyzing the homography relation based on the homography matrix, determining a target calibration result in combination with the relative characteristics of the camera array, and presenting the result visually on an interactive interface, the application solves the technical problem that conventional calibration methods require a specific environment, a fixed calibration plate and the like, which limits the flexibility, accuracy and efficiency of the calibration process, and achieves the technical effect of improving that flexibility, accuracy and efficiency through dynamic calibration.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. These drawings are merely examples of the present invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic flow chart of a dynamic calibration method for a random-movement multi-camera according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of real-time correction of anomalies in a dynamic calibration process in a dynamic calibration method for a random-movement multi-camera according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of homography matrix analysis and distortion compensation processing of an effective video stream in the dynamic calibration method for a random-movement multi-camera according to an embodiment of the present application.
Detailed Description
The application provides a dynamic calibration method for randomly moving multiple cameras, which is used to solve the technical problem that conventional calibration methods require a specific environment, a fixed calibration plate and the like, limiting the flexibility, accuracy and efficiency of the calibration process.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that, the terms "first," "second," and the like in the description of the present application and the above drawings are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or server that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
As shown in fig. 1, the present application provides a dynamic calibration method for a randomly moving multi-camera, the method comprising:
P10: determining spatial distribution characteristics of a camera array and a calibration plate array, wherein the spatial distribution characteristics are determined based on a world coordinate system, the camera array is relatively independent and has random movement of spatial positions, and the calibration plate array is fixed in position and is marked with unique IDs;
Optionally, a plurality of cameras are configured and a plurality of fixed calibration plates are arranged in the scene to form a camera array and a calibration plate array. The spatial distribution characteristics of the camera array and the calibration plate array, namely their relative positional relationship in three-dimensional space, are determined based on a world coordinate system. The world coordinate system is a fixed three-dimensional coordinate system that describes the absolute position of every object in the scene; all position and attitude information about the cameras and calibration plates can be expressed in this coordinate system.
The camera array is composed of a plurality of independent cameras that can move freely relative to each other and to other objects; because the cameras are relatively independent, their spatial positions may change randomly over time or with other factors. The calibration plate array being fixed in position means that the calibration plates remain in the same place throughout, regardless of camera movement. Each calibration plate carries a unique ID so that different plates can be accurately identified and distinguished in subsequent image processing.
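The disclosure does not prescribe a data model for this layout, but the setup described above can be sketched minimally as follows; the type names (CalibrationPlate, Camera, Scene) and fields are illustrative assumptions, not terms from the patent. Each plate has a fixed pose and a unique ID in the world frame, while each camera's pose may change over time.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class CalibrationPlate:
    plate_id: int            # unique ID encoded on the plate
    pose_world: np.ndarray   # fixed 4x4 homogeneous pose of the plate in the world frame

@dataclass
class Camera:
    camera_id: int
    intrinsics: np.ndarray   # 3x3 matrix K, refined later during calibration
    pose_world: np.ndarray   # 4x4 pose; updated whenever the camera moves

@dataclass
class Scene:
    """World-frame description of the spatial distribution characteristics."""
    cameras: list = field(default_factory=list)   # freely moving Camera instances
    plates: list = field(default_factory=list)    # fixed CalibrationPlate instances
```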
P20: starting a real-time calibration mode of a camera system, combining the spatial distribution characteristics, capturing multiple spatial angles of a calibration plate based on the camera array, and determining video stream data;
Specifically, the camera system is set to a real-time calibration mode. In this mode, the capturing operation of each camera is guided by the previously determined spatial distribution characteristics, which include the relative positional relationship between the camera array and the calibration plate array and the field of view of each camera. The camera array then captures images of the calibration plates from a plurality of different angles and positions, i.e., multi-spatial-angle capture of the calibration plates; each camera captures the plates according to its own field of view and position, generating video stream data. The video stream data contains the calibration plate images captured from these angles and positions together with the camera information associated with each image, such as the camera ID, position and attitude, and serves as the basic input of the subsequent calibration algorithm.
Further, as shown in fig. 2, after the real-time calibration mode of the camera system is started, the embodiment of the present application further includes step P20a, where step P20a further includes:
P21a: determining the multiple anomaly types in the dynamic calibration process and determining anomaly contacts, wherein the anomaly contacts are determined based on critical anomaly characteristics;
P22a: mapping the multiple anomaly types to the anomaly contacts, and setting an anomaly error correction mechanism;
P23a: synchronously activating the anomaly error correction mechanism when the real-time calibration mode is started, to correct anomalies in real time during the dynamic calibration process.
Further, step P22a of the embodiment of the present application further includes:
P22-1a: the anomaly error correction mechanism comprises dynamic triggering and result triggering;
P22-2a: in the case of dynamic triggering, generating an instant adjustment instruction and performing dynamic feedback regulation over the whole calibration period;
P22-3a: in the case of result triggering, tracing and regulating the error correction points based on the anomaly causes, and compensating the results at the error correction points.
It should be appreciated that, in practical applications, the dynamic calibration process may experience various anomalies that affect the accuracy and stability of the calibration. Therefore, in the real-time calibration mode, it is very important to add an anomaly detection and correction mechanism.
Specifically, the multiple anomaly types that may occur in the dynamic calibration process are first identified and determined, including camera faults, calibration plate identification errors, data transmission errors, and so on. Meanwhile, the anomaly contacts, that is, the specific causes or conditions that lead to an anomaly, are determined based on critical anomaly characteristics such as image quality degradation and data transfer delay.
Further, the multiple anomaly types are mapped to the anomaly contacts, i.e., each anomaly type is associated with its corresponding anomaly contact, and an anomaly error correction mechanism is set according to this mapping. The mechanism contains processing strategies for the different anomaly types and is used to correct and handle anomalies promptly when they occur. When the real-time calibration mode is started, the anomaly error correction mechanism is activated synchronously; during calibration, if any anomaly is detected, the mechanism is triggered immediately to correct it, ensuring the continuity and stability of the calibration process and avoiding calibration failure or inaccurate results caused by the anomaly.
The anomaly error correction mechanism has two triggering modes, dynamic triggering and result triggering. Dynamic triggering detects anomalies in real time and corrects them immediately during the calibration process; result triggering checks the result after calibration is finished and, if an anomaly is found, traces it back and compensates for it.
Therefore, when the mechanism is dynamically triggered, an instant adjustment instruction is generated and the relevant parameters or operations in the calibration process are adjusted in real time, realizing dynamic feedback regulation over the whole period and ensuring the accuracy and stability of the calibration. When the mechanism is triggered by the result, the error correction points of the calibration result are traced back to find the specific cause of the anomaly. Regulation is then performed according to that cause, and the results at the error correction points are compensated: for example, if the acquired video stream data is abnormal, it is repaired or locally re-acquired; if the calibration result is abnormal, calibration is redone based on the traced anomaly point and the extent of its influence. This ensures the accuracy and reliability of the final calibration result.
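As an illustration only, the two triggering modes might be dispatched as in the following sketch; the anomaly types, function names and return values are hypothetical, since the disclosure specifies the mechanism's behaviour but not its implementation.

```python
from enum import Enum, auto

class AnomalyType(Enum):
    CAMERA_FAULT = auto()
    PLATE_ID_MISMATCH = auto()
    DATA_TRANSFER_DELAY = auto()

def dynamic_correction(anomaly: AnomalyType) -> str:
    """Instant adjustment issued while the calibration cycle is still running."""
    return f"adjust-now:{anomaly.name}"

def result_correction(anomaly: AnomalyType) -> str:
    """Post-hoc handling: trace the faulty step and compensate its result."""
    return f"trace-and-compensate:{anomaly.name}"

def correct(anomaly: AnomalyType, calibration_running: bool) -> str:
    # Dynamic trigger during the calibration period, result trigger afterwards.
    if calibration_running:
        return dynamic_correction(anomaly)
    return result_correction(anomaly)
```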
P30: verifying the video stream data and preprocessing the video stream data to determine an effective video stream, wherein the effective video stream is marked with the principal points of the calibration plates;
The principal points of a calibration plate are the effective distribution points on the plate;
Optionally, in practical applications, because of various factors such as illumination changes, camera shake and occlusion of the calibration plates, the captured video stream data may contain noise, blur or missing information. It is therefore necessary to verify the accuracy and completeness of the video stream data and the consistency of its feature points, and to preprocess the verified data with filtering, enhancement, smoothing and similar operations that remove noise and blur and improve contrast and sharpness. This yields the effective video stream, i.e., image data in which the calibration plates are clearly visible and their principal points can be accurately identified and located.
The effective video stream is marked with the principal points of the calibration plates. A principal point of a calibration plate is an effective distribution point on the plate, i.e., a specific marker point or pattern on the plate that is easy to identify and locate in image processing and can be used to determine the relative position and orientation between a camera and the plate and to compute the camera's intrinsic and extrinsic parameters.
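For illustration, the preprocessing and principal-point marking could be approximated with standard OpenCV operations as below. The disclosure names neither a library nor a plate pattern, so the chessboard-style detection and the specific filter choices here are assumptions.

```python
import cv2
import numpy as np

def preprocess_frame(frame: np.ndarray) -> np.ndarray:
    """Denoise and enhance a raw BGR frame before plate-point detection."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    denoised = cv2.GaussianBlur(gray, (5, 5), 0)                   # suppress sensor noise
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(denoised)                                   # local contrast enhancement

def find_plate_points(gray: np.ndarray, pattern_size=(9, 6)):
    """Locate the plate's distribution points; returns None if the plate is not clearly visible."""
    found, corners = cv2.findChessboardCorners(
        gray, pattern_size,
        flags=cv2.CALIB_CB_ADAPTIVE_THRESH | cv2.CALIB_CB_NORMALIZE_IMAGE)
    if not found:
        return None
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    return cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
```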
P40: identifying calibration plate IDs and, in combination with an image processing module, performing homography matrix analysis and distortion compensation processing on the effective video stream to determine a homography matrix;
The image processing module comprises a matrix mapping unit and a distortion compensation unit, wherein the distortion compensation unit comprises a radial compensation branch and a tangential compensation branch that communicate and interact laterally;
The tangential compensation branch is trained on mapped pairs of undistorted and distorted samples and operates on static pixel coordinates and continuous image coordinates; the radial compensation branch is trained on the center-to-edge distortion trend.
Specifically, the ID of each calibration plate is identified to avoid confusing the identification targets. Homography matrix analysis and distortion compensation processing are then performed on the effective video stream in combination with the image processing module. A homography matrix describes the mapping from one plane to another, so homography analysis establishes the correspondence between points on a calibration plate and points on the image plane. Distortion compensation removes image distortion caused by manufacturing and mounting errors of the camera lens, such as radial and tangential distortion, through the radial and tangential compensation branches.
Within the image processing module, the matrix mapping unit is responsible for mapping points on the calibration plate to points on the image plane and establishing the correspondence between them. The distortion compensation unit is responsible for distortion compensation of the image and comprises the radial compensation branch and the tangential compensation branch; the two branches work cooperatively so that different types of distortion can be handled comprehensively.
The tangential compensation branch is generated by training on undistorted and distorted samples, which can be collected from large datasets. Using the correspondence between undistorted and distorted samples and machine learning principles, the branch learns how to remove tangential distortion from an image; by processing both static pixel coordinates and continuous image coordinates during training, tangential distortion is effectively compensated.
The radial compensation branch is trained with the center-to-edge trend as a reference and mainly handles radial distortion of the image, i.e., the phenomenon that objects near the image center appear enlarged while objects near the edge appear reduced. Through training, the radial compensation branch adjusts the pixel scale at different positions in the image to remove the radial distortion.
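The learned compensation branches themselves are not detailed in the disclosure. For reference, the conventional Brown-Conrady lens model that such compensation would approximate or invert combines both distortion types as in the following sketch, with k1, k2, k3 the radial coefficients and p1, p2 the tangential ones.

```python
def apply_distortion(x, y, k1, k2, p1, p2, k3=0.0):
    """Map normalized undistorted coordinates (x, y) to distorted coordinates (xd, yd)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3            # radial term grows from center to edge
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)    # tangential terms use p1, p2
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xd, yd
```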
Further, as shown in fig. 3, step P40 of the embodiment of the present application further includes:
P41: taking the world coordinate system and the camera's planar pixel coordinate system as references, extracting and matrixing the mapped coordinate points of the calibration plate principal points, and determining an initialization matrix, wherein the mapped coordinate points are the coordinates of the principal points in the world coordinate system and the planar pixel coordinate system;
P42: performing pixel distortion identification and correction on the initialization matrix, and determining the homography matrix.
Specifically, the world coordinate system and the camera's planar pixel coordinate system are used as references, and the mapped coordinate points of the calibration plate principal points are extracted and arranged into matrix form. The mapped coordinate points are the corresponding coordinates of the principal points in the world coordinate system and the pixel coordinate system; extracting and matrixing them establishes the correspondence between the image plane and the world coordinate system. The initialization matrix determined in this way relates the position and shape of objects in the image to their world coordinates.
Further, the distortion compensation unit identifies and corrects the pixel distortion in the initialization matrix, and the homography matrix is determined, i.e., the correspondence between points on the calibration plate and points on the image plane is established.
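As a rough equivalent using standard tools (an assumption, not the patent's own implementation), the plane-to-image mapping can be estimated from the matched principal points with OpenCV's homography routine:

```python
import cv2
import numpy as np

def estimate_homography(plate_pts: np.ndarray, image_pts: np.ndarray) -> np.ndarray:
    """Estimate the 3x3 mapping from the plate plane (Z = 0) to the image plane.

    plate_pts: Nx2 plate-plane coordinates of the principal points.
    image_pts: Nx2 pixel coordinates of the same points, ideally after distortion correction.
    """
    H, _inliers = cv2.findHomography(plate_pts.astype(np.float32),
                                     image_pts.astype(np.float32),
                                     cv2.RANSAC, 3.0)   # RANSAC rejects mismatched points
    return H
```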
Further, the embodiment of the present application further includes a step P40a, where the step P40a further includes:
P41a: if the extraction of the mapped coordinate points of the calibration plate principal points is limited, performing viewing angle conversion in synchronization with the pixel coordinate system and extracting effective coordinate points:
The method of synchronizing the viewing angle conversion with the pixel coordinate system comprises:
locating a local pixel region based on the mapping relation of the limited principal points;
determining effective conversion parameters with the viewing distance and angle as conversion characteristics, performing presence screening on the video frames in the planar pixel coordinate system, and determining the effective pixel area in the original pixel coordinate system;
and identifying the effective coordinate points based on the effective pixel area.
It should be understood that the extraction of the mapped coordinate points of the calibration plate principal points may be limited: for example, at a given acquisition angle some principal points cannot be identified, such as when two points visually overlap at a lateral angle so that the points and their coordinates cannot be recognized effectively. In such cases the effective coordinate points are extracted by synchronizing the viewing angle conversion with the pixel coordinate system.
The method of synchronizing the viewing angle conversion with the pixel coordinate system is as follows. Based on the mapping relation of the limited principal points, the approximate position of those points in the image is estimated in order to locate a local pixel region. The effective conversion parameters are then determined with the viewing distance and angle as conversion characteristics, where the viewing distance is the distance between the camera and the calibration plate and the angle describes the camera's shooting angle. By analyzing the viewing distance and angle, the effective conversion parameters, i.e., the effective shooting distance and shooting angle of each camera, can be determined and used to perform presence screening on the video frames in the planar pixel coordinate system. The purpose of this screening is to determine the effective pixel area in the original pixel coordinate system: because viewing angle conversion may cause some parts of the image to be occluded or deformed, these invalid regions must be excluded, leaving only the valid pixel regions that accurately reflect the positions of the calibration plate principal points. For example, based on the video stream it is determined whether the currently extracted video frame contains a viewing angle to be converted; if so, the corresponding frame region is extracted directly, and if not, estimation is performed using the inter-frame viewing angle difference to obtain the effective pixel area.
Finally, the effective coordinate points are identified from the effective pixel area; these points are the accurate representation of the calibration plate principal points in the pixel coordinate system and are used in the subsequent matrixing and homography matrix computation.
P50: analyzing the homography relation based on the homography matrix in combination with a calibration algorithm, and determining a target calibration result in combination with the relative characteristics of the camera array, wherein the relative characteristics at least comprise relative position and relative attitude characteristics;
Specifically, a calibration algorithm is used to analyze the homography relation based on the homography matrix: the matrix is analyzed in depth to extract information about the image transformation and object positions, and, in combination with the relative characteristics of the camera array (relative position and relative attitude), the homography relation is further optimized and adjusted, i.e., the homography matrix is corrected, and the target calibration result is determined.
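For illustration, one standard way to recover relative position and attitude from a plane-induced homography, assuming OpenCV and a known intrinsic matrix K, is homography decomposition; the calibration algorithm described above is not limited to this technique.

```python
import cv2
import numpy as np

def relative_pose_from_homography(H: np.ndarray, K: np.ndarray):
    """Recover candidate rotations and translations of a camera relative to the plate plane."""
    num, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    # Up to four mathematical solutions are returned; the physically valid one is usually
    # selected with visibility (cheirality) checks against the observed plate points.
    return rotations, translations
```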
Further, after determining the target calibration result, step P50a of the embodiment of the present application further includes:
P51a: analyzing the calibration error in combination with the relative characteristics, wherein the calibration error at least comprises the re-projection error;
P52a: optimizing the intrinsic and extrinsic parameters of the cameras with minimum calibration error as the criterion, and determining the calibrated intrinsic and extrinsic parameters;
P53a: regulating the camera array in real time based on the calibrated intrinsic and extrinsic parameters.
Optionally, after the target calibration result is determined, the calibration accuracy of the camera array is improved through fine adjustment and optimization. First, the calibration error is analyzed in combination with the relative characteristics of the camera array; such error is an unavoidable part of any calibration process and reflects the accuracy of the calibration result. The re-projection error is an important component of the calibration error: it is the deviation between the observed image point and the projection of the corresponding three-dimensional point onto the image plane through the camera model.
Furthermore, the intrinsic and extrinsic parameters of the cameras are optimized with minimum calibration error as the criterion. These parameters describe each camera's internal properties and its spatial position, and their accuracy directly affects the accuracy of the calibration result. The parameters are iteratively adjusted by an optimization algorithm, such as least squares or gradient descent, so as to minimize the calibration error and determine the calibrated intrinsic and extrinsic parameters.
Further, the camera array is regulated in real time based on the calibrated intrinsic and extrinsic parameters; by adjusting parameters such as camera position, angle and focal length, the array adapts better to different application scenarios and changing conditions.
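A minimal sketch of re-projection error minimization is shown below, using SciPy's least-squares solver as an assumed optimizer. In the method described above, the intrinsic parameters and distortion coefficients would typically be included in the optimized parameter vector as well; this sketch refines only one camera's extrinsics.

```python
import cv2
import numpy as np
from scipy.optimize import least_squares

def reprojection_residuals(params, object_pts, image_pts, K):
    """Difference between observed plate points and their reprojection under (rvec, tvec)."""
    rvec, tvec = params[:3], params[3:6]
    projected, _ = cv2.projectPoints(object_pts, rvec, tvec, K, None)
    return (projected.reshape(-1, 2) - image_pts).ravel()

def refine_extrinsics(rvec0, tvec0, object_pts, image_pts, K):
    """Minimize the re-projection error over one camera's extrinsic parameters."""
    x0 = np.hstack([np.ravel(rvec0), np.ravel(tvec0)])
    result = least_squares(reprojection_residuals, x0,
                           args=(object_pts, image_pts, K), method="lm")
    return result.x[:3], result.x[3:6]
```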
P60: integrating a calibration thinking chain and, in combination with the target calibration result, performing interactive interface visual display.
Further, step P60 of the embodiment of the present application further includes:
P61: integrating the calibration thinking chain and identifying the key thinking nodes;
P62: generating a temporary calibration file based on the calibration thinking chain in combination with the target calibration result;
P63: visually displaying the temporary calibration file on the interactive interface, performing manual verification, generating a human-machine interaction instruction, and performing background storage management.
In one possible embodiment of the application, the complex calibration process is presented in an intuitive manner by integrating the calibration thinking chain and visually displaying the target calibration result on the interactive interface. Specifically, the calibration thinking chain is integrated and its key thinking nodes are identified. The calibration thinking chain is the series of logical reasoning and operation steps from the beginning to the end of calibration; integrating these steps forms a complete reasoning framework, and the identified key thinking nodes can be used to quickly locate important operations or decision points.
Further, a temporary calibration file is generated based on the calibration thinking chain and the target calibration result. The temporary calibration file records and stores the calibration result and contains all necessary calibration parameters and information. It is visually displayed on the interactive interface, so that the user can intuitively see the parameters and indicators of the calibration result and better understand the calibration process and its outcome, while any errors or deviations are found and corrected to ensure accuracy. After manual verification, a human-machine interaction instruction is generated and background storage management is performed, integrating the calibration result and the verification information into the whole system for later use.
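The format of the temporary calibration file is not specified in the disclosure. The following JSON-style sketch, in which all field names are hypothetical, illustrates how a per-camera result together with the thinking-chain notes and a manual-verification flag might be stored for the interactive review step.

```python
import json
import numpy as np

def write_temp_calibration(path, camera_id, K, dist, R, t, thinking_chain):
    """Serialize one camera's calibration result plus reasoning-chain notes for manual review."""
    record = {
        "camera_id": camera_id,
        "intrinsics": np.asarray(K).tolist(),
        "distortion": np.asarray(dist).tolist(),
        "rotation": np.asarray(R).tolist(),
        "translation": np.asarray(t).tolist(),
        "thinking_chain": thinking_chain,   # ordered list of key calibration steps and decisions
        "verified": False,                  # flipped to True after manual confirmation in the UI
    }
    with open(path, "w", encoding="utf-8") as f:
        json.dump(record, f, indent=2)
```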
In summary, the embodiment of the application has at least the following technical effects:
According to the method, the spatial distribution characteristics of the camera array and the calibration plate array are determined, the calibration plates are captured from multiple spatial angles, an effective video stream is determined, homography matrix analysis and distortion compensation processing are performed on the effective video stream to determine the homography matrix, the homography relation is analyzed based on the homography matrix, the target calibration result is determined in combination with the relative characteristics of the camera array, and the result is visually displayed on an interactive interface.
This achieves the technical effect of improving the flexibility, accuracy and efficiency of the calibration process through dynamic calibration.
It should be noted that the order of the embodiments of the present application is for description only and does not indicate the relative merits of the embodiments; the foregoing description has been directed to specific embodiments of this specification. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The foregoing description of the preferred embodiments of the application is not intended to limit the application to the precise form disclosed, and any such modifications, equivalents, and alternatives falling within the spirit and scope of the application are intended to be included within the scope of the application.
The specification and figures are merely exemplary illustrations of the present application and are considered to cover any and all modifications, variations, combinations, or equivalents that fall within the scope of the application. It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the scope of the application. Thus, it is intended that the present application cover the modifications and variations of this application provided they come within the scope of the application and its equivalents.

Claims (8)

1. A method for dynamically calibrating a randomly moving multi-camera, the method comprising:
Determining spatial distribution characteristics of a camera array and a calibration plate array, wherein the spatial distribution characteristics are determined based on a world coordinate system, the camera array is relatively independent and has random movement of spatial positions, and the calibration plate array is fixed in position and is marked with unique IDs;
Starting a real-time calibration mode of a camera system, combining the spatial distribution characteristics, capturing multiple spatial angles of a calibration plate based on the camera array, and determining video stream data;
Verifying the video stream data and preprocessing the video stream data to determine an effective video stream, wherein the effective video stream is marked with principal points of the calibration plates;
Identifying a calibration plate ID, and carrying out homography matrix analysis and distortion compensation processing on the effective video stream by combining an image processing module, so as to analyze and determine a homography matrix;
Analyzing a homography relation based on the homography matrix in combination with a calibration algorithm, and determining a target calibration result in combination with the relative characteristics of the camera array, wherein the relative characteristics at least comprise relative position and relative attitude characteristics;
And integrating a calibration thinking chain, and performing interactive interface visual display in combination with the target calibration result, wherein the calibration thinking chain refers to the series of logical reasoning and operation steps from the beginning to the end of calibration.
2. The method of claim 1, after initiating the real-time calibration mode of the camera system, comprising:
Determining the multiple anomaly types in a dynamic calibration process and determining anomaly contacts, wherein the anomaly contacts are determined based on critical anomaly characteristics;
mapping the multiple anomaly types to the anomaly contacts, and setting an anomaly error correction mechanism;
and synchronously activating the anomaly error correction mechanism when the real-time calibration mode is started, to correct anomalies in real time during the dynamic calibration process.
3. The method of claim 2, wherein the anomaly error correction mechanism comprises dynamic triggering and result triggering;
in the case of dynamic triggering, generating an instant adjustment instruction and performing dynamic feedback regulation over the whole calibration period;
and in the case of result triggering, tracing and regulating the error correction points based on the anomaly causes, and compensating the results at the error correction points.
4. The method of claim 1, wherein the calibration plate principal points are effective distribution points on the calibration plate, the image processing module comprises a matrix mapping unit and a distortion compensation unit, and the distortion compensation unit comprises a radial compensation branch and a tangential compensation branch that communicate and interact laterally;
the tangential compensation branch is trained on mapped pairs of undistorted and distorted samples and operates on static pixel coordinates and continuous image coordinates; the radial compensation branch is trained on the center-to-edge distortion trend.
5. The method of claim 4, wherein performing homography matrix analysis and distortion compensation processing on the active video stream in conjunction with an image processing module comprises:
Taking a world coordinate system and a planar pixel coordinate system of a camera as references, extracting and matrixing the mapped coordinate points of the calibration plate principal points, and determining an initialization matrix, wherein the mapped coordinate points are the coordinates of the principal points in the world coordinate system and the planar pixel coordinate system;
And carrying out pixel distortion quantity recognition and correction on the initialization matrix, and determining the homography matrix.
6. The method of claim 5, wherein if the extraction of the mapped coordinate points of the calibration plate principal points is limited, performing viewing angle conversion in synchronization with the pixel coordinate system, and extracting effective coordinate points:
the method for synchronizing the view angle conversion and the pixel coordinate system comprises the following steps:
positioning a local pixel region based on the mapping relation of the limited principal points;
Determining effective conversion parameters by taking the viewing angle distance and the angle as conversion characteristics, carrying out presence screening on video frames under a plane pixel coordinate system, and determining an effective pixel area under an original pixel coordinate system;
And identifying the effective coordinate point based on the effective pixel area.
7. The method of claim 1, wherein after determining the target calibration result, comprising:
Analyzing a calibration error in combination with the relative characteristics, wherein the calibration error at least comprises a re-projection error;
optimizing the intrinsic and extrinsic parameters of the cameras with minimum calibration error as the criterion, and determining the calibrated intrinsic and extrinsic parameters;
and regulating the camera array in real time based on the calibrated intrinsic and extrinsic parameters.
8. The method of claim 1, wherein integrating the calibration thought chain in conjunction with the target calibration results for interactive interface visual display comprises:
Integrating the calibrated thinking chain and identifying key thinking nodes;
generating a temporary calibration file by combining the target calibration result based on the calibration thinking chain;
based on the interactive interface, the temporary calibration file is visually displayed, the manual verification is carried out, a man-machine interaction instruction is generated, and background storage management is carried out.
CN202410510536.1A 2024-04-26 2024-04-26 Dynamic calibration method for random moving multi-camera Active CN118365713B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410510536.1A CN118365713B (en) 2024-04-26 2024-04-26 Dynamic calibration method for random moving multi-camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410510536.1A CN118365713B (en) 2024-04-26 2024-04-26 Dynamic calibration method for random moving multi-camera

Publications (2)

Publication Number Publication Date
CN118365713A (en) 2024-07-19
CN118365713B (en) 2024-10-29

Family

ID=91877748

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410510536.1A Active CN118365713B (en) 2024-04-26 2024-04-26 Dynamic calibration method for random moving multi-camera

Country Status (1)

Country Link
CN (1) CN118365713B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102567989A (en) * 2011-11-30 2012-07-11 重庆大学 Space positioning method based on binocular stereo vision
CN106204731A (en) * 2016-07-18 2016-12-07 华南理工大学 A kind of multi-view angle three-dimensional method for reconstructing based on Binocular Stereo Vision System

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110599548A (en) * 2019-09-02 2019-12-20 Oppo广东移动通信有限公司 Camera calibration method and device, camera and computer readable storage medium
CN113808220A (en) * 2021-09-24 2021-12-17 上海闻泰电子科技有限公司 Calibration method and system of binocular camera, electronic equipment and storage medium
CN115713564A (en) * 2022-11-24 2023-02-24 江西欧迈斯微电子有限公司 Camera calibration method and device
CN116977444A (en) * 2023-07-28 2023-10-31 江苏集萃华科智能装备科技有限公司 Stereoscopic microscope calibration method, device and system based on coplanar points
CN117557657A (en) * 2023-12-15 2024-02-13 武汉理工大学 Binocular fisheye camera calibration method and system based on Churco calibration plate


Also Published As

Publication number Publication date
CN118365713A (en) 2024-07-19


Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant