
CN117714620A - Method and system for realizing synchronous acquisition of multiple types of sensors - Google Patents


Info

Publication number
CN117714620A
CN117714620A
Authority
CN
China
Prior art keywords
data
pixel
current
frame
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410160372.4A
Other languages
Chinese (zh)
Inventor
田滨
郭若楠
王海洋
孙扬
陈龙
吕宜生
王飞跃
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jizhong Energy Fengfeng Group Co ltd
Institute of Automation, Chinese Academy of Sciences
Hebei University of Engineering
Original Assignee
Jizhong Energy Fengfeng Group Co ltd
Institute of Automation, Chinese Academy of Sciences
Hebei University of Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jizhong Energy Fengfeng Group Co Ltd, Institute of Automation, Chinese Academy of Sciences, and Hebei University of Engineering
Priority application: CN202410160372.4A
Publication of CN117714620A
Legal status: Pending


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising
    • H04N5/06Generation of synchronising signals
    • H04N5/067Arrangements or circuits at the transmitter end
    • H04N5/0675Arrangements or circuits at the transmitter end for mixing the synchronising signals with the picture signal or mutually
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04JMULTIPLEX COMMUNICATION
    • H04J3/00Time-division multiplex systems
    • H04J3/02Details
    • H04J3/06Synchronising arrangements
    • H04J3/0635Clock or time synchronisation in a network
    • H04J3/0638Clock or time synchronisation among nodes; Internode synchronisation
    • H04J3/0658Clock or time synchronisation among packet nodes
    • H04J3/0661Clock or time synchronisation among packet nodes using timestamps
    • H04J3/0667Bidirectional timestamps, e.g. NTP or PTP for compensation of clock drift and for compensation of propagation delays

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a method and system for synchronous acquisition from multiple types of sensors. The method comprises: acquiring, in parallel and based on a unit pixel clock, the i-th pixel value of a pixel frame from each of a plurality of cameras and marking pixel time information, then storing the i-th pixel value at a specified position in an image storage area; and incrementing i by 1 and repeating the previous step until storage of the current pixel frame is complete. While pixel frames from the plurality of cameras are being obtained continuously, the j-th data value of a data frame from each of multiple types of non-image sensors is acquired in parallel, likewise based on the unit pixel clock, and marked with the currently corresponding pixel time information, so that the time-marked j-th data value is stored in a designated sensor storage area, where j denotes the data sequence number within one data frame. The invention can be used to realize synchronous data acquisition from multiple types of sensors.

Description

Method and system for realizing synchronous acquisition of multiple types of sensors
Technical Field
The invention relates to the technical field of sensor acquisition, in particular to a method and a system for realizing synchronous acquisition of multiple types of sensors.
Background
Visual simultaneous localization and mapping (VSLAM) is widely applied in mobile robotics, unmanned aerial vehicles, autonomous driving, virtual reality (VR), augmented reality (AR), and related fields. In practical deployments, filter-based multi-sensor fusion SLAM algorithms are becoming mainstream. Within this framework, a key problem is how to synchronize asynchronous sensor data, i.e., mark all received data with high-precision relative or absolute time, which helps improve the positioning accuracy of the whole system.
In the prior art, an ARM chip is generally adopted as the core computing unit of a data acquisition product in consideration of cost, power consumption, portability, and similar factors. However, because an ARM chip suffers from inherent cache latency and from response times made unpredictable by insufficient computing power, receiving and storing data incurs large delays when data from several cameras or several kinds of sensors are processed simultaneously, so the timestamps of the individual sensors cannot be aligned accurately.
Disclosure of Invention
The invention aims to provide a scheme for synchronous data acquisition from multiple types of sensors that can meet the real-time requirements of a mobile platform.
In order to solve the above technical problems, an embodiment of the present invention provides a method for realizing synchronous acquisition from multiple types of sensors, comprising: step one, acquiring, in parallel and based on a unit pixel clock, the i-th pixel value of a pixel frame from each of a plurality of cameras and marking pixel time information, so that the i-th pixel value is stored at a designated position in an image storage area, where i denotes the pixel sequence number within one pixel frame; and step two, incrementing i by 1 and repeating step one until storage of the current pixel frame is complete, wherein, while pixel frames from the plurality of cameras are being obtained continuously, the j-th data value of a data frame from each of multiple types of non-image sensors is acquired in parallel based on the unit pixel clock and marked with the currently corresponding pixel time information, so that the time-marked j-th data value is stored in a designated sensor storage area, where j denotes the data sequence number within one data frame.
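The two steps above can be modelled in software as a single loop driven by the unit pixel clock, in which camera pixels and non-image samples are tagged with the same tick time. This is a minimal illustrative sketch, not the FPGA implementation; all names (`acquire_frame`, `TaggedSample`) and the millisecond tick period are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class TaggedSample:
    t_ms: int   # pixel-clock timestamp, milliseconds
    value: int  # raw pixel or sensor value

def acquire_frame(cameras, sensor_frames, t0_ms, pixel_period_ms=1):
    """Software model of steps one and two: on every unit pixel clock tick,
    read the i-th pixel of every camera and, when available, the j-th value
    of every non-image sensor frame; tag each with the tick time and store
    it in its own storage column."""
    image_area = {cam_id: [] for cam_id in cameras}        # one column per camera
    sensor_area = {sid: [] for sid in sensor_frames}       # one column per sensor
    n_pixels = len(next(iter(cameras.values())))
    for i in range(n_pixels):                              # one tick per pixel index
        t = t0_ms + i * pixel_period_ms
        for cam_id, frame in cameras.items():              # parallel in hardware
            image_area[cam_id].append(TaggedSample(t, frame[i]))
        for sid, frame in sensor_frames.items():           # j-th non-image value
            if i < len(frame):
                sensor_area[sid].append(TaggedSample(t, frame[i]))
    return image_area, sensor_area
```

Because every value stored in the same tick carries the same timestamp, downstream fusion can align camera and sensor data without further interpolation.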
Preferably, the non-image sensors comprise at least one IMU sensor, and the method comprises: acquiring, in parallel and based on the unit pixel clock, the s-th group of data values of a data frame from the at least one IMU sensor and marking the currently corresponding pixel time information, and storing the time-marked s-th group of data values in the corresponding row within a designated IMU storage unit column of the IMU storage area, which comprises: step S1, within one unit pixel time interval, acquiring in parallel the s-th group of IMU data values of an IMU data frame from the at least one IMU sensor, marking the current pixel time information at the head of the current IMU data values, and then storing the currently time-marked s-th group of IMU data values in the row corresponding to the s-th data group within the designated IMU storage unit column; and step S2, incrementing s by 1 and repeating step S1 in the next unit pixel time interval until storage of the current IMU data frame is complete.
Preferably, the non-image sensors comprise at least one lidar sensor, and the method comprises: acquiring, in parallel and based on the unit pixel clock, the m-th data value of a data frame from the at least one lidar sensor and marking the currently corresponding pixel time information, and storing the time-marked m-th data value in the corresponding row within a designated radar storage unit column of the lidar storage area, which comprises: step M1, within one unit pixel time interval, acquiring in parallel the 1st data value of a radar data frame from the at least one lidar sensor, marking the current pixel time information at the head of the current data value, and then storing the currently time-marked 1st data value in the 1st row of the corresponding radar storage unit column; step M2, within the next unit pixel time interval, acquiring in parallel the 2nd data value of the current radar data frame and storing the currently time-marked 2nd data value in the 2nd row of the corresponding radar storage unit column; and step M3, incrementing the data sequence number of step M2 by 1 and repeating step M2 in the next unit pixel time interval until storage of the current radar data frame is complete.
Preferably, the non-image sensors comprise at least one GPS sensor, and the method comprises: acquiring, based on the unit pixel clock, the k-th group of data values of a data frame from the GPS sensor and marking the currently corresponding pixel time information, and storing the time-marked k-th group of data values in the corresponding row within a designated GPS storage unit column of the positioning information storage area, which comprises: step K1, within one unit pixel time interval, acquiring the 1st group of GPS data values of a GPS data frame from the GPS sensor, marking the current pixel time information at the head of the current GPS data values, and then storing the currently time-marked 1st group of GPS data values in the 1st row of the positioning information storage unit column; step K2, within the next unit pixel time interval, acquiring the 2nd group of GPS data values of the current GPS data frame and storing the currently time-marked 2nd group of GPS data values in the 2nd row of the positioning information storage unit column; and step K3, incrementing the GPS data sequence number of step K2 by 1 and repeating step K2 in the next unit pixel time interval until storage of the current GPS data frame is complete.
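The IMU, lidar, and GPS variants all share one operation: a pixel-clock timestamp is marked "at the head of" each data value before it is written to memory. A simple way to model that header-plus-payload word is bit packing; the field widths below are illustrative assumptions, not values from the patent.

```python
def tag_sample(t_ms: int, value: int, value_bits: int) -> int:
    """Pack a pixel-clock timestamp in front of (at the head of) a raw
    sample, mirroring how each IMU/lidar/GPS value is stored together
    with its time mark. The timestamp occupies the high bits."""
    assert 0 <= value < (1 << value_bits), "value must fit in value_bits"
    return (t_ms << value_bits) | value

def untag_sample(word: int, value_bits: int):
    """Recover (timestamp, value) from a stored word."""
    return word >> value_bits, word & ((1 << value_bits) - 1)
```

A host-computer reader can then split each memory word back into its time mark and payload without any per-sensor special casing.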
Preferably, the storage positions of the various sensors and the internal structure of each storage area are determined according to the number of data bits of the corresponding sensor, wherein, in one row of memory in the IMU storage area, data of the same type are stored one by one in the order of the IMU sensors; likewise, in one row of memory in the image storage area, pixel values are stored one by one in the order of the plurality of cameras.
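Sizing each storage region from the corresponding sensor's data bit count can be sketched as a small layout planner. This is a hedged illustration only; the function name `plan_storage`, the byte-granular packing, and the example widths are assumptions, not details given in the patent.

```python
def plan_storage(sensors):
    """Assign each sensor type a contiguous memory region whose element
    width is derived from that sensor's data bit count (rounded up to
    whole bytes), as the text describes. `sensors` maps a name to
    (data_bits, number_of_elements)."""
    layout, offset = {}, 0
    for name, (bits, depth) in sensors.items():
        elem_bytes = (bits + 7) // 8          # round bit width up to bytes
        layout[name] = {"offset": offset, "elem_bytes": elem_bytes, "depth": depth}
        offset += elem_bytes * depth          # next region starts after this one
    return layout, offset                     # offset = total bytes planned
```

Planning regions per sensor type up front is what lets the host program jump straight to a given sensor's data instead of parsing an interleaved stream.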
Preferably, the method further comprises: acquiring current reference time information and generating a synchronized unit pixel clock accurate to milliseconds, wherein the acquired reference time information is stored in a time register in order to generate the unit pixel clock, the reference time information being time information from a GPS sensor or another device; and controlling the plurality of cameras and the at least one lidar sensor to start acquisition based on the unit pixel clock, wherein an acquisition instruction is sent to the plurality of cameras and a PTP synchronization signal is sent to the at least one lidar sensor.
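Deriving millisecond-accurate tick times from a coarser reference time can be sketched as follows. The function name and the example pixel rate are assumptions for illustration; the patent only states that the unit pixel clock is generated from the stored reference time and is accurate to milliseconds.

```python
def pixel_clock_timestamps(ref_epoch_s: int, pixel_rate_hz: int, n_ticks: int):
    """Derive a millisecond timestamp for each unit pixel clock tick from
    a second-resolution reference time (e.g., from a GPS sensor), as the
    time register would: tick k occurs k/pixel_rate_hz after the epoch."""
    base_ms = ref_epoch_s * 1000
    return [base_ms + (tick * 1000) // pixel_rate_hz for tick in range(n_ticks)]
```

Integer arithmetic is used deliberately: a hardware counter would likewise accumulate whole sub-millisecond increments rather than floating-point time.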
Preferably, the method further comprises, while obtaining a pixel frame from the plurality of cameras: calibrating the start signals and exposure durations of the plurality of cameras so as to synchronize the timing of the images they output; and, based on the identification information of the plurality of cameras and of the different types of non-image sensors, reading information from the designated memory ranges of the corresponding storage areas.
Preferably, the method is implemented using an FPGA.
In another aspect, a system for realizing synchronous acquisition from multiple types of sensors is provided, the system being configured to implement the method described above and comprising: a time register configured to generate a unit pixel clock; a camera synchronization acquisition module configured to acquire, in parallel and based on the unit pixel clock, the i-th pixel value of a pixel frame from each of a plurality of cameras and mark pixel time information, thereby storing the i-th pixel value at a specified position in an image storage area, where i denotes the pixel sequence number within one pixel frame; and a non-image synchronization acquisition section configured to acquire, in parallel and based on the unit pixel clock, the j-th data value of a data frame from each of multiple types of non-image sensors while pixel frames from the plurality of cameras are being obtained continuously, and to mark the currently corresponding pixel time information, thereby storing the time-marked j-th data value in a specified sensor storage area, where j denotes the data sequence number within one data frame.
Preferably, the system further comprises: an RTC clock module configured to acquire current reference time information so that the time register generates the unit pixel clock; and an instruction control module configured to control the plurality of cameras and the at least one lidar sensor to start acquisition based on the unit pixel clock, wherein an acquisition instruction is sent to the plurality of cameras and a PTP synchronization signal is sent to the at least one lidar sensor.
One or more of the above embodiments may have the following advantages or benefits over the prior art.
The invention provides a method and a system for realizing synchronous acquisition from multiple types of sensors. With this method and system, the synchronization and recording of all types of sensor data can be completed without manual intervention, reducing the complexity and intensity of the work of data acquisition personnel and yielding complete, synchronously aligned data.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention, without limitation to the invention.
Fig. 1 is a step diagram of a method for implementing synchronous acquisition of multiple types of sensors according to an embodiment of the present application.
Fig. 2 is a schematic diagram of a multi-type sensor data synchronization acquisition and storage process in a method for implementing multi-type sensor synchronization acquisition according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of a system for implementing synchronous acquisition of multiple types of sensors according to an embodiment of the present application.
Detailed Description
Embodiments of the present invention are described in detail below with reference to the drawings and examples, so that how the invention applies technical means to solve the technical problems and achieve the technical effects can be fully understood and reproduced. It should be noted that, as long as no conflict arises, the embodiments of the present invention and the features of each embodiment may be combined with one another, and all resulting technical solutions fall within the protection scope of the present invention.
Additionally, the steps illustrated in the flowcharts of the figures may be performed in a computer system, such as a set of computer executable instructions. Also, while a logical order is depicted in the flowchart, in some cases, the steps depicted or described may be performed in a different order than presented herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Fig. 3 is a schematic structural diagram of a system for implementing synchronous acquisition of multiple types of sensors according to an embodiment of the present application. As shown in fig. 3, a system for implementing synchronous acquisition of multiple types of sensors (also referred to as a "synchronous acquisition system") according to an embodiment of the present invention at least includes: a time register 31, a camera synchronization acquisition module 32, and a non-image synchronization acquisition section 33.
The time register 31 is configured to generate a unit pixel clock.
The camera synchronization acquisition module 32 is connected to a plurality of cameras, respectively. The plurality of cameras may be integrated in a multi-view camera. The camera synchronization acquisition module 32 is configured to acquire an i-th pixel value in a pixel frame from a plurality of cameras in parallel based on a unit pixel clock and mark pixel time information so as to store the i-th pixel value to a specified position in an image memory area, where i represents a pixel number of a pixel frame of one frame.
In one embodiment, the camera synchronization acquisition module 32 includes an image merging unit and a camera control unit. The camera control unit is mainly responsible for signal calibration of the plurality of cameras: while one pixel frame is being obtained from the plurality of cameras, the camera control unit calibrates their start signals and exposure durations so that the timing of their output images is synchronized. In addition, the camera control unit acquires the i-th pixel value of the pixel frame from each of the plurality of cameras in parallel and stores the timestamped i-th pixel value at a specified position in the image storage area. When the i-th pixel value of the current pixel frame is obtained, the camera control unit retrieves the current time information from the time register 31 and writes a timestamp matching the retrieved current time into the current pixel value.
The non-image synchronization acquiring section 33 is configured to acquire, in parallel, a j-th data value in a data frame from a plurality of types of non-image sensors based on a unit pixel clock and mark current corresponding pixel time information in a process of continuously acquiring pixel frames of a plurality of cameras, thereby storing the j-th data value of the completed time mark in a specified sensor storage area, where j represents a data sequence number in one frame of data frame.
As shown in fig. 3, the non-image synchronization acquiring section 33 includes: an IMU synchronization acquisition module 331, a radar synchronization acquisition module 332, and a GPS acquisition module 333.
The IMU synchronization acquisition module 331 is respectively connected to the plurality of IMU sensors. The IMU synchronization acquisition module is configured to acquire, in parallel, an s-th set of data values in a data frame from at least one IMU sensor based on a unit pixel clock and tag current corresponding pixel time information, and to store the time-tagged s-th set of data values in corresponding rows within a designated IMU storage unit column in the IMU storage area.
In one embodiment, the IMU synchronization acquisition module includes an IMU merging unit and an IMU control unit. The IMU control unit acquires the s-th set of data values of a data frame from the at least one IMU sensor and stores the timestamped s-th set of data values in the corresponding row within a designated IMU memory cell column in the IMU memory region. When the s-th set of data values of the current IMU data frame is obtained, the IMU merging unit retrieves the current time information from the time register 31 and writes a timestamp matching the retrieved current time into the current s-th set of data values.
The radar synchronous acquisition module is respectively connected with the plurality of laser radar sensors. The radar synchronization acquisition module is configured to acquire an mth data value in a data frame from at least one lidar sensor in parallel based on the unit pixel clock and tag current corresponding pixel time information, and store the mth data value of the completed time tag into a corresponding row within a designated radar storage unit column in the lidar storage area.
In one embodiment, the radar synchronization acquisition module includes a radar merging unit and a radar control unit. The radar control unit acquires the m-th data value of a data frame from the at least one lidar sensor and stores the timestamped m-th data value in the corresponding row within a designated radar storage unit column in the lidar storage area. When the m-th data value of the current radar data frame is obtained, the radar merging unit retrieves the current time information from the time register 31 and writes a timestamp matching the retrieved current time into the current m-th data value.
The GPS acquisition module is connected to the GPS sensor. Based on the unit pixel clock, it acquires the n-th data value of a data frame from the GPS sensor and marks the currently corresponding pixel time information (e.g., it retrieves the current time information from the time register 31 and writes a timestamp matching that time into the current n-th data value), then stores the timestamped n-th data value in the corresponding row within the designated GPS memory cell column in the positioning information storage area.
In addition, the synchronous acquisition system according to the embodiment of the invention further comprises: an RTC clock module 34 and an instruction control module 35.
In one embodiment, the RTC clock module 34 is configured to obtain current reference time information to generate a unit pixel clock from the time register 31.
In one embodiment, the instruction control module 35 is configured to control the plurality of cameras and the at least one lidar sensor to begin acquisition based on the unit pixel clock. Specifically, the instruction control module 35 may send acquisition commands to the plurality of cameras and a PTP synchronization signal to the at least one lidar sensor.
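The patent only says a PTP synchronization signal is sent to the lidar; for context, the standard PTP two-way exchange arithmetic (from IEEE 1588, not from this patent) computes the clock offset and path delay from four timestamps:

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """Standard IEEE 1588 PTP arithmetic (background, not claimed by the
    patent): t1 = Sync sent by master, t2 = Sync received by slave,
    t3 = Delay_Req sent by slave, t4 = Delay_Req received by master.
    Assumes a symmetric path delay."""
    offset = ((t2 - t1) - (t4 - t3)) / 2  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2   # one-way propagation delay
    return offset, delay
```

With offset and delay known, the lidar can steer its local clock onto the FPGA's timebase, so that its data frames carry timestamps consistent with the unit pixel clock.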
In one embodiment, the time register 31, the camera synchronization acquiring module 32, the non-image synchronization acquiring section 33 and the instruction control module 35 according to the embodiment of the present invention are integrated in an FPGA chip.
Fig. 1 is a step diagram of a method for implementing synchronous acquisition of multiple types of sensors according to an embodiment of the present application. A specific step flow of a method for implementing synchronous acquisition of multiple types of sensors (also referred to as "synchronous acquisition method") according to an embodiment of the present invention will be described below with reference to fig. 1.
In one embodiment, the synchronous acquisition method provided by the embodiment of the invention adopts an FPGA chip to realize synchronous acquisition of the multi-type sensor data with low cost and high precision.
Before step S110 is carried out, the RTC clock module 34 acquires the current reference time information and provides the time service to the camera synchronization acquisition module 32, the non-image synchronization acquisition unit 33, and the instruction control module 35. Specifically, during time service, the current reference time information is first obtained and stored by the time register 31, in which a synchronized unit pixel clock is then generated. The unit pixel clock is accurate to milliseconds.
In the embodiment of the invention, the reference time information is a base-rate (1x) clock signal, and the unit pixel clock is an n-times frequency-multiplied clock signal.
In one embodiment, the reference time information described in the embodiments of the present invention comes from a GPS sensor, or from an FPGA chip that can accept time information from other devices.
Then, after the camera synchronization acquisition module 32, the non-image synchronization acquisition part 33, and the instruction control module 35 have received the time service, the synchronous acquisition method controls the plurality of cameras and the at least one lidar sensor to start acquisition based on the unit pixel clock: the instruction control module 35 sends acquisition instructions to the plurality of cameras to activate their acquisition, and sends a PTP synchronization signal to the at least one lidar sensor to activate its acquisition. Thereafter, the process proceeds to step S110, entering the data synchronous collection and storage phase.
The camera synchronization acquisition module 32 and the non-image synchronization acquisition unit 33 collect the acquisition results from the different types of sensors in parallel, timestamp all the data frame by frame, and store them in a specified memory.
In the embodiment of the present invention, the memory may be a memory included in the FPGA chip, or may be a memory that is external to the FPGA chip and accessible by the FPGA chip.
In one embodiment, the memory locations of the various types of sensors and the internal structure of the memory area are determined based on the number of data bits of the corresponding sensor. That is, according to the embodiment of the invention, corresponding storage positions are planned in the memory for the different types of sensors according to their respective data bit counts, so that the data of each specified sensor type is stored separately, which can greatly improve the processing speed of the host-computer program.
In step S110, based on the unit pixel clock, the camera synchronization acquisition module 32 acquires in parallel the i-th pixel value of the pixel frame from each of the plurality of cameras, marks pixel time information on every i-th pixel value, and stores each i-th pixel value in the i-th row of the designated image storage unit column in the image storage area. Here i denotes the pixel sequence number within one pixel frame; in the embodiment of the present invention, i ranges from 1 to I, where I is the number of pixels in one pixel frame.
In step S120, the camera synchronization acquisition module 32 increments the current value of i from step S110 by 1 and repeats step S110 until the complete current pixel frames of all cameras have been stored.
When the sensor type is a plurality of cameras, the embodiment of the invention divides the required memory into image storage areas. The image memory area comprises a plurality of columns of image memory units, each column corresponding to one camera; each column contains a plurality of rows of image memory elements, and a single image memory element stores one pixel value of one camera. Thus, within the same memory row shared by all cameras, the i-th pixel values of the corresponding cameras received in the same pixel time interval are stored one by one in camera order.
In the embodiment of the invention, within one row of memory in the image storage area, storage proceeds camera by camera in sequence.
For example, as shown in fig. 2, if 3 cameras are connected, the memory required by the current synchronous acquisition method is planned with a first column of storage units for camera 1, a second column for camera 2, and a third column for camera 3. If the pixel values of the cameras are encoded with 10 bits, a single image storage element is allocated in units of 10 bits. When video signals from the cameras are received, the first pixel, pixel_1, is obtained from each of the three cameras within the first pixel clock (one unit pixel time interval), and a timestamp is marked at the head of each camera's pixel_1, after which they are stored in order, camera 1 to camera 3, into image storage elements 1 to 3 of the first memory row. Next, within the second pixel clock (the next unit pixel time interval), one pixel, pixel_2, is again obtained from each of the 3 cameras and timestamped at its head, and the values are stored in camera order into image storage elements 1 to 3 of the second memory row. This is repeated until all pixels of one complete pixel frame have been obtained for all cameras.
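The fig. 2 example can be modelled by packing the i-th pixel of cameras 1 to 3 into consecutive 10-bit fields of one memory row. This sketch simplifies the timestamp handling (one tick timestamp per row instead of one per pixel head); the function names are illustrative assumptions.

```python
def store_pixel_row(pixels, t_ms, bits=10):
    """Pack the i-th pixel from each camera (in camera order 1..N) into
    consecutive `bits`-wide fields of a single memory row, tagged with
    the pixel-clock tick time of that row."""
    row = 0
    for k, p in enumerate(pixels):                 # k = camera index, 0-based
        assert 0 <= p < (1 << bits), "pixel must fit in the field width"
        row |= p << (k * bits)
    return {"t_ms": t_ms, "row": row}

def read_pixel(row_word, cam_index, bits=10):
    """Recover one camera's pixel from a packed memory row."""
    return (row_word >> (cam_index * bits)) & ((1 << bits) - 1)
```

Reading a column of the storage area then yields one camera's full frame, while reading a row yields the simultaneously captured pixels of all cameras.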
In addition, the camera synchronization acquisition module 32 according to the embodiment of the present invention integrates the time information from the time register 31 into the pixel frame of the corresponding camera. Specifically, after one complete pixel frame of each camera has been stored, the camera synchronization acquisition module 32 stores the time information corresponding to each camera in the image storage element of the last row of that camera's image storage unit column.
In addition, in the process of obtaining a complete pixel frame of the plurality of cameras, the camera synchronization obtaining module 32 further continuously calibrates the start signals and the exposure time periods of the plurality of cameras so as to synchronize the timing of the output images of the plurality of cameras.
Specifically, the camera synchronization acquisition module 32 simultaneously outputs calibration signals to the plurality of cameras according to the time information retrieved from the time register 31 in order to calibrate their timing. The calibration signals include a PWO signal for calibrating start-up (power-on) and an RRC signal for calibrating the exposure time.
As shown in fig. 1, in the process that the camera synchronization acquisition module 32 continuously acquires pixel frames from a plurality of cameras, the non-image synchronization acquisition section 33 also synchronously acquires real-time acquisition data transmitted by other non-image sensors.
Specifically, the non-image synchronization acquiring section 33 acquires the jth data value in the data frame from the plural kinds of non-image sensors in parallel based on the unit pixel clock and marks the current corresponding pixel time information, thereby storing the jth data value of the current completion time mark in the specified sensor storage area. Where j represents the data sequence number in a frame of non-image sensor data frame.
When the sensor type includes at least one IMU sensor, the embodiment of the invention further divides the required memory into an IMU storage area. The IMU storage area includes a plurality of columns of IMU storage units, each column corresponding to one IMU sensor; a plurality of rows of IMU storage elements are formed in each column, and a single IMU storage element stores one data value of one frame of data from one IMU sensor, so that the s-th data values received from all IMU sensors within the same pixel time interval are stored one by one, in IMU sensor order, in the same memory row.
In one embodiment, during the process of the camera synchronization acquiring module 32 continuously acquiring pixel frames from the multiple cameras, the IMU synchronization acquiring module 331 simultaneously acquires, in parallel and based on the unit pixel clock, the s-th group of data values in the data frames from the at least one IMU sensor, marks the corresponding (real-time) pixel time information (e.g., a timestamp) in the current s-th group of data values, and then stores the s-th group of data values with the completed timestamp in the corresponding row of the designated IMU storage unit column in the IMU storage area. Here, s represents the data sequence number within one frame of IMU data. In the embodiment of the invention, s ranges from 1 to S, where S represents the number of data groups in one IMU data frame.
Specifically, in step S1, within a certain unit pixel time interval during the transmission of the pixel frames of the multiple cameras, the IMU synchronization acquiring module 331 acquires, in parallel, the s-th group of IMU data values in the IMU data frames from the at least one IMU sensor and marks the current pixel time information at the header of the current IMU data values, and then stores the s-th group of IMU data values with the completed time mark in the row corresponding to the s-th data group in the designated IMU storage unit column. In step S2, the current s value in step S1 is incremented by 1, and step S1 is repeated in the next unit pixel time interval until the storage of the current IMU data frame is completed.
In an embodiment of the invention, the set of IMU data values includes an acceleration value and an angular velocity value.
In the embodiment of the invention, the data of the same type are stored one by one in a row of memory in the IMU storage area according to the sequence of the IMU sensors.
For example, as shown in fig. 2, if 2 IMU sensors are connected in the current synchronous acquisition method, two columns of IMU storage units are planned in the required memory: IMU1 corresponds to the first column and IMU2 to the second column. If the acceleration value and the angular velocity value of each IMU sensor are each encoded with 64 bits, then a single IMU storage element is divided in units of 64 bits. When data from the plurality of IMU sensors are received, a first group consisting of a 64-bit acceleration value acc_1 and a 64-bit angular velocity value gyo_1 is obtained from each of the 2 IMUs within the first pixel clock (unit pixel time interval), and the headers of the acceleration values and of the angular velocity values in the two data groups from the 2 IMU sensors are each marked with a time stamp, so that the acceleration values are stored in the 1st to 2nd IMU storage elements of the 1st row of memory in the order of IMU1 to IMU2, and the angular velocity values are stored in the 1st to 2nd IMU storage elements of the 2nd row of memory. Next, within the second pixel clock (the next unit pixel time interval), a second group consisting of a 64-bit acceleration value acc_2 and a 64-bit angular velocity value gyo_2 is again obtained from each of the 2 IMUs, and the headers of the acceleration values and of the angular velocity values are each marked with a time stamp, so that the acceleration values are stored in the 1st to 2nd IMU storage elements of the 3rd row of memory in the order of IMU1 to IMU2, and the angular velocity values are stored in the 1st to 2nd IMU storage elements of the 4th row of memory. This is repeated until all data groups required for one complete data frame have been obtained for all IMUs.
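The IMU example above follows the same per-pixel-clock pattern as the image memory, with acceleration rows and angular-velocity rows alternating. A minimal Python sketch (hypothetical names, for illustration only; the embodiment itself is FPGA logic):

```python
# Illustrative model: per pixel clock s, each IMU yields one
# (acceleration, angular velocity) group; the acceleration values of all
# IMUs share one memory row and the angular-velocity values the next row,
# both stamped with the same pixel time and ordered by IMU sensor number.
def store_imu_frame(imus, groups_per_frame, clock):
    """imus: list of iterators yielding (acc, gyo) value pairs."""
    memory = []  # rows of (timestamp, value) cells, one column per IMU
    for s in range(groups_per_frame):
        t = clock(s)
        groups = [next(imu) for imu in imus]            # IMU1..IMUn in order
        memory.append([(t, acc) for acc, _ in groups])  # acceleration row
        memory.append([(t, gyo) for _, gyo in groups])  # angular-velocity row
    return memory

# Usage: 2 IMUs, 2 data groups per frame (small integers stand in for the
# 64-bit encoded acceleration and angular-velocity values).
imus = [iter([(100 * n + s, 200 * n + s) for s in range(2)]) for n in (1, 2)]
mem = store_imu_frame(imus, 2, clock=lambda s: s)
```

Each data group thus occupies two consecutive memory rows, so a frame of S groups fills 2S rows per the layout in fig. 2.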
When the sensor type includes at least one laser radar sensor, the embodiment of the invention further divides the required memory into a laser radar storage area. The laser radar storage area includes a plurality of columns of radar storage units, each column corresponding to one laser radar sensor; a plurality of rows of radar storage elements are formed in each column, and a single radar storage element stores one data value of one frame of data from one laser radar sensor, so that the m-th data values received from all laser radar sensors within the same pixel time interval are stored one by one, in laser radar sensor order, in the same memory row.
In one embodiment, during the process of the camera synchronization acquisition module 32 continuously acquiring pixel frames from the multiple cameras, the radar synchronization acquisition module 332 simultaneously acquires, in parallel and based on the unit pixel clock, the m-th group of data values in the data frames from the at least one laser radar sensor, marks the corresponding (real-time) pixel time information (e.g., a timestamp) in the current m-th group of data values, and then stores the m-th group of data values with the completed timestamp in the corresponding row of the designated radar storage unit column in the laser radar storage area. Here, m represents the data sequence number within one frame of radar data. In the embodiment of the invention, m ranges from 1 to M, where M represents the number of data in one radar data frame.
Specifically, in step M1, within a certain unit pixel time interval during the transmission of the pixel frames of the plurality of cameras, the radar synchronization obtaining module 332 acquires, in parallel, the 1st point cloud data value in the radar data frame from the at least one laser radar sensor and marks the current pixel time information at the head of the current data value, and then stores the 1st data value with the completed time mark in the 1st row of the corresponding column of radar storage units. In step M2, within the next unit pixel time interval, the 2nd data value in the current radar data frame is acquired in parallel, and the 2nd data value with the completed time mark is stored in the 2nd row of the corresponding column of radar storage units. In step M3, the data sequence number m in step M2 is incremented by 1, and step M2 is repeated in the next unit pixel time interval until the storage of the current radar data frame is completed.
In the embodiment of the invention, in a row of memories in the laser radar storage area, data are stored one by one according to the sequence of the laser radar sensors.
For example, as shown in fig. 2, if 1 laser radar sensor is connected in the current synchronous acquisition method, a column of radar storage units corresponding to the laser radar sensor 1 is planned in the required memory. When data from the laser radar sensor is received, a first data value point_1 is obtained from the laser radar sensor within the first pixel clock (unit pixel time interval), and the head of the current data value point_1 is marked with a time stamp so that it is stored in the radar storage element of the 1st row. A second data value point_2 is then obtained from the laser radar sensor within the second pixel clock (the next unit pixel time interval), so that the current data value point_2 is stored in the radar storage element of the 2nd row. Thereafter, the current data sequence number is continuously incremented by 1 and the previous step is repeated until all data required for one complete data frame have been obtained for all the laser radars.
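The single-laser-radar walkthrough above can be modeled in the same illustrative style (hypothetical Python names; the actual embodiment is FPGA logic):

```python
# Illustrative model: in each unit pixel interval m, the m-th point cloud
# value is read from each laser radar, stamped at its head with the current
# pixel time, and written to row m of that radar's storage column.
def store_lidar_frame(lidars, points_per_frame, clock):
    """lidars: list of iterators yielding point cloud data values."""
    memory = []  # memory[row][col] -> (timestamp, point_value)
    for m in range(points_per_frame):
        t = clock(m)
        memory.append([(t, next(lidar)) for lidar in lidars])
    return memory

# Usage: 1 laser radar, 3 point values per frame.
lidar = iter(["point_1", "point_2", "point_3"])
mem = store_lidar_frame([lidar], 3, clock=lambda m: m)
```

Since the point values are fetched on the same unit pixel clock that drives the camera pixels, each stored point carries a timestamp directly comparable to the pixel timestamps.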
When the sensor type includes a GPS sensor, the embodiment of the invention further divides the required memory into a positioning information storage area. The positioning information storage area includes at least one column of GPS storage units, each column corresponding to one GPS sensor; a plurality of rows of GPS storage elements are formed in each column, and a single GPS storage element stores one data value of one frame of data from one GPS sensor.
In one embodiment, during the process of the camera synchronization acquisition module 32 continuously acquiring pixel frames from the multiple cameras, the GPS acquisition module 333 concurrently acquires, based on the unit pixel clock, the k-th group of data values in the data frames from the GPS sensor, marks the corresponding (real-time) pixel time information (e.g., a timestamp) in the current k-th group of data values, and then stores the time-marked k-th group of data values in the corresponding row of the designated GPS storage unit column in the positioning information storage area. Here, k represents the data sequence number within one frame of GPS data. In the embodiment of the invention, k ranges from 1 to K, where K represents the number of data in one GPS data frame.
Specifically, in step K1, within a certain unit pixel time interval during the transmission of the pixel frames of the plurality of cameras, the GPS acquisition module 333 acquires, in parallel, the 1st group of GPS data values in the GPS data frame from the GPS sensor and marks the current pixel time information at the head of the current GPS data values, and then stores the 1st group of GPS data values with the completed time mark in the row corresponding to the 1st data group in the designated GPS storage unit column. In step K2, within the next unit pixel time interval, the 2nd group of data values in the current GPS data frame is acquired in parallel, and the 2nd group of data values with the completed time mark is stored in the 2nd row of the corresponding column of GPS storage units. In step K3, the data sequence number k in step K2 is incremented by 1, and step K2 is repeated in the next unit pixel time interval until the storage of the current GPS data frame is completed.
In an embodiment of the present invention, the set of GPS data values includes latitude and longitude values.
For example, as shown in fig. 2, if one GPS sensor is connected in the current synchronous acquisition method, a column of GPS storage units corresponding to the GPS sensor 1 is planned in the required memory. When data from the GPS sensor is received, a first data group containing a latitude value lat_1 and a longitude value lon_1 is obtained from the GPS sensor within the first pixel clock (unit pixel time interval), and the heads of the latitude value lat_1 and the longitude value lon_1 in the current data group are each marked with a time stamp, so that the latitude value lat_1 is stored in the GPS storage element of the 1st row of memory and the longitude value lon_1 is stored in the GPS storage element of the 2nd row of memory. Next, within the second pixel clock (the next unit pixel time interval), a second data group containing a latitude value lat_2 and a longitude value lon_2 is obtained from the GPS sensor, and the heads of the latitude value lat_2 and the longitude value lon_2 in the current data group are each marked with a time stamp, so that the latitude value lat_2 is stored in the GPS storage element of the 3rd row of memory and the longitude value lon_2 is stored in the GPS storage element of the 4th row of memory. Thereafter, the current data sequence number is continuously incremented by 1 and the previous step is repeated until all data required for one complete data frame have been obtained for all GPS sensors.
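The GPS example above mirrors the IMU layout, with latitude and longitude rows alternating. A minimal illustrative sketch in Python (hypothetical names; integer coordinates stand in for the encoded latitude/longitude values):

```python
# Illustrative model: per pixel clock k, the GPS yields one (lat, lon) data
# group; the latitude value is stamped and stored in one memory row and the
# longitude value in the next, so each data group occupies two rows.
def store_gps_frame(gps, groups_per_frame, clock):
    """gps: iterator yielding (latitude, longitude) pairs."""
    memory = []  # rows of (timestamp, value) cells
    for k in range(groups_per_frame):
        t = clock(k)
        lat, lon = next(gps)
        memory.append([(t, lat)])  # latitude row
        memory.append([(t, lon)])  # longitude row
    return memory

# Usage: one GPS sensor, 2 data groups per frame.
gps = iter([(31 + k, 121 + k) for k in range(2)])
mem = store_gps_frame(gps, 2, clock=lambda k: k)
```

As with the IMU area, a frame of K data groups fills 2K rows of the positioning information storage area.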
In addition, in the embodiment of the invention, information within the specified memory range is read from the storage areas of the different types based on the identification information of the plurality of cameras and of the non-image sensors of different types.
At the time of reading, the values in the 1st to last rows of any one column of storage units may be read in order to output a data frame corresponding to the sensor of the corresponding type.
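The column read-out described above can be sketched as follows (an illustrative Python model; `read_sensor_frame` and the toy memory are hypothetical names, not the disclosed FPGA implementation):

```python
# Illustrative model of the read-out: reading rows 1..last of one storage
# column reproduces the complete time-stamped data frame of the sensor
# whose identification information maps to that column.
def read_sensor_frame(memory, col):
    """memory: rows of (timestamp, value) cells; col: the sensor's column."""
    return [row[col] for row in memory]

# Usage with a toy 3-camera image memory (rows = pixel clocks).
memory = [[(t, 10 * c + t) for c in range(3)] for t in range(4)]
frame_cam2 = read_sensor_frame(memory, 1)  # column index 1 -> camera 2
```

Because writing interleaved the sensors column-wise, reading one column back is all that is needed to reassemble a single sensor's synchronized frame.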
The invention discloses a method and a system for realizing synchronous acquisition of multiple types of sensors. According to the method and the system, the synchronization and recording of all types of sensor data can be completed without manual intervention, so that the working complexity and strength of data acquisition personnel are reduced, and complete and synchronously aligned data are obtained.
The present invention is not limited to the above-mentioned embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the scope of the present invention are intended to be included in the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the protection scope of the claims.
In the description of the present invention, unless otherwise indicated, the meaning of "a plurality" is two or more; the terms "upper," "lower," "left," "right," "inner," "outer," "front," "rear," "head," "tail," and the like are used as an orientation or positional relationship based on that shown in the drawings, merely to facilitate description of the invention and to simplify the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed and operated in a particular orientation, and therefore should not be construed as limiting the invention. Furthermore, the terms "first," "second," "third," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly, and may denote, for example, a fixed connection, a detachable connection, or an integral connection; a mechanical connection or an electrical connection; a direct connection or an indirect connection through an intermediate medium. The specific meanings of the above terms in the present invention will be understood in specific cases by those of ordinary skill in the art.
It is to be understood that the disclosed embodiments are not limited to the specific structures, process steps, or materials disclosed herein, but are intended to extend to equivalents of these features as would be understood by one of ordinary skill in the relevant arts. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting.
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Thus, the appearances of the phrase "one embodiment" or "an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment.
While the embodiments of the present invention have been described above, the embodiments are presented for the purpose of facilitating understanding of the invention and are not intended to limit the invention. Any person skilled in the art can make any modification and variation in form and detail without departing from the spirit and scope of the present disclosure, but the scope of the present disclosure is still subject to the scope of the appended claims.

Claims (10)

1. A method for achieving simultaneous acquisition of multiple types of sensors, comprising:
step one, based on a unit pixel clock, acquiring an ith pixel value in a pixel frame from a plurality of cameras in parallel and marking pixel time information, so that the ith pixel value is stored in a designated position in an image storage area, wherein i represents a pixel serial number of the pixel frame of one frame;
step two, adding 1 to the i value, repeating the step one until the storage of the current pixel frame is completed, wherein,
in the process of continuously obtaining pixel frames of a plurality of cameras, based on a unit pixel clock, a j-th data value in a data frame from a plurality of types of non-image sensors is obtained in parallel and the current corresponding pixel time information is marked, so that the j-th data value of the finishing time mark is stored in a specified sensor storage area, and j represents a data sequence number in a data frame of one frame.
2. The method of claim 1, wherein the non-image sensor comprises at least one IMU sensor, wherein the method comprises:
based on the unit pixel clock, the s-th group of data values in the data frame from at least one IMU sensor are obtained in parallel and the current corresponding pixel time information is marked, and the s-th group of data values with the time mark completed are stored in the corresponding rows in the appointed IMU storage unit columns in the IMU storage area, wherein the method comprises the following steps:
S1, within a unit pixel time interval, acquiring in parallel the s-th group of IMU data values in an IMU data frame from the at least one IMU sensor, marking current pixel time information at the head of the current IMU data values, and then storing the s-th group of IMU data values with the current completed time mark in the row of the s-th corresponding data group in the designated IMU storage unit column;
and S2, incrementing the value of s by 1, and repeating the step S1 in the next unit pixel time interval until the storage of the current IMU data frame is completed.
3. The method of claim 2, wherein the non-image sensor comprises at least one lidar sensor, wherein the method comprises:
Based on the unit pixel clock, obtaining an mth data value in a data frame from at least one laser radar sensor in parallel and marking current corresponding pixel time information, and storing the mth data value of the completed time mark in a corresponding row in a specified radar storage unit column in a laser radar storage area, wherein the method comprises the following steps:
m1, in a unit pixel time interval, parallelly acquiring a 1 st data value in a radar data frame from at least one laser radar sensor, marking current pixel time information at the head of the current data value, and then storing the 1 st data value of the current completion time mark into a 1 st row in a corresponding row of radar storage units;
m2, in the next unit pixel time interval, parallelly acquiring the 2 nd data value in the current radar data frame and storing the 2 nd data value of the current completion time mark into the 2 nd row in the corresponding column radar storage unit;
and M3, adding 1 to the serial number of the data in the M2, and repeating the M2 in the next unit pixel time interval until the storage of the current radar data frame is completed.
4. A method according to claim 3, wherein the non-image sensor comprises at least one GPS sensor, wherein the method comprises:
Based on the unit pixel clock, acquiring a kth group of data values in a data frame from the GPS sensor and marking current corresponding pixel time information, and storing the kth group of data values with the completed time mark into corresponding rows in a specified GPS storage unit column in a positioning information storage area, wherein the method comprises the following steps:
step K1, acquiring a 1 st group of GPS data values in a GPS data frame from a GPS sensor in a unit pixel time interval, marking current pixel time information at the head of the current GPS data value, and then storing the 1 st group of GPS data values of the current finishing time mark into a 1 st row in a positioning information storage unit column;
step K2, in the next unit pixel time interval, acquiring a 2 nd group of GPS data values in the current GPS data frame and storing the 2 nd group of GPS data values of the current completion time mark into a 2 nd row in a positioning information storage unit column;
and step K3, adding 1 to the serial number of the GPS data in the step K2, and repeating the step K2 in the next unit pixel time interval until the storage of the current GPS data frame is completed.
5. The method according to any one of claims 2 to 4, wherein the storage locations of the various sensors and the internal structures of the storage areas are determined according to the number of data bits of the corresponding sensor, wherein,
in a row of memory in the IMU storage area, data of the same type are stored one by one in the order of the IMU sensors;
in a row of memory in the image storage area, pixel values are stored one by one in the order of the plurality of cameras.
6. The method according to claim 3 or 4, characterized in that the method further comprises:
acquiring current reference time information and generating a synchronization-based unit pixel clock accurate to milliseconds, wherein the acquired reference time information is stored in a time register to generate the unit pixel clock, the reference time information being time information from a GPS sensor or other device;
and controlling the cameras and the at least one laser radar sensor to start acquisition based on the unit pixel clock, wherein an acquisition instruction is sent to the cameras and a PTP synchronization signal is sent to the at least one laser radar sensor.
7. The method according to any one of claims 1-4, further comprising:
in the process of obtaining pixel frames of the plurality of cameras, the method comprises: calibrating the start signals and exposure durations of the plurality of cameras so as to synchronize the timings of the output images of the plurality of cameras;
Based on the identification information of the plurality of cameras and the non-image sensors of different types, information specifying the memory range is read from the storage areas of different types.
8. The method according to any one of claims 1 to 4, wherein the method is implemented using an FPGA.
9. A system for implementing synchronous acquisition of multiple types of sensors, wherein the system is configured to implement the method of any one of claims 1-8, wherein the system comprises:
a time register configured to generate a unit pixel clock;
a camera synchronization acquisition module configured to acquire an i-th pixel value in a pixel frame from a plurality of cameras in parallel based on a unit pixel clock and mark pixel time information, thereby storing the i-th pixel value to a specified position in an image storage area, wherein i represents a pixel number of a pixel frame of one frame;
and a non-image synchronization acquisition section configured to acquire, in parallel, a j-th data value in a data frame from the plurality of types of non-image sensors based on the unit pixel clock and mark current corresponding pixel time information in a process of continuously acquiring pixel frames of the plurality of cameras, thereby storing the j-th data value of the completion time mark in a specified sensor storage area, j representing a data sequence number in one frame of data frame.
10. The system of claim 9, wherein the system further comprises:
an RTC clock module configured to acquire current reference time information to generate the unit pixel clock from the time register;
and an instruction control module configured to control the plurality of cameras and the at least one lidar sensor to start acquisition based on the unit pixel clock, wherein an acquisition instruction is sent to the plurality of cameras and a PTP synchronization signal is sent to the at least one lidar sensor.
CN202410160372.4A 2024-02-05 2024-02-05 Method and system for realizing synchronous acquisition of multiple types of sensors Pending CN117714620A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410160372.4A CN117714620A (en) 2024-02-05 2024-02-05 Method and system for realizing synchronous acquisition of multiple types of sensors

Publications (1)

Publication Number Publication Date
CN117714620A true CN117714620A (en) 2024-03-15

Family

ID=90153797

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410160372.4A Pending CN117714620A (en) 2024-02-05 2024-02-05 Method and system for realizing synchronous acquisition of multiple types of sensors

Country Status (1)

Country Link
CN (1) CN117714620A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9140555B1 (en) * 2014-03-14 2015-09-22 Google Inc. Navigation using sensor fusion
CN107809560A (en) * 2017-09-05 2018-03-16 百度在线网络技术(北京)有限公司 The method and fpga chip that more mesh cameras are synchronized
CN109729277A (en) * 2018-11-19 2019-05-07 魔门塔(苏州)科技有限公司 Multi-sensor collection timestamp synchronizing device
CN111435162A (en) * 2020-03-03 2020-07-21 深圳市镭神智能系统有限公司 Laser radar and camera synchronization method, device, equipment and storage medium
CN111934843A (en) * 2020-07-31 2020-11-13 深圳市智绘科技有限公司 Multi-sensor data synchronous acquisition method for intelligent unmanned system
CN112230240A (en) * 2020-09-30 2021-01-15 深兰人工智能(深圳)有限公司 Space-time synchronization system, device and readable medium for laser radar and camera data
CN115294226A (en) * 2022-07-22 2022-11-04 中电海康集团有限公司 Image reconstruction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination