
WO2023103143A1 - Sensor detection method and apparatus, electronic device and readable storage medium - Google Patents

Sensor detection method and apparatus, electronic device and readable storage medium

Info

Publication number
WO2023103143A1
WO2023103143A1 (PCT application no. PCT/CN2022/071109)
Authority
WO
WIPO (PCT)
Prior art keywords
point cloud
cloud data
sensor
calibration
coordinate system
Prior art date
Application number
PCT/CN2022/071109
Other languages
English (en)
French (fr)
Inventor
黄超
张浩
Original Assignee
上海仙途智能科技有限公司
Priority date
Filing date
Publication date
Application filed by 上海仙途智能科技有限公司
Publication of WO2023103143A1 publication Critical patent/WO2023103143A1/zh

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00 Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Definitions

  • the present disclosure relates to the technical field of intelligent driving, and in particular to a sensor detection method, device, electronic equipment and readable storage medium.
  • the environmental perception module includes a variety of sensors, such as lidar, camera, millimeter-wave radar, etc.
  • the vehicle perceives the surrounding environment through these sensors, and then controls the driving of the autonomous vehicle based on the perceived environmental information. Due to the limited detection range of a single sensor, in order to reduce the detection blind area, multiple sensors will be installed on the vehicle, and the point cloud coordinate system generated by the sensors will be converted into a unified coordinate system through calibration. However, when the relative position of the sensor and the vehicle changes, the result of point cloud merging will be abnormal.
  • the operation and maintenance personnel mainly perform sensor calibration anomaly detection and troubleshooting in an offline state after the vehicle has finished running, so it is impossible to obtain the calibration status of the sensor in real time. And since the sensor calibration status cannot be known in time during the driving process of the vehicle, the vehicle continues to run when the sensor calibration is abnormal, which poses a safety risk.
  • the present disclosure provides a sensor detection method, device, electronic equipment, and readable storage medium, so as to realize the automatic detection of the calibration state of the sensor by the vehicle, shorten the detection feedback time, and improve the safety of the vehicle during operation.
  • a sensor detection method includes: respectively acquiring the first point cloud data, in the initial coordinate system, of a plurality of sensors arranged on the vehicle; based on a first conversion relationship, respectively converting the first point cloud data of the sensors to the vehicle coordinate system to obtain the second point cloud data of the sensors, wherein the first conversion relationship is obtained according to the initial setting pose of the sensor on the vehicle; merging the second point cloud data of at least two sensors among the plurality of sensors to obtain merged point cloud data; and determining the calibration state of the sensor according to the second point cloud data and the merged point cloud data.
  • the converting the first point cloud data of the sensor to the vehicle coordinate system based on the first conversion relationship includes: converting the first point cloud data of the sensor to the vehicle coordinate system using a six-degree-of-freedom transformation matrix from the initial coordinate system of the sensor to the vehicle coordinate system, wherein the six-degree-of-freedom transformation matrix includes a rotation matrix and a translation matrix.
  • the determining the calibration state of the sensor according to the first point cloud data, the second point cloud data and the merged point cloud data of the sensor includes: registering the first point cloud data of the first sensor with the merged point cloud data to obtain a second conversion relationship, wherein the first sensor is any one of the plurality of sensors; based on the second conversion relationship, converting the first point cloud data of the first sensor to the coordinate system of the merged point cloud to obtain the third point cloud data of the first sensor, wherein the coordinate system of the merged point cloud is obtained by fitting the merged point cloud data; and comparing the second point cloud data of the first sensor with the third point cloud data, and determining the calibration state of the first sensor according to the comparison result.
  • the comparing the second point cloud data of the first sensor with the third point cloud data, and determining the calibration state of the sensor according to the comparison result, includes: when the average position offset distance between corresponding point coordinates in the second point cloud data and the third point cloud data exceeds a first set threshold, determining that the calibration of the first sensor is abnormal; when the average position offset distance between corresponding point coordinates in the second point cloud data and the third point cloud data does not exceed the first set threshold, determining that the calibration of the first sensor is normal.
  • the determining the calibration state of the sensor according to the second point cloud data and the merged point cloud data includes: registering the first point cloud data of the first sensor with the merged point cloud data to obtain a second conversion relationship; when the parameter deviation between the first conversion relationship and the second conversion relationship exceeds a second set threshold, determining that the calibration of the first sensor is abnormal; when the parameter deviation between the first conversion relationship and the second conversion relationship does not exceed the second set threshold, determining that the calibration of the first sensor is normal.
  • the merging the second point cloud data of at least two sensors among the plurality of sensors to obtain the combined point cloud data includes: merging the second point cloud data of the sensors other than the first sensor among the plurality of sensors to obtain the combined point cloud data.
  • the method further includes: performing multi-frame point cloud data merging on the merged point cloud data to obtain multi-frame merged point cloud data; based on the multi-frame merged point cloud data, obtaining the relative displacement of the vehicle to obtain the moving direction of the vehicle; and comparing the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determining the calibration state of the sensor according to the comparison result.
  • the comparing the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determining the calibration state of the sensor according to the comparison result, includes: when the angle between the two directions exceeds a first angle threshold, determining that at least one sensor among the plurality of sensors is abnormally calibrated; when the angle between the two directions does not exceed the first angle threshold, determining that the calibration of the plurality of sensors is normal.
  • the method further includes: performing ground segmentation processing on the merged point cloud data to obtain ground segmentation point cloud data; and comparing the merged point cloud data with the ground segmentation point cloud data, and determining the calibration state of the sensor according to the comparison result.
  • the comparing the merged point cloud data with the ground segmentation point cloud data, and determining the calibration state of the sensor according to the comparison result, includes: when the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained from the ground segmentation point cloud exceeds a second angle threshold, determining that at least one sensor among the plurality of sensors is abnormally calibrated; when the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained from the ground segmentation point cloud does not exceed the second angle threshold, determining that the calibration of the plurality of sensors is normal.
  • the method further includes: when it is determined that the sensor calibration is abnormal, generating a sensor calibration abnormality report, and sending the abnormality report to the operation and maintenance terminal.
  • the method further includes: generating a sensor calibration detection report according to the calibration state of the sensor at a set time interval, and sending the calibration detection report to the operation and maintenance terminal.
  • a sensor detection device comprising: a first point cloud acquisition module, configured to respectively acquire the first point cloud data, in the initial coordinate system, of a plurality of sensors arranged on a vehicle; a second point cloud generation module, configured to convert the first point cloud data of the sensors to the vehicle coordinate system based on the first conversion relationship to obtain the second point cloud data of the sensors, wherein the first conversion relationship is obtained according to the initial setting pose of the sensor on the vehicle; a merged point cloud generation module, configured to merge the second point cloud data of at least two sensors among the plurality of sensors to obtain merged point cloud data; and a calibration detection module, configured to determine the calibration state of the sensor according to the second point cloud data and the merged point cloud data.
  • the second point cloud generation module, when converting the first point cloud data of the sensor to the vehicle coordinate system based on the first conversion relationship, is specifically configured to: convert the first point cloud data of the sensor to the vehicle coordinate system based on a six-degree-of-freedom transformation matrix from the initial coordinate system of the sensor to the vehicle coordinate system, wherein the six-degree-of-freedom transformation matrix includes a rotation matrix and a translation matrix.
  • the calibration detection module, when determining the calibration state of the sensor according to the first point cloud data, the second point cloud data and the merged point cloud data of the sensor, is specifically configured to: register the first point cloud data of the first sensor with the merged point cloud data to obtain a second conversion relationship, wherein the first sensor is any one of the plurality of sensors; based on the second conversion relationship, convert the first point cloud data of the first sensor to the coordinate system of the merged point cloud to obtain the third point cloud data of the first sensor, wherein the coordinate system of the merged point cloud is obtained by fitting the merged point cloud data; and compare the second point cloud data of the first sensor with the third point cloud data, and determine the calibration state of the first sensor according to the comparison result.
  • the calibration detection module compares the second point cloud data of the first sensor with the third point cloud data, and determines the calibration state of the sensor according to the comparison result, by: when the average position offset distance between corresponding point coordinates in the second point cloud data and the third point cloud data exceeds a first set threshold, determining that the calibration of the first sensor is abnormal; when the average position offset distance between corresponding point coordinates in the second point cloud data and the third point cloud data does not exceed the first set threshold, determining that the calibration of the first sensor is normal.
  • the calibration detection module determines the calibration state of the sensor according to the second point cloud data and the merged point cloud data by: registering the first point cloud data of the first sensor with the merged point cloud data to obtain a second conversion relationship; when the parameter deviation between the first conversion relationship and the second conversion relationship exceeds a second set threshold, determining that the calibration of the first sensor is abnormal; when the parameter deviation between the first conversion relationship and the second conversion relationship does not exceed the second set threshold, determining that the calibration of the first sensor is normal.
  • the merged point cloud generation module merges the second point cloud data of at least two sensors in the plurality of sensors to obtain the merged point cloud data, which is specifically used for: Among the plurality of sensors, the second point cloud data of other sensors except the first sensor are combined to obtain the combined point cloud data.
  • the device further includes a first angle comparison module, configured to: perform multi-frame point cloud data merging on the merged point cloud data to obtain multi-frame merged point cloud data; based on the multi-frame merged point cloud data, obtain the relative displacement of the vehicle to obtain the moving direction of the vehicle; and compare the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determine the calibration state of the sensor according to the comparison result.
  • the first angle comparison module compares the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determines the calibration state of the sensor according to the comparison result, and is specifically configured to: when the angle between the two directions exceeds the first angle threshold, determine that at least one sensor among the plurality of sensors is abnormally calibrated; when the angle between the two directions does not exceed the first angle threshold, determine that the calibration of the plurality of sensors is normal.
  • the device further includes a second angle comparison module, configured to: perform ground segmentation processing on the merged point cloud data to obtain ground segmentation point cloud data; and compare the merged point cloud data with the ground segmentation point cloud data, and determine the calibration state of the sensor according to the comparison result.
  • the second angle comparison module compares the merged point cloud data with the ground segmentation point cloud data, and determines the calibration state of the sensor according to the comparison result, and is specifically configured to: when the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained from the ground segmentation point cloud exceeds a second angle threshold, determine that at least one sensor among the plurality of sensors is abnormally calibrated; when the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained from the ground segmentation point cloud does not exceed the second angle threshold, determine that the calibration of the plurality of sensors is normal.
  • the device further includes an abnormal report module, configured to: generate a sensor calibration abnormal report when it is determined that there is an abnormal sensor calibration, and send the abnormal report to the operation and maintenance terminal.
  • the device further includes a detection report module, configured to: generate a sensor calibration detection report according to the calibration state of the sensor at a set time interval, and send the detection report to the operation and maintenance terminal.
  • an electronic device including: a memory for storing processor-executable instructions; and a processor configured to execute the executable instructions in the memory to implement the steps of the method according to any one of the implementations of the above-mentioned first aspect.
  • a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the steps of the method described in any one of the implementation manners of the above-mentioned first aspect are implemented.
  • a smart vehicle including the above-mentioned electronic device.
  • the technical solution provided by the present disclosure may include the following beneficial effects: through the first point cloud data, the second point cloud data and the merged point cloud data of the sensor, the vehicle can automatically detect the calibration state of the sensor and can detect sensor calibration abnormalities in real time, which shortens the detection time for sensor calibration abnormalities, simplifies the feedback process from the occurrence of a calibration abnormality to its detection, makes it convenient for operation and maintenance personnel to handle it at any time, and improves the safety of the vehicle during operation.
  • Fig. 1A is a flowchart of a sensor detection method according to an exemplary embodiment of the present disclosure.
  • Fig. 1B is a flowchart of another sensor detection method according to an exemplary embodiment of the present disclosure.
  • Fig. 2 is a flowchart of another sensor detection method according to an exemplary embodiment of the present disclosure.
  • Fig. 3 is a flowchart of another sensor detection method according to an exemplary embodiment of the present disclosure.
  • Fig. 4 is a schematic diagram of a sensor detection device according to an exemplary embodiment of the present disclosure.
  • Fig. 5 is a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
  • although the terms first, second, third, etc. may be used in the present disclosure to describe various information, the information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, first information may also be called second information, and similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "upon", "when", or "in response to determining".
  • Fig. 1A shows a flowchart of a sensor detection method according to an exemplary embodiment of the present disclosure.
  • step S101 the first point cloud data in the initial coordinate system of a plurality of sensors arranged on the vehicle are acquired respectively.
  • multiple sensors may be provided on the vehicle, and the sensors may include at least one of commonly used sensors such as lidar, multi-view cameras, and millimeter-wave radar.
  • the initial coordinate system can be set according to the position of the sensor on the vehicle, for example, the position of the sensor on the vehicle is set as the origin of the initial coordinate system.
  • the initial coordinate system can also be obtained from configuration parameters of the vehicle.
  • step S102 based on the first conversion relationship, the first point cloud data of the sensor are respectively transformed into the vehicle coordinate system to obtain the second point cloud data of the sensor.
  • the first conversion relationship is obtained according to the initial setup pose of the sensor on the vehicle and setup information of the vehicle coordinate system.
  • the vehicle coordinate system can be set according to requirements; for example, the midpoint of the front axle or the rear axle of the vehicle can be used as the origin of the vehicle coordinate system, with the direction forward along the vehicle body taken as the positive direction of the X axis, the direction to the left along the front axle or rear axle taken as the positive direction of the Y axis, and the upward direction taken as the positive direction of the Z axis.
  • step S103 the second point cloud data of at least two sensors among the plurality of sensors are combined to obtain combined point cloud data.
  • step S104 the calibration state of the sensor is determined according to the second point cloud data and the merged point cloud data.
  • the coordinate system of the merged point cloud is obtained by fitting the coordinate systems of the second point cloud data obtained by the plurality of sensors. Even if, when a sensor calibration is abnormal, the coordinate system of the second point cloud data acquired by an individual sensor deviates from the vehicle coordinate system, after point cloud data merging and coordinate system fitting the coordinate system of the merged point cloud data remains close to the vehicle coordinate system, and can therefore be used as a standard coordinate system approximating the vehicle coordinate system.
  • by comparison against this standard coordinate system, it can be determined whether the coordinate system of the second point cloud data matches the vehicle coordinate system, that is, the calibration state of the sensor can be determined.
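  • As an illustration only, the following is a minimal sketch (assuming Python with numpy; the names detect_calibration, apply_transform and register_fn are hypothetical, not taken from the disclosure) of how steps S101-S104 could be wired together; it is a sketch under these assumptions, not the patented implementation itself.
```python
import numpy as np

def apply_transform(T: np.ndarray, cloud: np.ndarray) -> np.ndarray:
    """Apply a 4x4 rigid transform to an (N, 3) point cloud."""
    return cloud @ T[:3, :3].T + T[:3, 3]

def detect_calibration(first_clouds: dict, first_transforms: dict,
                       target_sensor: str, register_fn,
                       first_set_threshold: float) -> bool:
    """Return True if the target sensor's calibration appears normal.

    first_clouds:     sensor id -> (N, 3) first point cloud data (sensor frame)
    first_transforms: sensor id -> 4x4 first conversion relationship (sensor -> vehicle)
    register_fn:      registration routine (e.g. the ICP-style sketch shown later)
    """
    # S102: convert each first point cloud to the vehicle coordinate system.
    second_clouds = {s: apply_transform(first_transforms[s], pc)
                     for s, pc in first_clouds.items()}
    # S103: merge the second point clouds of the sensors other than the target sensor.
    merged = np.vstack([pc for s, pc in second_clouds.items() if s != target_sensor])
    # S104: register the target sensor's first cloud against the merged cloud (the
    # second conversion relationship), build its third point cloud, and compare it
    # with its second point cloud by the average offset of corresponding points.
    second_transform = register_fn(first_clouds[target_sensor], merged)
    third_cloud = apply_transform(second_transform, first_clouds[target_sensor])
    offset = np.linalg.norm(second_clouds[target_sensor] - third_cloud, axis=1).mean()
    return offset <= first_set_threshold
```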
  • the present disclosure realizes automatic detection of the sensor calibration state by the vehicle through the first point cloud data, the second point cloud data and the merged point cloud data of the sensor, and can detect sensor calibration abnormalities in real time during the driving of the vehicle, which shortens the detection time for sensor calibration abnormalities, simplifies the feedback process from the occurrence of a calibration abnormality to its detection, makes it convenient for operation and maintenance personnel to handle it at any time, and improves the safety of the vehicle during operation.
  • the converting the first point cloud data of the sensor to the vehicle coordinate system based on the first conversion relationship includes: converting the first point cloud data of the sensor to the vehicle coordinate system using a six-degree-of-freedom transformation matrix from the initial coordinate system of the sensor to the vehicle coordinate system, wherein the six-degree-of-freedom transformation matrix includes a rotation matrix and a translation matrix.
  • the first conversion relationship is a conversion relationship between the sensor coordinate system and the vehicle coordinate system, which can be an affine matrix or a six-degree-of-freedom transformation matrix containing the pose parameters of the sensor and the vehicle.
  • the first transformation relationship includes a six-degree-of-freedom transformation matrix formed by a rotation matrix and a translation matrix transformed from the initial coordinate system of the sensor to the vehicle coordinate system.
  • the initial coordinate system of the first point cloud and the vehicle coordinate system are both three-dimensional Cartesian coordinate systems; the first point cloud data of the sensor is transformed into the vehicle coordinate system by the six-degree-of-freedom transformation matrix to obtain the second point cloud data of the sensor.
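  • As a minimal sketch only (assuming Python with numpy and an illustrative roll/pitch/yaw parameterization of the rotation; the function names are hypothetical), the six-degree-of-freedom transform of a point cloud from the sensor frame to the vehicle frame could look like this:
```python
import numpy as np

def rotation_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Z-Y-X (yaw-pitch-roll) rotation matrix; angles in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def to_vehicle_frame(first_cloud: np.ndarray, rpy, translation) -> np.ndarray:
    """Transform an (N, 3) first point cloud from the sensor frame to the vehicle frame."""
    R = rotation_matrix(*rpy)            # rotation part of the 6-DoF transform
    t = np.asarray(translation, float)   # translation part of the 6-DoF transform
    return first_cloud @ R.T + t         # second point cloud data

# usage with a hypothetical calibration (roll, pitch, yaw) and (x, y, z):
# second_cloud = to_vehicle_frame(first_cloud, (0.0, 0.0, 0.01), (1.2, 0.0, 1.6))
```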
  • Fig. 1B shows a flowchart of another sensor detection method according to an exemplary embodiment of the present disclosure.
  • step S104-1 the first point cloud data of the first sensor is registered with the merged point cloud data to obtain a second conversion relationship, wherein the first sensor is any one of the plurality of sensors.
  • the registration may be a process of obtaining a conversion relationship from an original point cloud to a target point cloud.
  • in this way, the second conversion relationship, which converts the coordinate system of the first point cloud data to the coordinate system of the merged point cloud data, is obtained.
  • the electronic device in the vehicle can take each sensor on the vehicle as the first sensor in turn so as to detect the calibration state of each sensor respectively, or it can take a specified sensor as the first sensor so as to detect the calibration state of that sensor in a targeted manner.
  • step S104-2 based on the second conversion relationship, the first point cloud data of the first sensor is transformed into the coordinate system of the merged point cloud to obtain the third point cloud data of the first sensor, where the coordinate system of the merged point cloud is obtained by fitting the merged point cloud data.
  • the deviation between the coordinate system obtained by fitting the merged point cloud data and the vehicle coordinate system is usually small, so it can be approximated as a standard coordinate system close to the vehicle coordinate system.
  • the coordinate system of the third point cloud data is consistent with the coordinate system of the merged point cloud data, and is obtained by fitting the merged point cloud data, so it can also be used as a standard coordinate system similar to the vehicle coordinate system.
  • the first point cloud data of the first sensor is converted to the coordinate system of the merged point cloud to obtain the third point cloud data of the first sensor.
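  • For registration, a library routine such as Open3D's registration_icp would typically be used; as an illustration only, the following is a bare-bones point-to-point ICP sketch (Python with numpy and scipy assumed, function names hypothetical, parameter values illustrative) that estimates the second conversion relationship and produces the third point cloud data:
```python
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Least-squares rigid transform (4x4) mapping src onto dst (Kabsch / SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = dst_c - R @ src_c
    return T

def register(first_cloud: np.ndarray, merged_cloud: np.ndarray,
             iterations: int = 20) -> np.ndarray:
    """Estimate the second conversion relationship (sensor frame -> merged-cloud frame)."""
    tree = cKDTree(merged_cloud)
    T = np.eye(4)
    current = first_cloud.copy()
    for _ in range(iterations):
        _, idx = tree.query(current)                  # nearest-neighbour correspondences
        step = best_fit_transform(current, merged_cloud[idx])
        current = current @ step[:3, :3].T + step[:3, 3]
        T = step @ T
    return T

# third point cloud data of the first sensor:
# T2 = register(first_cloud, merged_cloud)
# third_cloud = first_cloud @ T2[:3, :3].T + T2[:3, 3]
```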
  • step S104-3 the second point cloud data of the first sensor is compared with the third point cloud data, and the calibration state of the sensor is determined according to the comparison result.
  • since the coordinate system of the third point cloud data is consistent with the coordinate system of the merged point cloud data, it can also be used as a standard coordinate system close to the vehicle coordinate system. Therefore, by comparing the second point cloud data with the third point cloud data, it can be determined whether the coordinate system of the second point cloud data matches the vehicle coordinate system, that is, the calibration state of the first sensor can be determined.
  • the present disclosure compares the second point cloud data of the first sensor with the third point cloud data and determines the calibration state of the first sensor according to the comparison result. Because the second point cloud data and the third point cloud data are both converted from the first point cloud data of the first sensor, the two point clouds contain the same number of points, which makes the comparison convenient and allows real-time detection of sensor calibration abnormalities caused by a change in the relative position between the sensor and the vehicle; this is convenient for operation and maintenance personnel to handle at any time and improves the safety of the vehicle during operation.
  • the comparing the second point cloud data of the first sensor with the third point cloud data, and determining the calibration state of the sensor according to the comparison result, includes: when the average position offset distance between corresponding point coordinates in the second point cloud data and the third point cloud data exceeds a first set threshold, determining that the calibration of the first sensor is abnormal; when the average position offset distance between corresponding point coordinates in the second point cloud data and the third point cloud data does not exceed the first set threshold, determining that the calibration of the first sensor is normal.
  • the abnormal calibration of a sensor may be caused by two situations.
  • one situation is that the abnormal calibration of the sensor is caused by a change in the relative position of the sensor and the vehicle.
  • in this case, an average position offset distance arises between the corresponding point coordinates of the second point cloud data and the third point cloud data of the first sensor.
  • when the average offset distance exceeds the first set threshold, it is determined that the calibration of the first sensor is abnormal; when the average offset distance does not exceed the first set threshold, it is determined that the calibration of the first sensor is normal, where the first set threshold can be set according to actual needs.
  • the present disclosure can detect in real time the sensor calibration abnormality caused by the change of the relative position between the sensor and the vehicle, which is convenient for operation and maintenance personnel to handle at any time, and improves the safety of the vehicle during operation.
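  • A hedged sketch of this comparison (Python with numpy assumed; since both clouds are derived from the same first point cloud, points are assumed to correspond row by row, and the threshold value is illustrative):
```python
import numpy as np

def offset_calibration_normal(second_cloud: np.ndarray, third_cloud: np.ndarray,
                              first_set_threshold: float = 0.1) -> bool:
    """True if the average position offset of corresponding points stays within the threshold."""
    offsets = np.linalg.norm(second_cloud - third_cloud, axis=1)
    return offsets.mean() <= first_set_threshold
```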
  • the determining the calibration state of the sensor according to the second point cloud data and the merged point cloud data includes: registering the first point cloud data of the first sensor with the merged point cloud data to obtain a second conversion relationship; when the parameter deviation between the first conversion relationship and the second conversion relationship exceeds a second set threshold, determining that the calibration of the first sensor is abnormal; when the parameter deviation between the first conversion relationship and the second conversion relationship does not exceed the second set threshold, determining that the calibration of the first sensor is normal.
  • the abnormal calibration of the sensor may also be caused by a parameter error in the first conversion relationship.
  • the parameters include parameters in a six-degree-of-freedom transformation matrix formed by a rotation matrix and a translation matrix transformed from the initial coordinate system of the sensor to the vehicle coordinate system.
  • when the sensor calibration is abnormal because of a parameter error in the first conversion relationship, the parameter deviation between the first conversion relationship and the second conversion relationship exceeds the second set threshold; when the parameter deviation does not exceed the second set threshold, the calibration of the first sensor is normal.
  • the second set threshold can be set according to actual needs.
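  • As an illustration only, the parameter deviation between the two conversion relationships could be measured as a rotation-angle difference plus a translation difference (Python with numpy assumed; splitting the deviation into two thresholds and the threshold values are assumptions):
```python
import numpy as np

def conversion_deviation(T1: np.ndarray, T2: np.ndarray):
    """Return (rotation deviation in radians, translation deviation) between two 4x4 transforms."""
    dR = T1[:3, :3].T @ T2[:3, :3]
    angle = np.arccos(np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0))
    dt = np.linalg.norm(T1[:3, 3] - T2[:3, 3])
    return angle, dt

def conversion_calibration_normal(T1, T2, rot_threshold=0.05, trans_threshold=0.1) -> bool:
    angle, dt = conversion_deviation(np.asarray(T1, float), np.asarray(T2, float))
    return angle <= rot_threshold and dt <= trans_threshold
```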
  • the present disclosure can detect in real time the sensor calibration abnormality caused by the parameter error in the first conversion relationship, which is convenient for operation and maintenance personnel to handle at any time, and improves the safety of the vehicle during operation.
  • the merging the second point cloud data of at least two sensors among the multiple sensors to obtain the combined point cloud data includes: merging the second point cloud data of the sensors other than the first sensor among the multiple sensors to obtain the combined point cloud data.
  • the vehicle includes N sensors, and after the first sensor to be detected is determined, the second point cloud data of the other (N-1) sensors except the first sensor are combined to obtain the merged point cloud data.
  • Fig. 2 shows a flow chart of another sensor detection method according to an exemplary embodiment of the present disclosure.
  • step S201 multi-frame point cloud data merging is performed on the merged point cloud data to obtain multi-frame merged point cloud data.
  • the merged point cloud data obtained from the sensors are acquired at a set time interval so that multiple frames of the merged point cloud are obtained, and point cloud fusion is performed on these frames of the merged point cloud to obtain the multi-frame merged point cloud data.
  • the point cloud fusion of the multi-frame data of the point cloud is a means in the related art, and the present disclosure will not repeat it here.
  • step S202 based on the multi-frame merged point cloud data, the relative displacement of the vehicle is obtained to obtain the moving direction of the vehicle.
  • from the multi-frame merged point cloud data, the displacement of the merged point cloud over continuous time can be determined, that is, the displacement of the vehicle within the set time period, and the moving direction of the vehicle can be obtained from this displacement.
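  • As a hedged sketch only (Python with numpy assumed; using frame-to-frame registration translations as the vehicle displacement is a simplification, and register_fn stands for a registration routine such as the ICP sketch above; odometry or GNSS could equally supply the displacement):
```python
import numpy as np

def moving_direction(frames, register_fn) -> np.ndarray:
    """frames: time-ordered list of (N, 3) merged point clouds in the vehicle frame.

    register_fn(source, target) -> 4x4 rigid transform mapping source points into the
    target frame; its translation approximates the vehicle displacement between frames.
    """
    displacement = np.zeros(3)
    for prev, curr in zip(frames[:-1], frames[1:]):
        T = register_fn(curr, prev)      # current frame expressed in the previous frame
        displacement += T[:3, 3]         # accumulate per-frame vehicle displacement
    norm = np.linalg.norm(displacement)
    return displacement / norm if norm > 0 else displacement
```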
  • step S203 the positive direction of the coordinate system of the merged point cloud is compared with the moving direction of the vehicle, and the calibration state of the sensor is determined according to the comparison result.
  • when calibration is normal, the positive direction of the coordinate system of the merged point cloud should be consistent with the moving direction of the vehicle, whereas an abnormal calibration will cause an included angle between the positive direction of the coordinate system of the merged point cloud and the moving direction of the vehicle. Therefore, by comparing the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, it can be determined whether any of the sensors used to obtain the merged point cloud has a calibration abnormality, that is, the calibration state of the sensors can be determined.
  • the present disclosure uses the above method to determine the calibration state of the sensor based on the direction during the running of the vehicle, thereby improving the detection effect of the calibration state.
  • the operation and maintenance personnel can know in time whether the sensor calibration of the vehicle needs to be optimized or repaired, shortening the time from the occurrence of a calibration abnormality to its detection, so that the operation and maintenance personnel can process it in a timely manner, which improves the safety of the vehicle during operation.
  • the comparing the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determining the calibration state of the sensor according to the comparison result, includes: when the angle between the two directions exceeds the first angle threshold, determining that at least one of the multiple sensors is calibrated abnormally; when the angle between the two directions does not exceed the first angle threshold, determining that the calibration of the multiple sensors is normal.
  • when the angle between the two directions exceeds the first angle threshold, it is determined that at least one sensor among the plurality of sensors is abnormally calibrated; when the angle between the two directions does not exceed the first angle threshold, it is determined that the calibration of the plurality of sensors is normal, where the first angle threshold can be set according to actual needs.
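  • A hedged sketch of the angle check (Python with numpy assumed; taking the +X axis as the forward direction follows the vehicle coordinate system described above, and the threshold value is illustrative):
```python
import numpy as np

def heading_angle_normal(moving_dir, forward_axis=(1.0, 0.0, 0.0),
                         first_angle_threshold_deg: float = 5.0) -> bool:
    """True if the angle between the merged-cloud forward direction and the vehicle
    moving direction stays within the first angle threshold."""
    f = np.asarray(forward_axis, float)
    m = np.asarray(moving_dir, float)
    cos_a = np.dot(f, m) / (np.linalg.norm(f) * np.linalg.norm(m))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return angle <= first_angle_threshold_deg
```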
  • in this way, sensor calibration abnormality detection is performed in real time during the driving of the vehicle, which shortens the detection time for sensor calibration abnormalities and simplifies the feedback process from the occurrence of a calibration abnormality to its detection, so that it is convenient for operation and maintenance personnel to handle at any time, and the safety of the vehicle during operation is improved.
  • Fig. 3 shows a flowchart of another sensor detection method according to an exemplary embodiment of the present disclosure.
  • step S301 ground segmentation processing is performed on the merged point cloud data to obtain ground segmentation point cloud data.
  • ground segmentation processing is a method for obtaining a ground segmentation point cloud, for example the planar grid method, the point cloud normal vector method, the model fitting method, the bin grid method, and so on.
  • the specific ground segmentation methods are common methods in the related art and are not repeated in this disclosure.
  • the ground segmentation point cloud can be obtained through the ground segmentation method.
  • step S302 the combined point cloud data is compared with the ground segmentation point cloud data, and the calibration state of the sensor is determined according to the comparison result.
  • when calibration is normal, the positive direction of the coordinate system of the merged point cloud is parallel to the ground obtained from the ground segmentation point cloud, whereas an abnormal calibration will cause an included angle between the positive direction of the coordinate system of the merged point cloud and the ground obtained from the ground segmentation point cloud. Therefore, by comparing the angle between the positive direction of the coordinate system of the merged point cloud and the ground, it can be determined whether any of the multiple sensors used to obtain the merged point cloud has a calibration abnormality, that is, the calibration state of the sensors can be determined.
  • by comparing the merged point cloud data with the ground segmentation point cloud data, it can be determined in real time whether any of the multiple sensors used to obtain the merged point cloud has a calibration abnormality, that is, the calibration state of the sensors can be determined.
  • the sensor calibration abnormality detection time is shortened, and the feedback process from the occurrence of the calibration abnormality to the detection of the calibration abnormality is simplified, so that the operation and maintenance personnel can handle it at any time, and the safety of the vehicle during operation is improved.
  • the present disclosure determines the calibration state of the sensor based on the ground state through the above method, and improves the detection effect of the calibration state.
  • the operation and maintenance personnel can know in time whether the vehicle needs sensor calibration optimization or maintenance, shortening the feedback time from the occurrence of a calibration abnormality to its detection, enabling operation and maintenance personnel to process it in a timely manner and improving the safety of the vehicle during operation.
  • the comparing the merged point cloud data with the ground segmentation point cloud data, and determining the calibration state of the sensor according to the comparison result, includes: when the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained from the ground segmentation point cloud exceeds a second angle threshold, determining that at least one sensor among the plurality of sensors is abnormally calibrated; when the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained from the ground segmentation point cloud does not exceed the second angle threshold, determining that the calibration of the plurality of sensors is normal.
  • when the angle between the positive direction of the coordinate system of the merged point cloud data and the ground exceeds the second angle threshold, it is determined that at least one sensor among the plurality of sensors is abnormally calibrated; when the angle does not exceed the second angle threshold, it is determined that the calibration of the plurality of sensors is normal, where the second angle threshold can be set according to actual needs.
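  • As an illustration only, the ground could be fitted by a least-squares plane and its tilt compared against the coordinate axes (Python with numpy assumed; the simple height cut used to pick ground points and the threshold values are assumptions, not the disclosed segmentation method):
```python
import numpy as np

def ground_points(merged_cloud: np.ndarray, max_height: float = 0.2) -> np.ndarray:
    """Crude ground extraction: keep points whose Z (height) is below max_height."""
    return merged_cloud[merged_cloud[:, 2] < max_height]

def ground_normal(ground: np.ndarray) -> np.ndarray:
    """Normal of the least-squares plane through the ground points (SVD)."""
    centered = ground - ground.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    n = vt[-1]                       # direction of least variance = plane normal
    return n if n[2] >= 0 else -n

def ground_angle_normal(merged_cloud: np.ndarray,
                        second_angle_threshold_deg: float = 3.0) -> bool:
    """True if the fitted ground is (nearly) parallel to the X-Y plane of the coordinate system."""
    n = ground_normal(ground_points(merged_cloud))
    tilt = np.degrees(np.arccos(np.clip(abs(n[2]) / np.linalg.norm(n), -1.0, 1.0)))
    return tilt <= second_angle_threshold_deg
```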
  • the present disclosure compares, through the above method, the angle between the positive direction of the merged point cloud coordinate system and the ground obtained from the ground segmentation point cloud, and detects sensor calibration abnormalities in real time during vehicle driving, which shortens the detection time for sensor calibration abnormalities, simplifies the feedback process from the occurrence of a calibration abnormality to its detection, is convenient for operation and maintenance personnel to handle at any time, and improves the safety of the vehicle during operation.
  • the calibration state of the sensor may also be directly determined through the ground segmentation process.
  • the first point cloud data acquired by any two sensors installed on the vehicle are respectively subjected to ground segmentation processing; when the deviation between the height coordinates of the ground segmentation point cloud data obtained by the two sensors exceeds a third set threshold, it is determined that at least one of the sensors is abnormally calibrated; when the deviation between the height coordinates of the ground segmentation point cloud data obtained by any two sensors of the vehicle does not exceed the third set threshold, it is determined that the sensors are calibrated normally.
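  • A hedged sketch of this check (Python with numpy assumed; comparing the mean Z value of each sensor's ground-segmented cloud in the vehicle frame is a simplification, and the threshold value is illustrative):
```python
import numpy as np

def ground_heights_consistent(ground_a: np.ndarray, ground_b: np.ndarray,
                              third_set_threshold: float = 0.1) -> bool:
    """True if the ground heights seen by two sensors agree within the threshold."""
    return abs(ground_a[:, 2].mean() - ground_b[:, 2].mean()) <= third_set_threshold
```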
  • the present disclosure compares, through the above method, the height coordinates of the ground segmentation point cloud data obtained by any two sensors, detects sensor calibration abnormalities in real time during vehicle driving, shortens the detection time for sensor calibration abnormalities, and simplifies the feedback process from the occurrence of a calibration abnormality to its detection, which is convenient for operation and maintenance personnel to handle at any time and improves the safety of the vehicle during operation.
  • the method further includes: when it is determined that the sensor calibration is abnormal, generating a sensor calibration abnormality report, and sending the abnormality report to the operation and maintenance terminal.
  • the abnormality report may include all the point cloud data of the sensors obtained by the above method, as well as other data related to the operation of the vehicle.
  • the operation and maintenance personnel can perform maintenance on the sensor with the calibration abnormality, or can perform overall optimization or maintenance of the plurality of sensors.
  • the operation and maintenance personnel can promptly notify the vehicle user of the current risk.
  • in this way, when a sensor has a calibration abnormality, the abnormality can be detected in time, shortening the feedback time from the occurrence of the calibration abnormality to its detection, so that the operation and maintenance personnel can handle it in time, and the safety of the vehicle during operation is improved.
  • the method further includes: generating a sensor calibration detection report according to the calibration state of the sensor at a set time interval, and sending the calibration detection report to the operation and maintenance terminal.
  • the set time interval can be set every day or every week, and a sensor calibration detection report is generated according to the calibration state of the sensor, and the calibration detection report is sent to the operation and maintenance terminal.
  • if there is an abnormal sensor calibration within the set time, the abnormality report is included in the calibration detection report; if there is no abnormal sensor calibration within the set time, all the point cloud data of the sensors obtained by the above method and the relevant current operating data of the vehicle can be included in the calibration detection report.
  • the present disclosure enables operation and maintenance personnel to obtain the calibration state of the sensor and the running state of the vehicle within a set time, which is helpful for a comprehensive evaluation of the vehicle and improves the safety of the vehicle during operation.
  • the present disclosure also provides embodiments of an apparatus for implementing application functions and a corresponding terminal.
  • Fig. 4 shows a schematic diagram of a sensor detection device according to an exemplary embodiment of the present disclosure. The device may include: a first point cloud acquisition module 401, configured to respectively acquire the first point cloud data, in the initial coordinate system, of a plurality of sensors arranged on the vehicle; a second point cloud generation module 402, configured to convert the first point cloud data of the sensors to the vehicle coordinate system based on the first conversion relationship to obtain the second point cloud data of the sensors, wherein the first conversion relationship is obtained according to the initial setting pose of the sensor on the vehicle; a merged point cloud generation module 403, configured to merge the second point cloud data of at least two sensors among the plurality of sensors to obtain merged point cloud data; and a calibration detection module 404, configured to determine the calibration state of the sensor according to the second point cloud data and the merged point cloud data.
  • the second point cloud generation module, when converting the first point cloud data of the sensor to the vehicle coordinate system based on the first conversion relationship, is specifically configured to: convert the first point cloud data of the sensor to the vehicle coordinate system based on a six-degree-of-freedom transformation matrix from the initial coordinate system of the sensor to the vehicle coordinate system, wherein the six-degree-of-freedom transformation matrix includes a rotation matrix and a translation matrix.
  • the calibration detection module, when determining the calibration state of the sensor according to the first point cloud data, the second point cloud data and the merged point cloud data of the sensor, is specifically configured to: register the first point cloud data of the first sensor with the merged point cloud data to obtain a second conversion relationship, wherein the first sensor is any one of the plurality of sensors; based on the second conversion relationship, convert the first point cloud data of the first sensor to the coordinate system of the merged point cloud to obtain the third point cloud data of the first sensor, wherein the coordinate system of the merged point cloud is obtained by fitting the merged point cloud data; and compare the second point cloud data of the first sensor with the third point cloud data, and determine the calibration state of the first sensor according to the comparison result.
  • the calibration detection module compares the second point cloud data of the first sensor with the third point cloud data, and determines the calibration state of the sensor according to the comparison result, by: when the average position offset distance between corresponding point coordinates in the second point cloud data and the third point cloud data exceeds a first set threshold, determining that the calibration of the first sensor is abnormal; when the average position offset distance between corresponding point coordinates in the second point cloud data and the third point cloud data does not exceed the first set threshold, determining that the calibration of the first sensor is normal.
  • the calibration detection module determines the calibration state of the sensor according to the second point cloud data and the merged point cloud data by: registering the first point cloud data of the first sensor with the merged point cloud data to obtain a second conversion relationship; when the parameter deviation between the first conversion relationship and the second conversion relationship exceeds a second set threshold, determining that the calibration of the first sensor is abnormal; when the parameter deviation between the first conversion relationship and the second conversion relationship does not exceed the second set threshold, determining that the calibration of the first sensor is normal.
  • the merged point cloud generation module merges the second point cloud data of at least two sensors in the plurality of sensors to obtain the merged point cloud data, which is specifically used for: Among the plurality of sensors, the second point cloud data of other sensors except the first sensor are combined to obtain the combined point cloud data.
  • the device further includes a first angle comparison module, configured to: perform multi-frame point cloud data merging on the merged point cloud data to obtain multi-frame merged point cloud data; based on the multi-frame merged point cloud data, obtain the relative displacement of the vehicle to obtain the moving direction of the vehicle; and compare the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determine the calibration state of the sensor according to the comparison result.
  • the first angle comparison module compares the positive direction of the coordinate system of the merged point cloud with the moving direction of the vehicle, and determines the calibration state of the sensor according to the comparison result, and is specifically configured to: when the angle between the two directions exceeds the first angle threshold, determine that at least one sensor among the plurality of sensors is abnormally calibrated; when the angle between the two directions does not exceed the first angle threshold, determine that the calibration of the plurality of sensors is normal.
  • the device further includes a second angle comparison module, configured to: perform ground segmentation processing on the merged point cloud data to obtain ground segmentation point cloud data; and compare the merged point cloud data with the ground segmentation point cloud data, and determine the calibration state of the sensor according to the comparison result.
  • the second angle comparison module compares the merged point cloud data with the ground segmentation point cloud data, and determines the calibration state of the sensor according to the comparison result, and is specifically configured to: when the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained from the ground segmentation point cloud exceeds a second angle threshold, determine that at least one sensor among the plurality of sensors is abnormally calibrated; when the angle between the positive direction of the coordinate system of the merged point cloud data and the ground obtained from the ground segmentation point cloud does not exceed the second angle threshold, determine that the calibration of the plurality of sensors is normal.
  • the device further includes an abnormal report module, configured to: generate a sensor calibration abnormal report when it is determined that there is an abnormal sensor calibration, and send the abnormal report to the operation and maintenance terminal.
  • the device further includes a detection report module, configured to: generate a sensor calibration detection report according to the calibration state of the sensor at a set time interval, and send the detection report to the operation and maintenance terminal.
  • Fig. 5 shows a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
  • the device may include: a processor, a memory, a network interface and an internal bus.
  • the processor, the memory, and the network interface are connected to each other within the device through the bus.
  • the processor can be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is used to execute related programs so as to realize the technical solution provided by this application.
  • the memory can be implemented in the form of ROM (Read Only Memory, read-only memory), RAM (Random Access Memory, random access memory), static storage device, and dynamic storage device.
  • the memory can store an operating system and other application programs. When the technical solution provided by this application is implemented through software or firmware, the relevant program codes are stored in the memory and invoked by the processor for execution.
  • the network interface is used to connect the communication module (not shown in the figure), so as to realize the communication interaction between the device and other devices.
  • the communication module may communicate through wired means (such as USB or an Ethernet cable) or through wireless means (such as a mobile network, Wi-Fi, or Bluetooth).
  • the bus is a pathway that carries information between the various components of the device (e.g., the processor, the memory, and the network interface).
  • although the above-mentioned device shows only a processor, a memory, a network interface, and a bus, in a specific implementation process the device may also include other components necessary for normal operation.
  • moreover, the above-mentioned device may include only the components necessary to realize the solution of the present application, and need not include all the components shown in the figure.
  • the present disclosure also provides a non-transitory computer-readable storage medium including instructions, such as a memory including instructions, where the instructions can be executed by a processor of an electronic device to implement the steps of the above sensor detection method.
  • the non-transitory computer readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
  • the present disclosure also provides a smart vehicle, including the above-mentioned electronic device.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A sensor detection method and apparatus, and a readable storage medium. The method includes: separately acquiring first point cloud data, in an initial coordinate system, of each of a plurality of sensors provided on a vehicle (S101); converting the first point cloud data of each sensor into a vehicle coordinate system on the basis of a first conversion relationship to obtain second point cloud data of the sensor, the first conversion relationship being obtained according to the initial installation pose of the sensor on the vehicle (S102); merging the second point cloud data of at least two of the plurality of sensors to obtain merged point cloud data (S103); and determining a calibration state of the sensors according to the second point cloud data of the sensors and the merged point cloud data (S104).

Description

传感器检测的方法、装置、电子设备及可读储存介质 技术领域
本公开涉及智能驾驶技术领域,尤其涉及传感器检测的方法、装置、电子设备及可读储存介质。
背景技术
环境感知是智能驾驶系统的关键部分,是规划决策的前端输入,为规划决策提供了重要的依据。环境感知模块包括多种传感器,例如激光雷达、摄像头、毫米波雷达等,车辆通过这些传感器对周围环境进行感知,并进而依据感知到的环境信息来控制自动驾驶车辆的行驶。由于单个传感器的探测范围有限,为了减少探测盲区,车辆上会设置有多个传感器,并通过标定使得传感器分别产生的点云坐标系转换到统一的坐标系中。而当传感器与车辆的相对位置发生变化时将导致点云合并结果异常。相关技术中,主要由运维人员在车辆结束运行的离线状态下进行传感器的标定异常检测排查,因此无法实时获取传感器的标定状态。并且由于无法在车辆行驶过程中及时得知传感器标定状态,在传感器标定异常的情况下继续运行车辆,存在安全风险。
发明内容
有鉴于此,本公开提供一种传感器检测的方法、装置、电子设备及可读储存介质,以实现车辆对传感器标定状态的自动检测,缩短检测反馈时间,提升车辆运行时的安全性。
根据本公开的第一方面,提供了一种传感器检测方法,所述方法包括:分别获取设置在车辆上的多个传感器在初始坐标系下的第一点云数据;基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,得到所述传感器的第二点云数据,其中,所述第一转换关系根据所述传感器在所述车辆上的初始设置位姿得到;对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据;根据所述第二点云数据以及所述合并点云数据,确定所述传感器的标定状态。
结合本公开提供的任一实施方式,所述基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,包括:基于从所述传感器的初始坐标系转换至车辆坐标系的六自由度转换矩阵,将所述传感器的第一点云数据分别转换至车辆坐标系,其中, 所述六自由度转换矩阵包括旋转矩阵和平移矩阵。
结合本公开提供的任一实施方式,所述根据所述传感器的第一点云数据、第二点云数据以及所述合并点云数据,确定所述传感器的标定状态,包括:对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系,其中,所述第一传感器为所述多个传感器中的任一传感器;基于所述第二转换关系,将所述第一传感器的第一点云数据转换至合并点云的坐标系,得到所述第一传感器的第三点云数据,所述合并点云的坐标系由所述合并点云数据拟合得到;将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述第一传感器的标定状态。
结合本公开提供的任一实施方式,所述将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述传感器的标定状态,包括:在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离超出第一设定阈值的情况下,确定所述第一传感器标定异常;在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离未超出第一设定阈值的情况下,确定所述第一传感器标定正常。
结合本公开提供的任一实施方式,所述根据所述第二点云数据以及所述合并点云数据,确定所述传感器的标定状态,包括:对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系;在所述第一转换关系与所述第二转换关系中的参数偏差超出第二设定阈值的情况下,确定所述第一传感器标定异常;在所述第一转换关系与所述第二转换关系中的参数偏差未超出第二设定阈值的情况下,确定所述第一传感器标定正常。
结合本公开提供的任一实施方式,所述对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据,包括:对所述多个传感器中,除第一传感器外的其他传感器的第二点云数据进行合并,得到合并点云数据。
结合本公开提供的任一实施方式,所述方法还包括:对所述合并点云数据进行多帧点云数据合并,得到多帧合并点云数据;基于所述多帧合并点云数据,获取车辆相对位移,得到车辆移动方向;将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态。
结合本公开提供的任一实施方式,所述将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态,包括:在两方向夹角的角度超出第一角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异 常;在两方向夹角的角度未超出第一角度阈值的情况下,确定所述多个传感器标定正常。
结合本公开提供的任一实施方式,所述方法还包括:对所述合并点云数据进行地面分割处理,得到地面分割点云数据;将所述合并点云数据与所述地面分割点云数据进行比较,根据比较结果确定所述传感器的标定状态。
结合本公开提供的任一实施方式,所述将所述合并点云数据与所述地面分割点云数据进行比较,根据比较结果确定所述传感器的标定状态,包括:在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面间夹角的角度超出第二角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面夹角的角度未超出第二角度阈值的情况下,确定所述多个传感器标定正常。
结合本公开提供的任一实施方式,所述方法还包括:在确定所述传感器标定异常的情况下,生成传感器标定异常报告,并将所述异常报告发送至运维终端。
结合本公开提供的任一实施方式,所述方法还包括:以设定的时间间隔,根据所述传感器的标定状态生成传感器标定检测报告,并将所述标定检测报告发送至运维终端。
根据本公开的第二方面,提供了一种传感器检测装置,所述装置包括:第一点云获取模块,用于分别获取设置在车辆上的多个传感器在初始坐标系下的第一点云数据;第二点云生成模块,用于基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,得到所述传感器的第二点云数据,其中,所述第一转换关系根据所述传感器在所述车辆上的初始设置位姿得到;合并点云生成模块,用于对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据;标定检测模块,用于根据第二点云数据以及所述合并点云数据,确定所述传感器的标定状态。
结合本公开提供的任一实施方式,所述第二点云生成模块,基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,具体用于:基于从所述传感器的初始坐标系转换至车辆坐标系的六自由度转换矩阵,将所述传感器的第一点云数据分别转换至车辆坐标系,其中,所述六自由度转换矩阵包括旋转矩阵和平移矩阵。
结合本公开提供的任一实施方式,所述标定检测模块根据所述传感器的第一点云数据、第二点云数据以及所述合并点云数据,确定所述传感器的标定状态,具体用于:对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系,其中,所述第一传感器为所述多个传感器中的任一传感器;基于所述第二转换关系,将所 述第一传感器的第一点云数据转换至合并点云的坐标系,得到所述第一传感器的第三点云数据,所述合并点云的坐标系由所述合并点云数据拟合得到;将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述第一传感器的标定状态。
结合本公开提供的任一实施方式,所述标定检测模块将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述传感器的标定状态,包括:在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离超出第一设定阈值的情况下,确定所述第一传感器标定异常;在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离未超出第一设定阈值的情况下,确定所述第一传感器标定正常。
结合本公开提供的任一实时方式,所述标定检测模块根据所述第二点云数据以及所述合并点云数据,确定所述传感器的标定状态,包括:对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系;在所述第一转换关系与所述第二转换关系中的参数偏差超出第二设定阈值的情况下,确定所述第一传感器标定异常;在所述第一转换关系与所述第二转换关系中的参数偏差未超出第二设定阈值的情况下,确定所述第一传感器标定正常。
结合本公开提供的任一实施方式,所述合并点云生成模块对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据,具体用于:对所述多个传感器中,除第一传感器外的其他传感器的第二点云数据进行合并,得到合并点云数据。
结合本公开提供的任一实施方式,所述装置还包括第一角度比较模块,用于:对所述合并点云数据进行多帧点云数据合并,得到多帧合并点云数据;基于所述多帧合并点云数据,获取车辆相对位移,得到车辆移动方向;将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态。
结合本公开提供的任一实施方式,所述第一角度比较模块将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态,具体用于:在两方向夹角的角度超出第一角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;在两方向夹角的角度未超出第一角度阈值的情况下,确定所述多个传感器标定正常。
结合本公开提供的任一实施方式,所述装置还包括第二角度比较模块,用于:对所述合并点云数据进行地面分割处理,得到地面分割点云数据;将所述合并点云数据与所 述地面分割点云数据进行比较,根据比较结果确定所述传感器的标定状态。
结合本公开提供的任一实施方式,所述第二角度比较模块将所述合并点云数据与所述地面分割点云数据进行比较,根据比较结果确定所述传感器的标定状态,具体用于:在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面间夹角的角度超出第二角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面夹角的角度未超出第二角度阈值的情况下,确定所述多个传感器标定正常。
结合本公开提供的任一实施方式,所述装置还包括异常报告模块,用于:在确定所述存在传感器标定异常的情况下,生成传感器标定异常报告,并将所述异常报告发送至运维终端。
结合本公开提供的任一实施方式,所述装置还包括检测报告模块,用于:以设定的时间间隔,根据所述传感器的标定状态生成传感器标定检测报告,并将所述检测报告发送至运维终端。
根据本公开的第三方面,提供了一种电子设备,包括:存储器,用于存储处理器可执行指令;处理器,被配置为执行所述存储器中的可执行指令以实现上述第一方面任一实施方式所述方法的步骤。
根据本公开的第四方面,提供了一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现上述第一方面任一实施方式所述方法的步骤。
根据本公开的第五方面,提供了一种智能车辆,包括上述电子设备。
本公开提供的技术方案可以包括以下有益效果:通过传感器的第一点云数据、第二点云数据以及所述合并点云数据,实现车辆对传感器标定状态的自动检测,能够在车辆行驶过程中实时进行传感器标定异常检测,缩短了传感器标定异常检测时间,简化了自出现标定异常到检测出所述标定异常的反馈流程,便于运维人员能够随时进行处理,提升了车辆运行时的安全性。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本公开。
附图说明
此处的附图被并入公开中并构成本公开的一部分,示出了符合本公开的实施例,并 与公开一起用于解释本公开的原理。
图1A是本公开根据一示例性实施例示出的一种传感器检测方法的流程图。
图1B是本公开根据一示例性实施例示出的另一种传感器检测方法的流程图。
图2是本公开根据一示例性实施例示出的另一种传感器检测方法的流程图。
图3是本公开根据一示例性实施例示出的另一种传感器检测方法的流程图。
图4是本公开根据一示例性实施例示出的一种传感器检测装置示意图。
图5是本公开根据一示例性实施例示出的一种电子设备框图。
具体实施方式
这里将详细地对示例性实施例进行说明,其示例表示在附图中。下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下示例性实施例中所描述的实施方式并不代表与本公开相一致的所有实施方式。相反,它们仅是与如所附权利要求书中所详述的、本公开的一些方面相一致的装置和方法的例子。
在本公开使用的术语是仅仅出于描述特定实施例的目的,而非旨在限制本公开。在本公开和所附权利要求书中所使用的单数形式的“一种”、“所述”和“该”也旨在包括多数形式,除非上下文清楚地表示其他含义。还应当理解,本文中使用的术语“和/或”是指并包含一个或多个相关联的列出项目的任何或所有可能组合。
应当理解,尽管在本公开可能采用术语第一、第二、第三等来描述各种信息,但这些信息不应限于这些术语。这些术语仅用来将同一类型的信息彼此区分开。例如,在不脱离本公开范围的情况下,第一信息也可以被称为第二信息,类似地,第二信息也可以被称为第一信息。取决于语境,如在此所使用的词语“如果”可以被解释成为“在……时”或“当……时”或“响应于确定”。
图1A示出本公开根据一示例性实施例示出的一种传感器检测方法的流程图。
在步骤S101中,分别获取设置在车辆上的多个传感器在初始坐标系下的第一点云数据。
为了减少探测盲区,可以在车辆上设置多个传感器,所述传感器可以包括激光雷达、多目摄像头、毫米波雷达等常用的传感器中的至少一种。
通过设置在车辆中的电子设备获取车辆上多个传感器在初始坐标系下的第一点云数 据。所述初始坐标系可以根据所述传感器在车辆上的位置设置,例如将传感器在车辆上的位置设置为初始坐标系的原点。所述初始坐标系还可以从所述车辆的配置参数中获取。
在步骤S102中,基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,得到所述传感器的第二点云数据。
其中,所述第一转换关系根据所述传感器在所述车辆上的初始设置位姿,以及所述车辆坐标系的设置信息得到。其中,所述车辆坐标系可通过需求设定,例如,可以将车辆前轴或后轴中点作为车辆坐标系原点,所述前轴或后轴的中点沿车身方向向前作为X轴正方向,沿前轴或后轴向左作为Y轴正方向,向上作为Z轴正方向。
将所述第一点云数据基于第一转换关系转换为所述第二点云数据,在传感器标定正常的情况下,例如,所述第一转换关系中的参数正确,并且所述传感器的位置以及方向未与初始设置位姿发生偏移的情况下,经转换后得到的所述第二点云数据的坐标系与所述车辆坐标系一致;在传感器标定异常的情况下,例如,所述第一转换关系中的参数错误,或者所述传感器的位置以及方向与初始设置位姿发生偏移的情况下,经转换后得到的第二点云数据的坐标系与车辆坐标系不一致。
在步骤S103中,对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据。
在步骤S104中,根据第二点云数据以及所述合并点云数据,确定所述传感器的标定状态。
由所述多个传感器获得的第二点云数据的坐标系拟合得出所述合并点云坐标系。尽管在传感器标定异常的情况下,由个别传感器获取的所述第二点云数据的坐标系与所述车辆坐标系存在偏差,但经过点云数据合并及坐标系拟合后,合并点云数据的坐标系趋近于所述车辆坐标系,可以将所述合并点云数据的坐标系作为近似于车辆坐标系的标准坐标系。
通过比较所述第二点云数据与近似于所述车辆坐标系的所述合并点云数据或经所述合并点云数据处理的其他数据,能够确定所述第二点云数据的坐标系是否为所述车辆坐标系,也即确定所述传感器的标定状态。
本公开通过传感器的第一点云数据、第二点云数据以及所述合并点云数据,实现车辆对传感器标定状态的自动检测,能够在车辆行驶过程中实时进行传感器标定异常检测,缩短了传感器标定异常检测时间,简化了自出现标定异常到检测数所述标定异常的反馈 流程,便于运维人员能够随时进行处理,提升了车辆运行时的安全性。
在一个可选的实施例中,所述基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,包括:基于从所述传感器的初始坐标系转换至车辆坐标系的六自由度转换矩阵,将所述传感器的第一点云数据分别转换至车辆坐标系,其中,所述六自由度转换矩阵包括旋转矩阵和平移矩阵。
将所述第一点云数据通过第一转换关系转换至车辆坐标系得到所述第二点云数据,所述第一转换关系是所述传感器坐标系与所述车辆坐标系的转换关系,可以是包含所述传感器与车辆的位姿参数的仿射矩阵或六自由度转换矩阵。在本公开示出的一个示例中,所述第一转换关系包括从所述传感器的初始坐标系转换至车辆坐标系的旋转矩阵和平移矩阵构成的六自由度转换矩阵。
在车辆运行过程中,所述第一点云的初始坐标系与所述车辆坐标系均为三维空间直角坐标系,通过六自由度转换矩阵对所述第一点云数据进行标定,将所述传感器的第一点云数据转换至车辆坐标系,得到所述传感器的第二点云数据。
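As a minimal illustration of the six-degree-of-freedom conversion described above, the following sketch applies a rotation matrix and a translation vector (the first conversion relationship) to a sensor's first point cloud to obtain its second point cloud in the vehicle coordinate system; the use of Python/NumPy and all names are illustrative assumptions, not part of the original disclosure.

    import numpy as np

    def to_vehicle_frame(first_points, rotation, translation):
        # Apply the first conversion relationship (rotation + translation) to an
        # (N, 3) point cloud given in the sensor's initial coordinate system,
        # returning the points in the vehicle coordinate system.
        first_points = np.asarray(first_points, dtype=float)   # (N, 3)
        rotation = np.asarray(rotation, dtype=float)           # (3, 3)
        translation = np.asarray(translation, dtype=float)     # (3,)
        # Equivalent to building the 4x4 homogeneous matrix [R | t; 0 1] and
        # multiplying homogeneous point coordinates.
        return first_points @ rotation.T + translation

    # Example: identity rotation, sensor mounted 1.2 m above the vehicle origin.
    second_points = to_vehicle_frame(
        [[10.0, 0.5, -1.0]], np.eye(3), [0.0, 0.0, 1.2])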
图1B示出本公开根据一示例性实施例示出的另一种传感器检测方法的流程图。
在步骤S104-1中,对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系,其中,所述第一传感器为所述多个传感器中的任一传感器。
在本公开中,所述配准可以是获取由原始点云转换至目标点云的转换关系的过程。在一个示例中,通过对所述第一传感器的第一点云数据和所述合并点云数据进行配准,得到能够将所述第一点云数据坐标系转换至所述合并点云数据坐标系的第二转换关系。
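One possible way to carry out the registration step above is point-to-point ICP; the sketch below assumes the open-source Open3D library, and the library choice, parameter values, and variable names are assumptions for illustration rather than part of the disclosure.

    import numpy as np
    import open3d as o3d

    def register_first_to_merged(first_xyz, merged_xyz, max_corr_dist=0.2):
        # Estimate the second conversion relationship: a 4x4 transform mapping
        # the first sensor's first point cloud onto the merged point cloud.
        source = o3d.geometry.PointCloud()
        source.points = o3d.utility.Vector3dVector(np.asarray(first_xyz, dtype=float))
        target = o3d.geometry.PointCloud()
        target.points = o3d.utility.Vector3dVector(np.asarray(merged_xyz, dtype=float))
        result = o3d.pipelines.registration.registration_icp(
            source, target, max_corr_dist, np.eye(4),
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        return result.transformation  # 4x4 homogeneous matrix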
在一个示例中,车辆中的电子设备可以依次将车辆上的每个传感器作为所述第一传感器,以分别检测每个传感器的标定状态;也可以单独控制所述车辆上特定传感器作为所述第一传感器,以针对性地对该传感器进行标定状态检测。
在步骤S104-2中,基于所述第二转换关系,将所述第一传感器的第一点云数据转换至合并点云的坐标系,得到所述第一传感器的第三点云数据,所述合并点云的坐标系由所述合并点云数据拟合得到。
由于在通常情况下,多个传感器不会同时标定异常,因此通过对合并点云数据进行拟合所得到的坐标系与车辆坐标系的偏差通常较小,可以近似作为接近于车辆坐标系的标准坐标系。所述第三点云数据的坐标系与所述合并点云数据的坐标系一致,由所述合并点云数据拟合得到,因此也可以作为近似于车辆坐标系的标准坐标系。
基于由上述步骤得到的所述第二转换关系,将所述第一传感器的第一点云数据转换至合并点云的坐标系,得到所述第一传感器的第三点云数据。
在步骤S104-3中,将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述传感器的标定状态。
由于所述第三点云数据的坐标系与所述合并点云数据的坐标系一致,也可以作为近似于车辆坐标系的标准坐标系。因此通过比较所述第二点云数据与所述第三点云数据,能够确定所述第二点云数据的坐标系是否为所述车辆坐标系,也即确定所述第一传感器的标定状态。
由于合并点云数据与所述第二点云数据有可能无法直接进行比较,因此本公开通过将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述第一传感器的标定状态,其中,所述第二点云数据与第三点云数据均由所述第一传感器转换得到,两点云数据的点云数量相同,因此便于比较处理,实时检测由于所述传感器与车辆相对位置发生变化而导致的传感器标定异常,便于运维人员能够随时进行处理,提升了车辆运行时的安全性。
在一个可选实施例中,所述将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述传感器的标定状态,包括:在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离超出第一设定阈值的情况下,确定所述第一传感器标定异常;在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离未超出第一设定阈值的情况下,确定所述第一传感器标定正常。
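A minimal sketch of the comparison just described: because the second and third point clouds are both derived from the same first sensor, their points correspond one-to-one, so the mean positional offset between corresponding points can be checked against the first set threshold. The names and the example threshold value are illustrative assumptions.

    import numpy as np

    def first_sensor_calibration_ok(second_points, third_points, first_threshold=0.1):
        # Returns True when the mean offset distance between corresponding points
        # does not exceed the first set threshold, i.e. calibration is normal.
        second_points = np.asarray(second_points, dtype=float)
        third_points = np.asarray(third_points, dtype=float)
        offsets = np.linalg.norm(second_points - third_points, axis=1)
        return float(offsets.mean()) <= first_threshold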
传感器出现标定异常的情况可能由两种情况导致,在一种情况下,可能由所述传感器与车辆的相对位置发生变化而导致传感器标定异常。在由所述第一传感器与车辆相对位置发生变化使所述第一传感器出现标定异常的情况下,由所述第一传感器获取的第一点云数据与所述第二点云数据对应点坐标的平均位置存在偏移距离。在所述偏移距离超出第二设定阈值的情况下,确定所述第一传感器标定异常;在所述偏移距离未超出第二设定阈值的情况下,确定所述第一传感器标定正常,其中,所述第二设定阈值可根据实际需求设定。
本公开通过上述方法,可实时检测由于所述传感器与车辆相对位置发生变化而导致的传感器标定异常,便于运维人员能够随时进行处理,提升了车辆运行时的安全性。
在一个可选实施例中,所述根据所述第二点云数据以及所述合并点云数据,确定所 述传感器的标定状态,包括:对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系;在所述第一转换关系与所述第二转换关系中的参数偏差超出第二设定阈值的情况下,确定所述第一传感器标定异常;在所述第一转换关系与所述第二转换关系中的参数偏差未超出第二设定阈值的情况下,确定所述第一传感器标定正常。
传感器出现标定异常的情况还可能由所述第一转关关系中参数错误导致。在一种情况下,所述参数包括从所述传感器的初始坐标系转换至车辆坐标系的旋转矩阵和平移矩阵构成的六自由度转换矩阵中的参数。在由所述第一转换关系中参数错误导致所述传感器标定异常的情况下,所述第一转换关系与所述第二转换关系中的参数偏差超出第二设定阈值;在所述参数偏差未超出第一设定阈值的情况下,所述第一传感器标定正常。其中,所述第二设定阈值可根据实际需求设定。
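The parameter comparison described above could, for example, be carried out on 4x4 homogeneous matrices by measuring the relative rotation angle and the translation difference between the first and second conversion relationships; the sketch below, including the example thresholds, is an illustrative assumption rather than the prescribed implementation.

    import numpy as np

    def transform_deviation(first_T, second_T):
        # Return (rotation deviation in radians, translation deviation in meters)
        # between two 4x4 homogeneous transformation matrices.
        dR = first_T[:3, :3].T @ second_T[:3, :3]
        cos_angle = np.clip((np.trace(dR) - 1.0) / 2.0, -1.0, 1.0)  # numerical safety
        rot_dev = float(np.arccos(cos_angle))
        trans_dev = float(np.linalg.norm(first_T[:3, 3] - second_T[:3, 3]))
        return rot_dev, trans_dev

    # Usage with two illustrative matrices first_T and second_T:
    # rot_dev, trans_dev = transform_deviation(first_T, second_T)
    # abnormal = rot_dev > np.deg2rad(2.0) or trans_dev > 0.1  # example thresholds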
本公开通过上述方法,可实时检测由于所述第一转换关系中参数错误而导致的传感器标定异常,便于运维人员能够随时进行处理,提升了车辆运行时的安全性。
在一个可选实施例中,所述对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据,包括:对所述多个传感器中,除第一传感器外的其他传感器的第二点云数据进行合并,得到合并点云数据。
在一个示例中,所述车辆包括N个传感器,在确定待检测的所述第一传感器后,将除第一传感器外的其他(N-1)个传感器的第二点云数据进行合并,得到合并点云数据。
本公开通过上述方法,在所述第一传感器标定异常的情况下,通过对所述第一传感器之外的其余多个进行合并,避免了由于所述第一传感器的标定异常而影响所述合并点云数据的标定效果,提升标定状态检测效果。
图2示出了本公开根据一示例性实施例示出的另一种传感器检测方法的流程图。
在步骤S201中,对所述合并点云数据进行多帧点云数据合并,得到多帧合并点云数据。
在设定时间段内,以设定时间间隔获取所述传感器所得到的合并点云数据,可以得到所述合并点云的多帧数据,将所述合并点云的多帧数据进行点云融合,得到多帧合并点云数据。对所述点云的多帧数据进行点云融合是相关技术中的手段,本公开在此不做过多赘述。
在步骤S202中,基于所述多帧合并点云数据,获取车辆相对位移,得到车辆移动 方向。
通过所述多帧合并点云数据,可以确定所述合并点云在连续时间内的位移情况,也即车辆在所述设定时间段内的位移情况,从位移情况中可以获取车辆的移动方向。
在步骤S203中,将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态。
在传感器标定正常的情况下,所述合并点云的坐标系正方向朝向与所述车辆移动方向应当是一致的,而标定异常会导致所述合并点云的坐标系正方向朝向与所述车辆移动方向存在夹角,因此,通过比较所述合并点云的坐标系正方向与所述车辆移动方向,能够确定所述获取合并点云的多个传感器中,是否有传感器存在标定异常,也即确定所述传感器的标定状态。
本公开通过上述方法,在车辆行驶过程中基于方向确定传感器标定状态,提升标定状态检测效果。通过车辆在行驶中的标定状态检测,可及时使运维人员了解所述车辆是否需要进行传感器标定优化或维修,缩短了自出现标定异常到检测出所述标定异常的时间,使运维人员能够及时进行处理,提升了车辆运行时的安全性。
在一个可选实施例中,所述将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态,包括:在两方向夹角的角度超出第一角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;在两方向夹角的角度未超出第一角度阈值的情况下,确定所述多个传感器标定正常。
在所述多个传感器中的至少一个传感器标定异常的情况下,所述合并点云的坐标系正方向朝向与所述车辆移动方向存在夹角。在两方向夹角的角度超出第一角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;在两方向夹角的角度未超出第一角度阈值的情况下,确定所述多个传感器标定正常,其中,所述第一角度阈值可根据实际需求设定。
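A minimal sketch of the direction comparison above: the angle between the positive (forward) direction of the merged point cloud's coordinate system and the vehicle movement direction estimated from the multi-frame displacement is checked against the first angle threshold. The vector names and the example threshold value are illustrative assumptions.

    import numpy as np

    def directions_consistent(forward_axis, movement_direction, first_angle_threshold_deg=5.0):
        # Returns True when the angle between the two directions does not exceed
        # the first angle threshold, i.e. the sensors are considered normally calibrated.
        a = np.asarray(forward_axis, dtype=float)
        b = np.asarray(movement_direction, dtype=float)
        cos_angle = np.clip(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)), -1.0, 1.0)
        return float(np.degrees(np.arccos(cos_angle))) <= first_angle_threshold_deg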
本公开通过上述方法,通过比较所述合并点云坐标系正方向与所述车辆移动方向夹角,在车辆行驶过程中实时进行传感器标定异常检测,缩短了传感器标定异常检测时间,简化了自出现标定异常到检测出所述标定异常的反馈流程,便于运维人员能够随时进行处理,提升了车辆运行时的安全性。
图3示出了本公开根据一示例性实施例示出的另一种传感器检测方法的流程图。
在步骤S301中,对所述合并点云数据进行地面分割处理,得到地面分割点云数据。
地面分割处理是获取地面分割点云的方法,例如包括平面栅格法、点云法向量、模型拟合法、面元网格法等方法,具体的地面分割方法是相关技术中的常用手段,本公开在此不做赘述。在本公开实施例中,通过所述地面分割法,可以得到地面分割点云。
在步骤S302中,将所述合并点云数据与所述地面分割点云数据进行比较,根据比较结果确定所述传感器的标定状态。
在传感器标定正常的情况下,所述合并点云的坐标系正方向朝向与地面分割点云获取的地面是平行关系,而标定异常会导致所述合并点云的坐标系正方向朝向与所述地面分割点云获取的地面存在夹角,因此,通过比较所述合并点云的坐标系正方向与所述地面夹角,能够确定所述获取合并点云的多个传感器中,是否有传感器存在标定异常,也即确定所述传感器的标定状态。
通过比较所述合并点云数据与所述地面分割点云数据,能够实时确定所述获取合并点云的多个传感器中,是否有传感器存在标定异常,也即确定所述传感器的标定状态。缩短了传感器标定异常检测时间,简化了自出现标定异常到检测出所述标定异常的反馈流程,便于运维人员能够随时进行处理,提升了车辆运行时的安全性。
本公开通过上述方法,基于地面状态确定所述传感器的标定状态,提升标定状态检测效果。通过基于地面的标定状态检测,可及时使运维人员了解所述车辆是否需要进行传感器标定优化或维修,缩短了自出现标定异常到检测出所述标定异常的反馈时间,使运维人员能够及时进行处理,提升了车辆运行时的安全性。
在一个可选实施例中,所述将所述合并点云数据与所述地面分割点云数据进行比较,根据比较结果确定所述传感器的标定状态,包括:在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面间夹角的角度超出第二角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面夹角的角度未超出第二角度阈值的情况下,确定所述多个传感器标定正常。
在所述合并点云数据的坐标系正方向与由所述地面间夹角的角度超出第二角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;在所述合并点云数据的坐标系正方向与由所述地面夹角的角度未超出第二角度阈值的情况下,确定所述多个传感器标定正常,其中,所述第二角度阈值可根据实际需求设定。
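A minimal sketch of the ground-based check above: a plane is fitted to the ground-segmented point cloud, and the angle between the merged point cloud coordinate system's positive (forward) direction and that plane is compared with the second angle threshold. The SVD plane fit, the default axis, and the example threshold are illustrative assumptions.

    import numpy as np

    def ground_angle_ok(ground_points, forward_axis=(1.0, 0.0, 0.0),
                        second_angle_threshold_deg=3.0):
        # Fit a plane to the ground-segmented points and check the angle between
        # the forward axis and that plane against the second angle threshold.
        pts = np.asarray(ground_points, dtype=float)
        centered = pts - pts.mean(axis=0)
        _, _, vh = np.linalg.svd(centered, full_matrices=False)
        normal = vh[-1]                      # smallest-variance direction = plane normal
        axis = np.asarray(forward_axis, dtype=float)
        axis = axis / np.linalg.norm(axis)
        # Angle between a vector and a plane: arcsin(|v . n|) with unit vectors.
        sin_angle = np.clip(abs(axis @ normal), 0.0, 1.0)
        angle_to_plane_deg = float(np.degrees(np.arcsin(sin_angle)))
        return angle_to_plane_deg <= second_angle_threshold_deg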
本公开通过上述方法,比较所述合并点云坐标系正方向与由所述地面分割点云获取 的地面间夹角,在车辆行驶过程中实时进行传感器标定异常检测,缩短了传感器标定异常检测时间,简化了自出现标定异常到检测出所述标定异常的反馈流程,便于运维人员能够随时进行处理,提升了车辆运行时的安全性。
在一个可选实施例中,还可直接通过所述地面分割处理确定所述传感器的标定状态。
在地面处于水平并平整的情况下,将设置在车辆上的任意两个传感器获取的第一点云数据分别进行地面分割处理,在所述任意两个传感器分别获取的地面分割点云数据高度坐标超出第三设定阈值的情况下,确定所述传感器中至少一个传感器标定异常;在所述车辆任意两个传感器分别获取的地面分割点云数据高度坐标未超出第三设定阈值的情况下,确定所述传感器标定正常。
本公开通过上述方法,比较所述任意两个传感器分别获取的地面分割点云数据高度坐标,在车辆行驶过程中实时进行传感器标定异常检测,缩短了传感器标定异常检测时间,简化了自出现标定异常到检测出所述标定异常的反馈流程,便于运维人员能够随时进行处理,提升了车辆运行时的安全性。
在一个可选的实施例中,所述方法还包括:在确定所述传感器标定异常的情况下,生成传感器标定异常报告,并将所述异常报告发送至运维终端。所述异常报告,可以包括通过上述方法获取的由传感器获取的全部点云数据,和其他车辆运行相关数据。
在能够确定车辆中具体传感器标定异常的情况下,运维人员能够针对出现标定异常的传感器进行维修处理,在确定车辆中所述多个传感器中至少一个传感器存在标定异常的情况下,运维人员可以对所述多个传感器进行整体优化或维修。此外,在由于所述传感器标定异常导致车辆存在安全隐患的情况下,运维人员可以及时通知车辆用户当前风险。
本公开通过上述方法,在所述传感器出现标定异常的情况下,该异常能够被及时检测,缩短了自出现标定异常到检测出所述标定异常的反馈时间,使运维人员能够及时进行处理,提升了车辆运行时的安全性。
在一个可选的实施例中,所述方法还包括:以设定的时间间隔,根据所述传感器的标定状态生成传感器标定检测报告,并将所述标定检测报告发送至运维终端。
所述设定的时间间隔,可以以每日或每周为设定时间,根据所述传感器的标定状态生成传感器标定检测报告,并将所述标定检测报告发送至运维终端。在设定时间内所述传感器出现标定异常的情况下,将所述异常报告包含至所述标定检测报告;在设定时 间内所述传感器未出现标定异常的情况下,可以将通过上述方法获取的由传感器获取的全部点云数据和所述车辆当前的相关运行数据包含至所述标定检测报告。
本公开通过上述方法,使得运维人员能够获取设定时间内所述传感器的标定状态以及所述车辆运行状态,有助于对车辆的全面评估,提升车辆在运行时的安全性。
对于前述的各方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本公开并不受所描述的动作顺序的限制,因为依据本公开,某些步骤可以采用其他顺序或者同时进行。
其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于可选实施例,所涉及的动作和模块并不一定是本公开所必须的。
与前述应用功能实现方法实施例相对应,本公开还提供了应用功能实现装置及相应的终端的实施例。
图4示出了本公开根据一示例性实施例示出的一种传感器检测装置示意图,所述装置可以包括:第一点云获取模块401,用于分别获取设置在车辆上的多个传感器在初始坐标系下的第一点云数据;第二点云生成模块402,用于基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,得到所述传感器的第二点云数据,其中,所述第一转换关系根据所述传感器在所述车辆上的初始设置位姿得到;合并点云生成模块403,用于对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据;标定检测模块404,用于根据第二点云数据以及所述合并点云数据,确定所述传感器的标定状态。
结合本公开提供的任一实施方式,所述第二点云生成模块,基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,具体用于:基于从所述传感器的初始坐标系转换至车辆坐标系的六自由度转换矩阵,将所述传感器的第一点云数据分别转换至车辆坐标系,其中,所述六自由度转换矩阵包括旋转矩阵和平移矩阵。
结合本公开提供的任一实施方式,所述标定检测模块根据所述传感器的第一点云数据、第二点云数据以及所述合并点云数据,确定所述传感器的标定状态,具体用于:对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系,其中,所述第一传感器为所述多个传感器中的任一传感器;基于所述第二转换关系,将所述第一传感器的第一点云数据转换至合并点云的坐标系,得到所述第一传感器的第三点云数据,所述合并点云的坐标系由所述合并点云数据拟合得到;将所述第一传感器的 第二点云数据与第三点云数据进行比较,根据比较结果确定所述第一传感器的标定状态。
结合本公开提供的任一实施方式,所述标定检测模块将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述传感器的标定状态,包括:在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离超出第一设定阈值的情况下,确定所述第一传感器标定异常;在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离未超出第一设定阈值的情况下,确定所述第一传感器标定正常。
结合本公开提供的任一实时方式,所述标定检测模块根据所述第二点云数据以及所述合并点云数据,确定所述传感器的标定状态,包括:对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系;在所述第一转换关系与所述第二转换关系中的参数偏差超出第二设定阈值的情况下,确定所述第一传感器标定异常;在所述第一转换关系与所述第二转换关系中的参数偏差未超出第二设定阈值的情况下,确定所述第一传感器标定正常。
结合本公开提供的任一实施方式,所述合并点云生成模块对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据,具体用于:对所述多个传感器中,除第一传感器外的其他传感器的第二点云数据进行合并,得到合并点云数据。
结合本公开提供的任一实施方式,所述装置还包括第一角度比较模块,用于:对所述合并点云数据进行多帧点云数据合并,得到多帧合并点云数据;基于所述多帧合并点云数据,获取车辆相对位移,得到车辆移动方向;将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态。
结合本公开提供的任一实施方式,所述第一角度比较模块将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态,具体用于:在两方向夹角的角度超出第一角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;在两方向夹角的角度未超出第一角度阈值的情况下,确定所述多个传感器标定正常。
结合本公开提供的任一实施方式,所述装置还包括第二角度比较模块,用于:对所述合并点云数据进行地面分割处理,得到地面分割点云数据;将所述合并点云数据与所述地面分割点云数据进行比较,根据比较结果确定所述传感器的标定状态。
结合本公开提供的任一实施方式,所述第二角度比较模块将所述合并点云数据与 所述地面分割点云数据进行比较,根据比较结果确定所述传感器的标定状态,具体用于:在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面间夹角的角度超出第二角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面夹角的角度未超出第二角度阈值的情况下,确定所述多个传感器标定正常。
结合本公开提供的任一实施方式,所述装置还包括异常报告模块,用于:在确定所述存在传感器标定异常的情况下,生成传感器标定异常报告,并将所述异常报告发送至运维终端。
结合本公开提供的任一实施方式,所述装置还包括检测报告模块,用于:以设定的时间间隔,根据所述传感器的标定状态生成传感器标定检测报告,并将所述检测报告发送至运维终端。对于装置实施例而言,由于其基本对应于方法实施例,所以相关之处参见方法实施例的部分说明即可。以上所描述的装置实施例仅仅是示意性的,其中上述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本公开方案的目的。本领域普通技术人员在不付出创造性劳动的情况下,即可以理解并实施。
图5示出了本公开根据一示例性实施例示出的一种电子设备框图。
如图5所示,该设备可以包括:处理器、存储器、网络接口和内部总线。其中处理器、存储器、网络接口通过总线实现彼此之间在设备内部的通信连接。
处理器可以采用通用的CPU(Central Processing Unit,中央处理器)、微处理器、应用专用集成电路(Application Specific Integrated Circuit,ASIC)、或者一个或多个集成电路等方式实现,用于执行相关程序,以实现本申请所提供的技术方案。
存储器可以采用ROM(Read Only Memory,只读存储器)、RAM(Random Access Memory,随机存取存储器)、静态存储设备,动态存储设备等形式实现。存储器可以存储操作系统和其他应用程序,在通过软件或者固件来实现本申请所提供的技术方案时,相关的程序代码保存在存储器中,并由处理器来调用执行。
网络接口用于连接通信模块(图中未示出),以实现本设备与其他设备的通信交互。其中通信模块可以通过有线方式(例如USB、网线等)实现通信,也可以通过无线方式(例如移动网络、WIFI、蓝牙等)实现通信。
总线包括一通路,在设备的各个组件(例如处理器、存储器、网络接口)之间传输信息。
需要说明的是,尽管上述设备仅示出了处理器、存储器、网络接口以及总线,但是在具体实施过程中,该设备还可以包括实现正常运行所必需的其他组件。此外,本领域的技术人员可以理解的是,上述设备中也可以仅包含实现本申请方案所必需的组件,而不必包含图中所示的全部组件。
在示例性实施例中,本公开还提供了一种包括指令的非临时性计算机可读存储介质,例如包括指令的存储器,上述指令可由电子设备的处理器执行,以实上述无线耳机连接方法的步骤。例如,所述非临时性计算机可读存储介质可以是ROM、随机存取存储器(RAM)、CD-ROM、磁带、软盘和光数据存储设备等。
在示例性实施例中,本公开还提供了一种智能车辆,包括上述电子设备。
本领域技术人员在考虑说明书及实践这里公开的公开后,将容易想到本公开的其它实施方案。本申请旨在涵盖本公开的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本公开的一般性原理并包括本公开未公开的本技术领域中的公知常识或惯用技术手段。说明书和实施例仅被视为示例性的,本公开的真正范围和精神由下面的权利要求指出。
应当理解的是,本公开并不局限于上面已经描述并在附图中示出的精确结构,并且可以在不脱离其范围进行各种修改和改变。本公开的范围仅由所附的权利要求来限制。

Claims (16)

  1. 一种传感器检测方法,包括:
    分别获取设置在车辆上的多个传感器在初始坐标系下的第一点云数据;
    基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,得到所述传感器的第二点云数据,其中,所述第一转换关系根据所述传感器在所述车辆上的初始设置位姿得到;
    对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据;
    根据所述第二点云数据以及所述合并点云数据,确定所述传感器的标定状态。
  2. 根据权利要求1所述的方法,其特征在于,所述基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,包括:
    基于从所述传感器的初始坐标系转换至车辆坐标系的六自由度转换矩阵,将所述传感器的第一点云数据分别转换至车辆坐标系,其中,所述六自由度转换矩阵包括旋转矩阵和平移矩阵。
  3. 根据权利要求1所述的方法,其特征在于,所述根据所述传感器的第一点云数据、第二点云数据以及所述合并点云数据,确定所述传感器的标定状态,包括:
    对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系,其中,所述第一传感器为所述多个传感器中的任一传感器;
    基于所述第二转换关系,将所述第一传感器的第一点云数据转换至合并点云的坐标系,得到所述第一传感器的第三点云数据,所述合并点云的坐标系由所述合并点云数据拟合得到;
    将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述第一传感器的标定状态。
  4. 根据权利要求3所述的方法,其特征在于,所述将所述第一传感器的第二点云数据与第三点云数据进行比较,根据比较结果确定所述传感器的标定状态,包括:
    在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离超出第一设定阈值的情况下,确定所述第一传感器标定异常;
    在所述第二点云数据与所述第三点云数据中对应点坐标的平均位置偏移距离未超出第一设定阈值的情况下,确定所述第一传感器标定正常。
  5. 根据权利要求1所述的方法,其特征在于,所述根据所述第二点云数据以及所述合并点云数据,确定所述传感器的标定状态,包括:
    对所述第一传感器的第一点云数据与所述合并点云数据进行配准,得到第二转换关系;
    在所述第一转换关系与所述第二转换关系中的参数偏差超出第二设定阈值的情况下,确定所述第一传感器标定异常;
    在所述第一转换关系与所述第二转换关系中的参数偏差未超出第二设定阈值的情况下,确定所述第一传感器标定正常。
  6. 根据权利要求3所述的方法,其特征在于,所述对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据,包括:
    对所述多个传感器中,除第一传感器外的其他传感器的第二点云数据进行合并,得到合并点云数据。
  7. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    对所述合并点云数据进行多帧点云数据合并,得到多帧合并点云数据;
    基于所述多帧合并点云数据,获取车辆相对位移,得到车辆移动方向;
    将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态。
  8. 根据权利要求7所述的方法,其特征在于,所述将所述合并点云的坐标系正方向朝向与所述车辆移动方向进行比较,根据比较结果确定所述传感器的标定状态,包括:
    在两方向夹角的角度超出第一角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;
    在两方向夹角的角度未超出第一角度阈值的情况下,确定所述多个传感器标定正常。
  9. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    对所述合并点云数据进行地面分割处理,得到地面分割点云数据;
    将所述合并点云数据与所述地面分割点云数据进行比较;
    根据比较结果确定所述传感器的标定状态。
  10. 根据权利要求9所述的方法,其特征在于,所述将所述合并点云数据与所述地面分割点云数据进行比较,根据比较结果确定所述传感器的标定状态,包括:
    在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面间夹角的角度超出第二角度阈值的情况下,确定所述多个传感器中的至少一个传感器标定异常;
    在所述合并点云数据的坐标系正方向与由所述地面分割点云获取的地面夹角的角度未超出第二角度阈值的情况下,确定所述多个传感器标定正常。
  11. 根据权利要求1至10任一项所述的方法,其特征在于,所述方法还包括:
    在确定所述传感器标定异常的情况下,生成传感器标定异常报告,并将所述异常报告发送至运维终端。
  12. 根据权利要求1至10任一项所述的方法,其特征在于,所述方法还包括:
    以设定的时间间隔,根据所述传感器的标定状态生成传感器标定检测报告,并将所述标定检测报告发送至运维终端。
  13. 一种传感器检测装置,包括:
    第一点云获取模块,用于分别获取设置在车辆上的多个传感器在初始坐标系下的第一点云数据;
    第二点云生成模块,用于基于第一转换关系,将所述传感器的第一点云数据分别转换至车辆坐标系,得到所述传感器的第二点云数据,其中,所述第一转换关系根据所述传感器在所述车辆上的初始设置位姿得到;
    合并点云生成模块,用于对所述多个传感器中的至少两个传感器的第二点云数据进行合并,得到合并点云数据;
    标定检测模块,用于根据所述传感器的第一点云数据、第二点云数据以及所述合并点云数据,确定所述传感器的标定状态。
  14. 一种电子设备,其特征在于,所述电子设备包括:
    存储器,用于存储处理器可执行指令;
    处理器,被配置为执行所述存储器中的可执行指令以实现权利要求1至12任一项所述方法的步骤。
  15. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,所述程序被处理器执行时实现权利要求1至12任一项所述的方法的步骤。
  16. 一种智能车辆,其特征在于,包括权利要求14所述的电子设备。
PCT/CN2022/071109 2021-12-07 2022-01-10 传感器检测的方法、装置、电子设备及可读储存介质 WO2023103143A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111486395.7 2021-12-07
CN202111486395.7A CN115235525B (zh) 2021-12-07 2021-12-07 传感器检测方法、装置、电子设备及可读储存介质

Publications (1)

Publication Number Publication Date
WO2023103143A1 true WO2023103143A1 (zh) 2023-06-15

Family

ID=83666049

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/071109 WO2023103143A1 (zh) 2021-12-07 2022-01-10 传感器检测的方法、装置、电子设备及可读储存介质

Country Status (2)

Country Link
CN (1) CN115235525B (zh)
WO (1) WO2023103143A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112146682A (zh) * 2020-09-22 2020-12-29 福建牧月科技有限公司 智能汽车的传感器标定方法、装置、电子设备及介质
WO2021001341A1 (de) * 2019-07-02 2021-01-07 Valeo Schalter Und Sensoren Gmbh Kalibrieren eines aktiven optischen sensorsystems anhand eines kalibriertargets
CN112241007A (zh) * 2020-07-01 2021-01-19 北京新能源汽车技术创新中心有限公司 自动驾驶环境感知传感器的标定方法、布置结构及车辆
CN112577517A (zh) * 2020-11-13 2021-03-30 上汽大众汽车有限公司 一种多元定位传感器联合标定方法和系统
WO2021098448A1 (zh) * 2019-11-18 2021-05-27 商汤集团有限公司 传感器标定方法及装置、存储介质、标定系统和程序产品
CN113064143A (zh) * 2020-08-14 2021-07-02 百度(美国)有限责任公司 用于具有多LiDAR传感器的自动驾驶车辆的再校准确定系统
CN113610745A (zh) * 2021-01-20 2021-11-05 腾讯科技(深圳)有限公司 标定评价参数获取方法和装置、存储介质及电子设备

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6649191B2 (ja) * 2016-06-29 2020-02-19 クラリオン株式会社 車載処理装置
CN107167090A (zh) * 2017-03-13 2017-09-15 深圳市速腾聚创科技有限公司 车辆外廓尺寸测量方法及系统
US11346950B2 (en) * 2018-11-19 2022-05-31 Huawei Technologies Co., Ltd. System, device and method of generating a high resolution and high accuracy point cloud
CN116577760A (zh) * 2019-08-21 2023-08-11 深圳市速腾聚创科技有限公司 一种外参标定装置
CN111007530B (zh) * 2019-12-16 2022-08-12 武汉汉宁轨道交通技术有限公司 激光点云数据处理方法、装置及系统
CN113748693B (zh) * 2020-03-27 2023-09-15 深圳市速腾聚创科技有限公司 路基传感器的位姿校正方法、装置和路基传感器
CN113487479B (zh) * 2021-06-30 2024-08-02 北京易控智驾科技有限公司 车端实时检测识别高精度地图边界方法和系统
CN113658256B (zh) * 2021-08-16 2024-07-16 智道网联科技(北京)有限公司 基于激光雷达的目标检测方法、装置及电子设备

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021001341A1 (de) * 2019-07-02 2021-01-07 Valeo Schalter Und Sensoren Gmbh Kalibrieren eines aktiven optischen sensorsystems anhand eines kalibriertargets
WO2021098448A1 (zh) * 2019-11-18 2021-05-27 商汤集团有限公司 传感器标定方法及装置、存储介质、标定系统和程序产品
CN112241007A (zh) * 2020-07-01 2021-01-19 北京新能源汽车技术创新中心有限公司 自动驾驶环境感知传感器的标定方法、布置结构及车辆
CN113064143A (zh) * 2020-08-14 2021-07-02 百度(美国)有限责任公司 用于具有多LiDAR传感器的自动驾驶车辆的再校准确定系统
CN112146682A (zh) * 2020-09-22 2020-12-29 福建牧月科技有限公司 智能汽车的传感器标定方法、装置、电子设备及介质
CN112577517A (zh) * 2020-11-13 2021-03-30 上汽大众汽车有限公司 一种多元定位传感器联合标定方法和系统
CN113610745A (zh) * 2021-01-20 2021-11-05 腾讯科技(深圳)有限公司 标定评价参数获取方法和装置、存储介质及电子设备

Also Published As

Publication number Publication date
CN115235525B (zh) 2023-05-23
CN115235525A (zh) 2022-10-25

Similar Documents

Publication Publication Date Title
US20230185292A1 (en) Industrial safety monitoring configuration using a digital twin
CN112819896B (zh) 传感器的标定方法及装置、存储介质和标定系统
US10974730B2 (en) Vehicle perception system on-line diangostics and prognostics
JPWO2018079297A1 (ja) 故障検知装置
US20210043079A1 (en) Vehicle information interacting method, device, apparatus and storage medium
US20200300967A1 (en) Sensor verification
WO2021017072A1 (zh) 基于激光雷达的 slam 闭环检测方法及检测系统
CN111684382A (zh) 可移动平台状态估计方法、系统、可移动平台及存储介质
JP2019191145A (ja) 充電台の識別方法、装置、ロボット及びコンピュータ読取可能な記憶媒体
KR20220136300A (ko) 레이저 레이더와 위치 결정 기기의 캘리브레이션 방법, 기기 및 자율 주행 차량
US12008784B2 (en) Systems and methods for image-based electrical connector assembly detection
CN111752825A (zh) 一种面向智能电动汽车的即插即用软件平台及其检测方法
EP4279950A1 (en) Fault diagnosis and handling method for vehicle-mounted laser radar, apparatus, medium and vehicle
Li et al. Occupancy grid map formation and fusion in cooperative autonomous vehicle sensing
CN112946609A (zh) 激光雷达与相机的标定方法、装置、设备及可读存储介质
WO2023103143A1 (zh) 传感器检测的方法、装置、电子设备及可读储存介质
CN110426714B (zh) 一种障碍物识别方法
CN117392241B (zh) 自动驾驶中的传感器标定方法、装置及电子设备
KR20220046274A (ko) 고장 진단 시스템 및 그 방법
CN112835029A (zh) 面向无人驾驶的多传感器障碍物探测数据融合方法及系统
CN115236696B (zh) 确定障碍物的方法、装置、电子设备及存储介质
CN115100287B (zh) 外参标定方法及机器人
CN110232714B (zh) 深度相机的标定方法及系统
CN112180910B (zh) 一种移动机器人障碍物感知方法和装置
CN114227699A (zh) 机器人动作调整方法、设备以及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22902588

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE