CN112859106A - Laser radar, laser detection method and vehicle comprising laser radar - Google Patents
- Publication number: CN112859106A
- Application number: CN201911185467.7A
- Authority
- CN
- China
- Prior art keywords
- image acquisition
- radar
- acquisition unit
- image
- laser
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The invention relates to a lidar comprising: a radar opto-mechanical unit configured to emit probe beams at a plurality of angles in a first plane and to receive radar echoes; a control unit configured to generate point cloud data from the radar echoes; and an image acquisition unit interface adapted to be externally connected with an image acquisition unit, wherein the image acquisition unit is configured to acquire image data around the lidar in synchronization with the radar opto-mechanical unit under the control of the control unit.
Description
Technical Field
The invention relates to the field of optoelectronic technology, and in particular to a lidar, a laser detection method, and a vehicle comprising the lidar.
Background
LiDAR (laser radar) is the general name for active laser detection sensor equipment. Its working principle is roughly as follows: the transmitter of the lidar emits a laser beam; when the beam meets an object, it is diffusely reflected and part of the light returns to the laser receiver. The radar module multiplies the time interval between the transmitted and received signals by the speed of light and divides by two to calculate the distance between the transmitter and the object. Depending on the number of laser beams, lidars are commonly classified as, for example, single-line, 4-line, or 8/16/32/64-line. One or more laser beams are emitted at different angles in the vertical direction and scanned in the horizontal direction to detect the three-dimensional profile of the target area. The multiple measurement channels (lines) correspond to scan planes at multiple tilt angles, so the more laser beams in the vertical field of view, the higher the vertical angular resolution and the greater the density of the laser point cloud.
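The range rule described above (interval times the speed of light, divided by two for the round trip) can be sketched in a few lines. This is an illustrative snippet, not part of the patent; the function name and timestamp parameters are assumptions.

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(t_emit_s: float, t_receive_s: float) -> float:
    """Range to the target from emission and reception timestamps (seconds),
    following the time-of-flight rule: c * interval / 2."""
    round_trip = t_receive_s - t_emit_s
    return C * round_trip / 2.0

# A 1-microsecond round trip corresponds to roughly 150 m of range.
```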
A lidar point cloud alone is not always intuitive enough to identify and judge the surrounding environment, so supplementary image information needs to be provided. Existing products that combine a lidar with a camera are not flexible enough: the camera type and mounting position are fixed and cannot meet customers' requirements. Moreover, the spatial and temporal synchronization between the lidar data and the camera data is not accurate, because the two streams are fused only after the pictures have been obtained; time synchronization is therefore poor, which in turn makes spatial synchronization harder. Some filtering algorithms may even discard valid data.
The statements in this background section merely describe prior art known to the inventors and do not necessarily represent the prior art in the field.
Disclosure of Invention
In view of at least one of the problems of the prior art, the present invention provides a lidar comprising:
a radar opto-mechanical unit configured to emit probe beams at a plurality of angles in a first plane and to receive radar echoes;
a control unit configured to generate point cloud data from the radar echo; and
an image acquisition unit interface adapted to be externally connected with an image acquisition unit, wherein the image acquisition unit is configured to acquire image data around the lidar in synchronization with the radar opto-mechanical unit under the control of the control unit.
According to one aspect of the invention, the lidar further comprises the image acquisition unit, which is arranged separately from the radar opto-mechanical unit and the control unit; the control unit triggers the image acquisition unit to perform an image data acquisition operation through the image acquisition unit interface, and the image data of the image acquisition unit can be sent to the control unit through the image acquisition unit interface.
According to one aspect of the present invention, the lidar comprises N image acquisition unit interfaces and N image acquisition units, where N is greater than 1, each image acquisition unit is connected to one of the image acquisition unit interfaces, and the fields of view of any two image acquisition units are at least partially non-overlapping.
According to one aspect of the present invention, the control unit is configured to: when the lidar emits a probe beam or receives a radar echo at one angle in the first plane, trigger one of the image acquisition units to acquire an image corresponding to that angle.
According to one aspect of the present invention, the image acquisition unit includes a lens and an area-array/linear-array CMOS or an area-array/linear-array CCD.
According to one aspect of the invention, the controller is configured to register the point cloud data for a plurality of angles of the first plane with the respective images.
The invention also provides a laser detection method, which comprises the following steps:
step S101: emitting a probe beam at one of the angles in a first plane and receiving a radar echo;
step S102: generating point cloud data corresponding to the angle according to the radar echo;
step S103: acquiring an image corresponding to the angle in the first plane; and
step S104: registering the point cloud data with the image.
According to an aspect of the present invention, the step S103 is performed when a probe beam is emitted or a radar echo is received at one of the angles in the first plane.
According to one aspect of the present invention, the step S103 includes: acquiring an image corresponding to the angle in the first plane through an area-array/linear-array CMOS or an area-array/linear-array CCD.
According to one aspect of the present invention, the laser detection method further includes: repeating the steps S101, S102, S103 and S104 at a plurality of angles in the first plane.
According to one aspect of the invention, the laser detection method is implemented by a lidar according to any of claims 1-6.
The invention also provides a vehicle comprising a laser detection system according to any of claims 1-6.
With the scheme of the embodiments of the invention, the cost of the lidar can be partially reduced, the required installation space is small, and dynamic calibration is convenient. Because the circuit board and chip for the opto-electronic processing of the image acquisition unit are arranged inside the lidar body, good fusion of the generated RGB image with the point cloud image can be ensured. In addition, the lidar body is provided with an image acquisition unit interface connected to an image acquisition unit arranged outside the lidar body, so the position and angle of the external image acquisition unit are not restricted and the degree of freedom in use is higher. Because the control board inside the lidar body (the lower cabin board) is responsible both for controlling the image acquisition unit and for controlling the laser emitting device and/or the photodetecting part of the radar opto-mechanical unit, i.e. the two are controlled by the same control unit, synchronization can be improved. Meanwhile, the point cloud image and the RGB image can reference each other, so that better alignment can be achieved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 shows a schematic diagram of a lidar in accordance with an embodiment of the invention;
FIG. 2 illustrates a lidar in accordance with a preferred embodiment of the present invention;
FIG. 3 illustrates a signal processing scheme of a lidar in accordance with a preferred embodiment of the present invention;
fig. 4 illustrates a laser detection method according to a preferred embodiment of the present invention.
Detailed Description
In the following, only certain exemplary embodiments are briefly described. As those skilled in the art will recognize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present invention. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like indicate orientations and positional relationships based on those shown in the drawings. They are used only for convenience and simplicity of description and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be construed as limiting the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the description of the present invention, it should be noted that, unless otherwise explicitly stated or limited, the terms "mounted", "connected", and "coupled" are to be construed broadly, for example as a fixed connection, a removable connection, or an integral connection; as a mechanical or electrical connection or a communication link; and as a direct connection or an indirect connection through intervening media, or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
In the present invention, unless otherwise expressly stated or limited, a first feature being "above" or "below" a second feature means that the first and second features are in direct contact, or that the first and second features are not in direct contact but contact each other via another feature between them. Moreover, the first feature being "on", "above", or "over" the second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature. The first feature being "under", "below", or "beneath" the second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
The following disclosure provides many different embodiments or examples for implementing different features of the invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or uses of other materials.
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
Fig. 1 shows a schematic diagram of a lidar 10 according to an embodiment of the invention, described in detail below with reference to fig. 1.
As shown in fig. 1, the lidar 10 includes a radar opto-mechanical unit 11, a control unit 12, and an image acquisition unit interface 13. The radar opto-mechanical unit 11 may include a laser emitting unit and a laser receiving unit. The laser emitting unit is configured to emit probe beams at a plurality of angles in a first plane (typically a plurality of angles in a horizontal plane) and may include, for example, a light source such as an edge emitting laser (EEL) or a vertical cavity surface emitting laser (VCSEL). The laser receiving unit receives the radar echo and may include, for example, a photodetector such as an APD, SiPM, or SPAD, which converts the optical signal into an electrical signal.
The radar opto-mechanical unit 11 may be the opto-mechanical unit of a mechanical radar or of a solid-state radar. Taking a mechanical radar as an example, it can rotate in a horizontal plane (the first plane) about the vertical axis O in fig. 1 and emit probe beams at various angles in the horizontal plane, for example every 0.1 degrees (each group of probe beams may include, for example, 16, 32, 40, 64, or 128 laser beams directed at different angles in a vertical plane (the second plane), so that objects of different heights can be detected). The probe beam is diffusely reflected by an obstacle outside the lidar, and part of the reflected light (the radar echo) returns to the lidar, where it is received by a detector and converted into an electrical signal; this is not described in further detail here.
The control unit 12 is configured to generate point cloud data from the radar echoes. The detector converts the received radar echo from an optical signal into an electrical signal, and the control unit 12 receives the electrical signal and performs corresponding signal processing, including but not limited to amplification, filtering, and analog-to-digital conversion, thereby generating the point cloud data. The point cloud data may characterize one or more of the distance, angle, and reflectivity of obstacles outside the lidar. According to a preferred embodiment of the present invention, the control unit 12 may be implemented by a lower cabin board in the lidar.
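As a hedged illustration of how an echo measured at known angles becomes a point cloud sample carrying distance, angle, and reflectivity, the conversion from polar measurements to Cartesian coordinates can be sketched as below. The axis convention (x toward azimuth 0, z up) and the function name are assumptions, not taken from the patent.

```python
import math

def to_point(distance_m: float, azimuth_deg: float,
             elevation_deg: float, reflectivity: float):
    """Convert one echo (range, horizontal angle, vertical angle, intensity)
    into an (x, y, z, reflectivity) point-cloud sample."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z, reflectivity)
```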
The image acquisition unit interface 13 is adapted to be externally connected with an image acquisition unit, and the image acquisition unit can acquire image data around the lidar in synchronization with the radar opto-mechanical unit 11 under the control of the control unit 12. Fig. 1 schematically shows four image acquisition unit interfaces, 13-1, 13-2, 13-3, and 13-4. The present invention is not limited to a specific number of image acquisition unit interfaces; it may include more or fewer of them, for example 1, 2, 3, 5, 6, or more, as determined by factors such as the image capturing range, the viewing angle of each image acquisition unit, and the resolution.
Preferably, the number N of image capturing unit interfaces is greater than 1, each image capturing unit is connected to one of the image capturing unit interfaces, and the fields of view of any two image capturing units are at least partially non-overlapping.
As shown in fig. 1, a double arrow is provided between the control unit 12 and the image capturing unit interface 13, wherein a downward arrow indicates that the control unit 12 can send a start or trigger instruction to an external image capturing unit through the image capturing unit interface 13; the upward arrow indicates that the external image capturing unit can transmit the photoelectric signal or image data of the captured image to the control unit 12 through the image capturing unit interface 13.
Those skilled in the art will readily appreciate that the bidirectional arrows, or bidirectional communication, shown in fig. 1 are a preferred embodiment of the present invention; unidirectional arrows or unidirectional communication, i.e. only downward arrows or only upward arrows, also fall within the scope of the present invention. The control unit triggers the image acquisition unit to acquire image data through the image acquisition unit interface, and the image data of the image acquisition unit can be sent to the control unit through the same interface. In this way, the control unit 12 can control both the transmission and reception of the radar opto-mechanical unit 11 and the image acquisition by the image acquisition unit, and synchronize the two.
According to a preferred embodiment of the present invention, when the control unit 12 controls the radar opto-mechanical unit 11 to emit a probe pulse at an angle in the first plane (the horizontal plane in fig. 1), it triggers the image acquisition unit to perform an image data acquisition operation. Alternatively, when the control unit 12 controls the radar opto-mechanical unit 11 to receive the radar echo at an angle in the horizontal plane, it triggers the image acquisition unit to acquire an image corresponding to that angle. The range of the image collected by the image acquisition unit corresponds to, or includes, the current detection position of the radar opto-mechanical unit 11.
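The trigger scheme just described, where one control unit both drives the scan and fires the camera, can be sketched as follows. All names here are hypothetical placeholders for the hardware, not APIs from the patent.

```python
class ControlUnit:
    """Sketch: the same control unit steps the opto-mechanical scan and
    triggers the external camera, so the image is captured at the instant
    a probe beam is emitted at a given azimuth."""

    def __init__(self, camera_trigger):
        self.camera_trigger = camera_trigger  # callable taking the azimuth (degrees)
        self.log = []

    def emit_probe_beam(self, angle_deg: float):
        # Placeholder for driving the laser emitting unit at this azimuth.
        self.log.append(("emit", angle_deg))

    def fire_at(self, angle_deg: float):
        self.emit_probe_beam(angle_deg)
        # Trigger the external image acquisition unit through its interface
        # in the same step, keeping image and echo synchronized.
        self.camera_trigger(angle_deg)
        self.log.append(("image", angle_deg))
```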
After receiving the image data uploaded by the external image acquisition unit (rather than a finished picture), the control unit 12 may blend it into the point cloud data of the lidar and output them together. In this way, the final output includes, in addition to the point cloud data, image data for each point, such as RGB values. During subsequent signal processing, the surrounding environment can not only be reconstructed from the point cloud data but can also be reproduced more realistically through the fused image data.
According to a preferred embodiment of the invention, as shown in fig. 2, the image acquisition units (14-1, 14-2) are part of the lidar. The control unit 12 comprises a first controller 12-1 and a second controller 12-2. The first controller 12-1 is, for example, integrated on the lower cabin board of the lidar, is electrically connected to the image acquisition units, and can control the exposure time of each image acquisition unit, for example in synchronization with the emission of a probe beam or the reception of a radar echo by the radar opto-mechanical unit 11. The second controller 12-2 is electrically connected to each image acquisition unit and is configured to receive the photoelectric signal of each unit and perform image data processing, such as generating an RGB image; the generated RGB image is then transmitted to the first controller 12-1, which fuses the point cloud image and the RGB image together.
According to a preferred embodiment of the present invention, the image acquisition unit is provided separately from the radar opto-mechanical unit 11 (and the control unit 12). The radar opto-mechanical unit 11 and the control unit 12 can be arranged on the radar body of the lidar, while the image acquisition unit is arranged outside the radar opto-mechanical unit 11 and the control unit 12, i.e. outside the lidar, and is connected to the image acquisition unit interface 13 through a flexible data line, through which it is connected to the lidar and exchanges information. In this way, the deployment of the lidar can be more flexible and convenient; for example, the body of the lidar can be arranged on the roof of a vehicle, while the image acquisition units are arranged at the four corners of the vehicle.
As shown in fig. 2, the image acquisition unit includes a lens 14-1 and a photosensor 14-2. The lens 14-1 focuses light beams from the outside onto the photosensor 14-2. The photosensor is, for example, a CMOS sensor array or a CCD array, which generates electrical signals representing image pixels.
According to a preferred embodiment of the present invention, the photosensor 14-2 may be an area-array/linear-array CMOS or an area-array/linear-array CCD, where the lens 14-1 converges light beams from the outside onto the sensor, which converts the optical signals into electrical signals and outputs them. The point cloud data and the RGB image of the lidar may then be aligned line by line, using one beam line as the unit (for example, covering objects within a 0.1-degree range in the horizontal direction), which improves accuracy and further improves the effect of fusing data from different detectors. According to one embodiment of the invention, the registration of the point cloud data with the image data may be performed by the control unit 12 (e.g., the first controller 12-1), which outputs the registered and fused result.
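The line-to-line alignment just described amounts to mapping a scan azimuth to the sensor column exposed for that slice. The sketch below illustrates that mapping; the field-of-view parameters and column count are illustrative assumptions, not values from the patent.

```python
def column_for_azimuth(azimuth_deg: float, fov_start_deg: float,
                       fov_deg: float, n_columns: int) -> int:
    """Map a scan azimuth to the image column covering that angular slice,
    so one point-cloud line (e.g. a 0.1-degree step) can be paired with
    one sensor line of an area-array/linear-array sensor."""
    frac = ((azimuth_deg - fov_start_deg) % 360.0) / fov_deg
    return min(int(frac * n_columns), n_columns - 1)
```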
According to one embodiment, the image acquisition unit does not include additional image processing circuitry; instead, the electrical signals acquired by the photosensor 14-2 are uploaded directly to the control unit 12 (e.g., the lower cabin board) of the lidar, and the control unit 12 performs the corresponding signal processing and generates image data (e.g., RGB images). Alternatively, the lens 14-1 and the area-array/linear-array CMOS or CCD 14-2 may be part of an area-array/linear-array camera; in that case the data uploaded to the control unit 12 of the lidar may already be RGB image data, without the control unit 12 having to generate image data from the photosensor's electrical signals. Both variants are within the scope of the present invention.
According to a preferred embodiment of the present invention, the control unit 12 may acquire image signals synchronously in the following manner. For illustration, the lidar 10 includes four image acquisition units, each covering a 90-degree field of view so as to achieve seamless splicing of a 360-degree field of view in the horizontal plane; the four units cover 0-90 degrees, 90-180 degrees, 180-270 degrees, and 270-360 degrees respectively. The radar opto-mechanical unit 11 of the lidar 10 rotates 360 degrees in the horizontal plane, transmitting probe beams and receiving radar echoes every 0.1 degrees. When transmitting and receiving at, for example, the 60-degree position, the control unit 12 synchronously triggers the first image acquisition unit so that its area-array/linear-array CMOS or CCD is exposed in alignment with the 60-degree azimuth, acquires an image of that position, and uploads the image data, which the control unit 12 fuses with the point cloud data at the 60-degree position.
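Selecting which of the four units to trigger for a given azimuth reduces to a one-line lookup under the seamless 90-degree tiling of this example. This sketch assumes equal sectors starting at azimuth 0, as in the example above; a 60-degree azimuth falls to the first unit (index 0).

```python
def camera_index(azimuth_deg: float, n_cameras: int = 4) -> int:
    """With n_cameras tiling the horizontal plane seamlessly into equal
    sectors, return the index of the unit whose field of view contains
    the given azimuth."""
    sector = 360.0 / n_cameras
    return int((azimuth_deg % 360.0) // sector)
```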
Fig. 3 shows a signal processing scheme of the lidar according to a preferred embodiment of the present invention. The portions shown by the dotted lines can be arranged inside the lidar body, the image acquisition unit (a CMOS photosensitive device in the figure) is arranged outside the lidar body, and the circuit for processing the photoelectric signals of the image acquisition unit is also arranged inside the lidar body. In this way, the circuits and chips for RGB image processing are placed in the body of the lidar, and the lower cabin board of the lidar can control both the external image acquisition unit and the laser emitting device and/or detecting device in the radar opto-mechanical unit. In a specific implementation, the lower cabin board of an existing lidar can be used, or an additional circuit board can be arranged near the existing lower cabin board.
In the embodiments of the invention, an external image acquisition unit interface is added on the body of the lidar and connected to an image acquisition unit (such as an area-array/linear-array CMOS or CCD) arranged outside the body of the lidar; the number of interfaces can be set according to actual requirements.
With the technical scheme of the embodiments of the invention, spatial and temporal synchronization can be better achieved, helping the lidar and the subsequent signal processing system to perform data fusion and data plausibility processing. If an area-array/linear-array CMOS or CCD is used, a user-customized sensor can be adopted; if a non-area-array/linear-array image acquisition unit is used, an interface conversion board can be arranged inside the lidar.
With the scheme of the embodiments of the invention, the cost of the lidar can be partially reduced, the required installation space is small, and dynamic calibration is convenient. Because the circuit board and chip for the opto-electronic processing of the image acquisition unit are arranged inside the lidar body, good fusion of the generated RGB image with the point cloud image can be ensured. In addition, the lidar body is provided with an image acquisition unit interface connected to an image acquisition unit arranged outside the lidar body, so the position and angle of the external image acquisition unit are not restricted and the degree of freedom in use is higher. Because the control board inside the lidar body (the lower cabin board) is responsible both for controlling the image acquisition unit and for controlling the laser emitting device and/or the photodetecting part of the radar opto-mechanical unit, i.e. the two are controlled by the same control unit, synchronization can be improved. Meanwhile, the point cloud image and the RGB image can reference each other, so that better alignment can be achieved.
Fig. 4 illustrates a laser detection method 100 according to a preferred embodiment of the present invention, which is described in detail below with reference to fig. 4. The laser detection method 100 may be implemented, for example, by the laser radar 10 described above.
In step S101: a probe beam is emitted at one of the angles in the first plane and a radar echo is received.
In step S102: generating point cloud data corresponding to the angle according to the radar echo.
In step S103: acquiring an image corresponding to the angle in the first plane.
In step S104: registering the point cloud data with the image.
Those skilled in the art will readily appreciate that although steps S101, S102, S103, and S104 are described above in a certain order, the scope of the present invention is not limited to the order of the individual steps. The steps described above may be performed in another order, or some of the steps may be performed simultaneously, all within the scope of the present invention.
According to a preferred embodiment of the present invention, the step S103 is performed when a probe beam is emitted or a radar echo is received at one of the angles in the first plane, so that the point cloud data can be synchronized with the image.
According to a preferred embodiment of the present invention, the step S103 includes: acquiring an image corresponding to the angle in the first plane through an area-array/linear-array CMOS or an area-array/linear-array CCD.
According to a preferred embodiment of the present invention, the laser detection method further comprises: repeating the steps S101, S102, S103 and S104 at a plurality of angles in the first plane.
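The repetition of steps S101-S104 over the angles of the first plane can be sketched as a simple loop. The four callables below stand in for the radar and camera hardware and the processing stages of the method; they are hypothetical placeholders, not APIs defined by the patent.

```python
def detect_scan(angles_deg, emit_and_receive, make_points, capture_image, register):
    """One full scan: repeat S101-S104 at each azimuth in the first plane
    and collect the registered (point cloud, image) results."""
    results = []
    for angle in angles_deg:
        echo = emit_and_receive(angle)           # S101: emit beam, receive echo
        points = make_points(echo, angle)        # S102: echo -> point cloud data
        image = capture_image(angle)             # S103: synchronized image
        results.append(register(points, image))  # S104: register the two
    return results
```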
According to a preferred embodiment of the invention, the laser detection method is implemented by a lidar according to any of claims 1-6.
The invention also relates to a vehicle comprising a laser detection system as described above. Preferably, the vehicle includes four image acquisition units, which are respectively disposed at four corners of the vehicle, and each image acquisition unit is responsible for acquiring an image within a certain field range.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (12)
1. A lidar comprising:
a radar opto-mechanical unit configured to emit probe beams at a plurality of angles in a first plane and to receive radar echoes;
a control unit configured to generate point cloud data from the radar echo; and
an image acquisition unit interface adapted to be externally connected to an image acquisition unit, wherein the image acquisition unit is configured to acquire image data of the surroundings of the lidar synchronously with the radar opto-mechanical unit under the control of the control unit.
2. The lidar of claim 1, further comprising the image acquisition unit, disposed separately from the radar opto-mechanical unit and the control unit, wherein the control unit triggers the image acquisition unit to perform an image data acquisition operation through the image acquisition unit interface, and the image data of the image acquisition unit can be sent to the control unit through the image acquisition unit interface.
3. The lidar of claim 2, wherein the lidar comprises N image acquisition unit interfaces and N image acquisition units, N being greater than 1, wherein each image acquisition unit is connected to one of the image acquisition unit interfaces, and the fields of view of any two image acquisition units are at least partially non-overlapping.
4. The lidar according to claim 3, wherein the control unit is configured to: when the lidar emits a probe beam or receives a radar echo at one angle in the first plane, trigger one of the image acquisition units to acquire an image corresponding to that angle.
5. The lidar according to any one of claims 2-4, wherein the image acquisition unit comprises a lens and an area-array/line-array CMOS or an area-array/line-array CCD.
6. The lidar of claim 5, wherein the control unit is configured to register the point cloud data for a plurality of angles in the first plane with the respective images.
7. A laser detection method, comprising:
step S101: emitting a probe beam at one of the angles in a first plane and receiving a radar echo;
step S102: generating point cloud data corresponding to the angle according to the radar echo;
step S103: acquiring an image corresponding to the angle in the first plane; and
step S104: registering the point cloud data with the image.
8. The laser detection method of claim 7, wherein the step S103 is performed when a probe beam is emitted or a radar echo is received at one of the angles in the first plane.
9. The laser detection method according to claim 7 or 8, wherein the step S103 includes: acquiring the image corresponding to the angle in the first plane by means of an area-array/line-array CMOS or an area-array/line-array CCD.
10. The laser detection method of claim 7 or 8, further comprising: repeating the steps S101, S102, S103 and S104 at a plurality of angles in the first plane.
11. The laser detection method according to claim 7 or 8, wherein the laser detection method is implemented by a lidar according to any one of claims 1-6.
12. A vehicle comprising a laser detection system as claimed in any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911185467.7A CN112859106A (en) | 2019-11-27 | 2019-11-27 | Laser radar, laser detection method and vehicle comprising laser radar |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112859106A true CN112859106A (en) | 2021-05-28 |
Family
ID=75985069
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911185467.7A Pending CN112859106A (en) | 2019-11-27 | 2019-11-27 | Laser radar, laser detection method and vehicle comprising laser radar |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112859106A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114296094A (en) * | 2021-12-31 | 2022-04-08 | 探维科技(苏州)有限公司 | Radar detection method, device, system and medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2012139675A (en) * | 2012-09-18 | 2014-03-27 | Министерство Промышленности И Торговли Российской Федерации | CIRCLE REVIEW RADAR STATION |
CN107219533A (en) * | 2017-08-04 | 2017-09-29 | 清华大学 | Laser radar point cloud and image co-registration formula detection system |
US20180081060A1 (en) * | 2014-10-31 | 2018-03-22 | James W. Justice | Active Continuous Awareness Surveillance System (ACASS): a Multi-mode 3D LIDAR for Diverse Applications |
CN107991681A (en) * | 2017-11-22 | 2018-05-04 | 杭州爱莱达科技有限公司 | Laser radar and its scan method based on diffraction optics |
CN108957478A (en) * | 2018-07-23 | 2018-12-07 | 上海禾赛光电科技有限公司 | Multisensor synchronous sampling system and its control method, vehicle |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2021200905B2 (en) | Synchronized spinning lidar and rolling shutter camera system | |
US20220357427A1 (en) | Hybrid lidar receiver and lidar methods | |
JP3374175B2 (en) | Data transmission device with data transmission function for position display | |
US9435891B2 (en) | Time of flight camera with stripe illumination | |
US11269065B2 (en) | Muilti-detector with interleaved photodetector arrays and analog readout circuits for lidar receiver | |
JP2018152632A (en) | Imaging apparatus and imaging method | |
US20050088644A1 (en) | Surface profile measurement | |
CN110988842A (en) | Laser radar 2D receiver array architecture | |
US11662443B2 (en) | Method and apparatus for determining malfunction, and sensor system | |
JP2015081921A (en) | Sensor including scanning unit moving around rotation shaft | |
CN110986816B (en) | Depth measurement system and measurement method thereof | |
GB2374743A (en) | Surface profile measurement | |
CN112859106A (en) | Laser radar, laser detection method and vehicle comprising laser radar | |
CN116930920A (en) | Laser radar and laser radar control method | |
EP4070125A1 (en) | Time of flight sensing method | |
CN113406710A (en) | Detector module, detector device and inspection device | |
CN111474552A (en) | Laser ranging method and device and self-moving equipment | |
KR20220037939A (en) | 3d imaging device with digital micromirror device and operating method thereof | |
US20220317303A1 (en) | Optical sensor | |
JP7215472B2 (en) | Imaging device and imaging method | |
CN212160075U (en) | Sensor receiving chip, distance measuring sensing device and laser distance measuring system | |
WO2022006718A1 (en) | Tof measurement system | |
CN115902818A (en) | Signal detection system, radar system and detection method of image fusion laser | |
KR20220037940A (en) | 3d imaging device with digital micromirror device and operating method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
CB02 | Change of applicant information |
Address after: Building No. 2, No. 468 Xinlai Road, Jiading District, Shanghai, 201821
Applicant after: Shanghai Hesai Technology Co., Ltd.
Address before: Building No. 2, No. 468 Xinlai Road, Jiading District, Shanghai, 201821
Applicant before: Shanghai Hesai Technology Co., Ltd.