CN118424320A - Road condition identification and vehicle path planning method for unstructured road - Google Patents
Road condition identification and vehicle path planning method for unstructured road
- Publication number
- CN118424320A (application CN202410895619.7A)
- Authority
- CN
- China
- Prior art keywords
- road
- passed
- road section
- determining
- road surface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1652—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention provides a road condition identification and vehicle path planning method for unstructured roads, and relates to the technical field of automatic driving. According to the invention, detection data of a road section to be passed on an unstructured road, such as laser radar data, inertial data and camera data, are coordinate-fused to obtain the position coordinates of each point of the road section to be passed. Roadblock judgment is then carried out to determine the obstacle condition of the road section to be passed, such as the distance between an obstacle and the vehicle and the type, gradient and size of the obstacle, and path planning is carried out based on the obstacle condition to obtain the vehicle path of the road section to be passed. Vehicle navigation on the unstructured road is thereby realized, and the accident risk of the vehicle on the unstructured road is reduced.
Description
Technical Field
The invention relates to the technical field of automatic driving, in particular to a road condition identification and vehicle path planning method for unstructured roads.
Background
In the field of automatic driving, an automatic driving vehicle senses the surrounding environment through sensors such as a laser radar, a millimeter wave radar, an ultrasonic radar and a camera, performs obstacle detection and type recognition, and estimates the position, size, direction and type of each obstacle so as to plan the path and speed of the automatic driving vehicle. However, such systems are less sensitive to, and less capable of identifying, the road surface environment on unstructured roads. Unstructured roads, such as those in mines and fields, have uncertain road surface conditions and complex types of obstacles.
The existing navigation technology mainly focuses on obstacle avoidance through the identification of environmental obstacles; the influence of road surface conditions on driving is rarely considered, and a vehicle generally only needs to identify lane lines. On unstructured roads in the field, there are no lane lines to identify, and there may not even be an existing road to drive on. Besides environmental obstacles, road surface irregularities are among the main factors affecting the driving path. Because current navigation technology cannot accurately identify the road surface condition and obstacle information, vehicles have difficulty travelling on unstructured roads, and vehicle safety accidents easily occur.
Disclosure of Invention
The invention provides a road condition identification and vehicle path planning method for an unstructured road, which can realize vehicle navigation on the unstructured road and reduce accident risk of vehicles on the unstructured road.
In a first aspect, the present invention provides a method for identifying road conditions and planning vehicle paths for unstructured roads, the method comprising: acquiring detection data of a road section to be passed on an unstructured road, wherein the detection data comprises laser radar data, inertia data and camera shooting data; carrying out coordinate fusion and feature analysis on the detection data to obtain road surface information of the road section to be passed, wherein the road surface information comprises position coordinates of each point on the road section to be passed; judging a roadblock based on road surface information of a road section to be passed, and determining the obstacle condition of the road section to be passed; obstacle conditions include distance of the obstacle from the vehicle, type of obstacle, grade, and size; and planning a path based on the obstacle condition of the road section to be passed to obtain the vehicle path of the road section to be passed.
In one possible implementation, the method is applied to a road condition recognition navigation system, wherein the road condition recognition navigation system comprises a laser radar, an inertial sensor and a camera; hard connection is adopted among the laser radar, the inertial sensor and the camera; the mass center of the inertial sensor is coaxial with the mass center of the laser radar in the vertical direction; the lidar data includes: laser emission angle, laser horizontal rotation angle and laser detection distance; the inertial data includes: inertial course angle, inertial pitch angle, inertial roll angle; coordinate fusion is carried out on the detection data to obtain the road surface information of the road section to be passed, and the method comprises the following steps: determining a radar course angle, a radar pitch angle and a radar roll angle of the laser radar based on the inertial data and the corresponding relation between the inertial data and the laser radar data; for any point on a road section to be passed, determining an X coordinate of the point based on a laser detection distance, a laser emission angle, a radar pitch angle and a laser horizontal rotation angle of the point; determining the Y coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the laser horizontal rotation angle of the point; and determining the Z coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the radar roll angle of the point.
In one possible implementation manner, the road barrier judgment is performed based on the road surface information of the road section to be passed, and the determining of the obstacle condition of the road section to be passed includes: for any point on a road section to be passed, determining a first distance between the point and a vehicle based on the X coordinate of the point; determining a second distance between the point location and the standard surface based on the Z coordinate of the point location; the standard surface is a horizontal surface where the transmitting center of the laser radar is positioned; determining the type of the obstacle at the point position based on the second distance of the point position and the preset laser radar emission height; the laser radar emission height is the distance between a standard surface and an ideal road surface, and the ideal road surface is the horizontal plane where the vehicle is positioned when the pitch angle and the roll angle of the vehicle are zero; determining the gradient of the point based on the Z coordinate and the X coordinate of the point, the Z coordinate and the X coordinate of the adjacent point of the point and the preset laser radar emission height; and determining the size of the obstacle at the point position based on the X coordinate and the Y coordinate of the point position and the X coordinate and the Y coordinate of the adjacent point position of the point position.
In one possible implementation manner, determining the type of the obstacle at the point location based on the second distance of the point location and the preset laser radar emission height includes: calculating the difference between the laser radar emission height and the second distance of the point location, and determining the difference as the road surface difference of the point location; if the pavement difference of the point location is smaller than the negative value of the first pavement difference, determining that the point location is a pit-type obstacle; if the road surface difference of the point location is larger than the first road surface difference, determining the point location as a convex obstacle.
In one possible implementation manner, the route planning is performed based on the obstacle condition of the road section to be passed, to obtain the vehicle route of the road section to be passed, including: determining a plurality of paths on a road section to be passed based on the obstacle condition; calculating the comfort level and the passing efficiency of each path; based on the comfort and the traffic efficiency, a vehicle path of the road section to be traffic is determined among the plurality of paths.
In one possible implementation manner, before performing path planning based on the obstacle condition of the road section to be passed to obtain the vehicle path of the road section to be passed, the method further includes: determining the road condition grade of the road section to be passed based on the obstacle condition of the road section to be passed; if the road condition grade is the primary road condition or the secondary road condition, determining that the road section to be passed can be passed; if the road condition grade is the three-level road condition or the four-level road condition, determining whether the road section to be passed can be passed based on the performance parameters of the target vehicle and the obstacle condition of the road section to be passed; and if the road condition grade is the five-level road condition, determining that the road section to be passed cannot be passed.
In one possible implementation manner, determining the road condition grade of the road section to be passed based on the obstacle condition of the road section to be passed includes: determining the lowest road surface of the pit-type obstacle and the highest road surface of the convex-type obstacle on the road section to be passed based on the obstacle condition of the road section to be passed; determining the maximum value of the road surface difference of the road section to be passed based on the lowest road surface and the highest road surface; the maximum value of the road surface difference is the maximum value of the distance between the lowest road surface and the ideal road surface and the distance between the highest road surface and the ideal road surface; the ideal road surface is the horizontal plane where the vehicle is located when the pitch angle and the roll angle of the vehicle are zero; determining the maximum gradient of pit-type barriers and protruding-type barriers on a road section to be passed based on the gradient of various barriers in the barrier condition of the road section to be passed; if the maximum value of the road surface difference is smaller than or equal to the first road surface difference and the maximum gradient is smaller than or equal to the first gradient, determining the road section to be passed as a first-level road condition; if the maximum value of the road surface difference is larger than the first road surface difference and smaller than or equal to the second road surface difference, and the maximum gradient is larger than the first gradient and smaller than or equal to the second gradient, determining that the road section to be passed is a secondary road condition; if the maximum value of the road surface difference is larger than the second road surface difference and smaller than or equal to the third road surface difference, and the maximum gradient is larger than the second gradient and smaller than or equal to the third gradient, determining that the road section to be passed is a three-level road condition; if the maximum value of the road surface difference is larger than the third road surface difference and smaller than or equal to the fourth road surface difference, and the maximum gradient is larger than the third gradient and smaller than or equal to the fourth gradient, determining that the road section to be passed is a four-level road condition; if the maximum value of the road surface difference is larger than the fourth road surface difference and the maximum gradient is larger than the fourth gradient, determining that the road section to be passed is a five-level road condition; the first road surface difference is smaller than the second road surface difference, the second road surface difference is smaller than the third road surface difference, the third road surface difference is smaller than the fourth road surface difference, the first gradient is smaller than the second gradient, the second gradient is smaller than the third gradient, and the third gradient is smaller than the fourth gradient.
In one possible implementation manner, determining whether the road section to be passed is passable based on the passing capability of the target vehicle and the obstacle condition of the road section to be passed includes: acquiring performance parameters of the target vehicle, the performance parameters including chassis height, maximum grade, approach angle, departure angle and minimum turning radius; determining the traffic capability of the target vehicle based on the performance parameters of the target vehicle; and determining whether the target vehicle has the capability of passing through the road section to be passed based on the traffic capability of the target vehicle and the road condition grade of the road section to be passed.
In one possible implementation manner, based on the road surface information of the road section to be passed, the method further includes, before determining the obstacle condition of the road section to be passed: extracting Z coordinates of each point on a road section to be passed to form a target data set; based on the target data set, performing cluster analysis to obtain a cluster center of the target data set; and updating the initial value of the laser radar emission height based on the clustering center to obtain the updated laser radar emission height.
Based on the target data set, performing cluster analysis to obtain a cluster center of the target data set includes: step one, randomly determining any data in the target data set as a clustering center; step two, calculating the sum of the distances between each data in the target data set and the clustering center, and determining the sum as the minimum distance sum; step three, updating the value of the clustering center again; step four, recalculating the sum of the distances between each data and the clustering center, and determining the sum as a second value; step five, if the second value is smaller than or equal to the minimum distance sum, determining the second value as the minimum distance sum; if the second value is greater than the minimum distance sum, the minimum distance sum is kept unchanged; step six, if the current iteration number is smaller than the maximum iteration number, repeating steps three to six; if the current iteration number is greater than or equal to the maximum iteration number, exiting the iteration process and executing step seven; step seven, determining the cluster center corresponding to the minimum distance sum as the cluster center of the target data set.
In a second aspect, an embodiment of the present invention provides a road condition recognition navigation system for unstructured roads, including: the data acquisition module is used for acquiring detection data of a road section to be passed on an unstructured road, wherein the detection data comprises laser radar data, inertia data and camera shooting data; the data processing module is used for carrying out coordinate fusion on the detection data to obtain road surface information of the road section to be passed, wherein the road surface information comprises position coordinates of each point on the road section to be passed; judging a roadblock based on road surface information of a road section to be passed, and determining the obstacle condition of the road section to be passed; obstacle conditions include distance of the obstacle from the vehicle, type of obstacle, grade, and size; and planning a path based on the obstacle condition of the road section to be passed to obtain the vehicle path of the road section to be passed.
In a third aspect, an embodiment of the present invention provides a terminal device, the terminal device comprising a memory storing a computer program and a processor for invoking and running the computer program stored in the memory to perform the steps of the method according to the first aspect and any possible implementation manner of the first aspect.
In a fourth aspect, embodiments of the present invention provide a computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to the first aspect and any one of the possible implementations of the first aspect.
The invention provides a road condition identification and vehicle path planning method for an unstructured road.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flow chart of a road condition recognition and vehicle path planning method for unstructured roads provided by an embodiment of the invention;
FIG. 2 is a schematic view of lidar detection when a vehicle is in an ideal plane according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of lidar detection when a vehicle is on a bumpy road surface according to an embodiment of the present invention;
fig. 4 is a schematic diagram of laser radar detection under three-dimensional rectangular coordinates according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of laser radar detection in the YOZ plane according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a road condition recognition navigation system for unstructured roads according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In the description of the present invention, "/" means "or" unless otherwise indicated; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and means that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. Further, "at least one" means one or more, and "a plurality" means two or more. The terms "first," "second," and the like do not limit the number or order of execution, and do not necessarily denote different objects.
In embodiments of the application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g." in an embodiment should not be taken as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion that may be readily understood.
Furthermore, references to the terms "comprising" and "having" and any variations thereof in the description of the present application are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules but may, alternatively, include other steps or modules not listed or inherent to such process, method, article, or apparatus.
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the following description will be made with reference to the accompanying drawings of the present invention by way of specific embodiments.
As shown in fig. 1, the embodiment of the invention provides a road condition recognition and vehicle path planning method for unstructured roads. The method is applied to a road condition recognition navigation system. The road condition recognition navigation system comprises a laser radar, an inertial sensor and a camera; hard connection is adopted among the laser radar, the inertial sensor and the camera; the centroid of the inertial sensor is coaxial with the centroid of the lidar in the vertical direction. The method comprises steps S101-S104.
S101, acquiring detection data of a road section to be passed on an unstructured road.
In the embodiment of the application, the detection data comprise laser radar data, inertial data and camera shooting data.
In some embodiments, the inertial sensor, the lidar and a monocular camera may be mounted at fixed relative positions on the roof of the vehicle. The lidar and the inertial sensor are rigidly connected by a hardware mount, their centroids are kept coaxial in the vertical direction, and the camera is held in a fixed position relative to the lidar and the inertial sensor by a dedicated mounting bracket.
In some embodiments, the lidar data comprises: laser emission angle, laser horizontal rotation angle and laser detection distance.
In some embodiments, the inertial data includes: inertial course angle, inertial pitch angle, inertial roll angle.
And S102, carrying out coordinate fusion on the detection data to obtain the road surface information of the road section to be passed.
In the embodiment of the application, the road surface information comprises the position coordinates of each point on the road section to be passed.
In some embodiments, an NED moving coordinate system may be established with the laser emission center hole of the lidar as the origin. The laser emission center hole O(0, 0, 0) is set as the origin of the vehicle moving coordinate system; the vehicle advancing direction is the positive X direction, the direction perpendicular to X in the horizontal plane and pointing to the right side of the vehicle is the positive Y direction, and the direction pointing vertically toward the earth's center is the positive Z direction.
For example, the laser emission angle is αi (i = 1, 2, 3, …), the laser detection distance is Li (i = 1, 2, 3, …), and the inertial sensor measures the inertial heading angle φj, the inertial pitch angle βj and the inertial roll angle γj (j = 1, 2, 3, …).
As a possible implementation manner, the embodiment of the present invention may determine the road surface information of the road section to be travelled through steps S1021-S1024.
S1021, determining the radar course angle, the radar pitch angle and the radar roll angle of the laser radar based on the inertial data and the corresponding relation between the inertial data and the laser radar data.
The correspondence includes, for example, a first correspondence f1 between radar heading angles and inertial heading angles, a second correspondence f2 between radar pitch angles and inertial pitch angles, and a third correspondence f3 between radar roll angles and inertial roll angles.
For example, real-vehicle calibration may be performed according to the measurement angles of the inertial sensor, and the radar heading angle φk, the radar pitch angle βk and the radar roll angle γk of the lidar relative to the horizontal plane may be calculated (k = 1, 2, 3, …): φk = f1(φj), βk = f2(βj), γk = f3(γj).
It should be noted that, in the embodiment of the present invention, multiple angle scanning measurement may be performed on each point located at different distances from the vehicle, so as to obtain detection data of each point located on the road section to be passed.
For example, a specific object may be measured at distances of 100 meters, 50 meters and 25 meters and at multiple angles, so as to collect the corresponding lidar data and inertial sensor data.
In addition, the synchronization time of the lidar, the inertial sensor and the camera may also be calibrated separately. The camera is calibrated to determine the correspondence between pixels and actual objects and to generate the three-dimensional coordinates of the imaging points of obstacles.
In the embodiment of the invention, an upper plane, a middle plane and a lower plane can be selected based on the vehicle body contour in the NED coordinate system, a plurality of feature points are selected around the vehicle body on each plane, the feature point coordinates Pi(xi, yi, zi) are generated, i = 1, 2, 3, …, and a vehicle body contour coordinate set is generated from all the feature points.
The upper plane is taken at the highest plane of the vehicle, the lower plane at the height of the vehicle chassis, and the middle plane at the middle of the vehicle. Ground conditions and obstacle sizes are measured by a height difference method, using the height H of the roof-mounted lidar emission position as a reference. The lidar can calculate its horizontal angular resolution from the rotational emission frequency and the emission interval time; the horizontal angle of laser emission is denoted ωt, t = 1, 2, 3, ….
S1022, for any point on the road section to be passed, determining the X coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the laser horizontal rotation angle of the point.
By way of example, embodiments of the present invention may determine the X coordinate based on the following formula.
Xi = Li × cos(αi + βk) × cos(ωt);
where Xi is the X coordinate of the i-th point, Li is the laser detection distance of the i-th point, αi is the laser emission angle of the i-th point, βk is the radar pitch angle, and ωt is the laser horizontal rotation angle of the i-th point.
S1023, determining the Y coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the laser horizontal rotation angle of the point.
By way of example, embodiments of the present invention may determine the Y coordinate based on the following formula.
Yi = Li × cos(αi + βk) × sin(ωt);
where Yi is the Y coordinate of the i-th point, Li is the laser detection distance of the i-th point, αi is the laser emission angle of the i-th point, βk is the radar pitch angle, and ωt is the laser horizontal rotation angle of the i-th point.
S1024, determining the Z coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the radar roll angle of the point.
By way of example, embodiments of the present invention may determine the Z coordinate based on the following formula.
Zi = Li × sin(αi + βk) × cos(γk);
where Zi is the Z coordinate of the i-th point, Li is the laser detection distance of the i-th point, αi is the laser emission angle of the i-th point, βk is the radar pitch angle, and γk is the radar roll angle.
In this way, the X coordinate, Y coordinate and Z coordinate of each point on the road section to be passed can be calculated to obtain the obstacle coordinate set T = (Xi, Yi, Zi), i = 1, 2, 3, …, t = 1, 2, 3, ….
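For illustration only (and not as a limitation of the claimed method), the coordinate fusion of steps S1022-S1024 can be sketched in Python under the formulas reconstructed above; the function and variable names are assumptions introduced here for readability.

```python
import math

def fuse_point(L_i, alpha_i, omega_t, beta_k, gamma_k):
    """Convert one lidar return into the vehicle NED moving coordinate system.

    L_i     : laser detection distance of the i-th point
    alpha_i : laser emission angle of the i-th point (rad)
    omega_t : laser horizontal rotation angle (rad)
    beta_k  : radar pitch angle derived from the inertial data (rad)
    gamma_k : radar roll angle derived from the inertial data (rad)
    """
    x = L_i * math.cos(alpha_i + beta_k) * math.cos(omega_t)  # forward (X)
    y = L_i * math.cos(alpha_i + beta_k) * math.sin(omega_t)  # right (Y)
    z = L_i * math.sin(alpha_i + beta_k) * math.cos(gamma_k)  # down from the standard surface (Z)
    return x, y, z
```

Applying such a conversion to every return of one scan yields the obstacle coordinate set T described above.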
In some embodiments, after the point position coordinates are calculated, the embodiment of the invention can also fuse the shooting data collected by the camera with the point position coordinates to obtain the road surface information of the road section to be passed, thereby improving the accuracy of identifying the obstacle. And the user can observe the road ahead on the display device conveniently, so that the convenience of the user is improved.
And S103, judging the roadblock based on the road surface information of the road section to be passed, and determining the obstacle condition of the road section to be passed.
In the embodiment of the application, the obstacle condition comprises the distance between the obstacle and the vehicle, the type of the obstacle, the gradient and the size.
As a possible implementation, step S103 may be implemented specifically as steps S1031-S1035.
S1031, for any point on the road section to be passed, determining a first distance between the point and the vehicle based on the X coordinate of the point.
The first distance is the distance between the point location and the laser radar transmitting center in the vehicle advancing direction, so that the distance between the vehicle and the point location can be obtained.
S1032, determining a second distance between the point location and the standard surface based on the Z coordinate of the point location.
In some embodiments, the standard plane is a horizontal plane at which the lidar emission center is located.
S1033, determining the type of the obstacle at the point position based on the second distance of the point position and the preset laser radar emission height.
In some embodiments, the lidar emission height H is the distance between the standard surface and an ideal road surface, which is the horizontal plane at which the vehicle is at when the pitch and roll angles of the vehicle are zero.
Step S1033 may be embodied as steps A1-A3, for example.
A1, calculating a difference value between the laser radar emission height and the second distance of the point location, and determining the difference value as the road surface difference of the point location.
A2, if the pavement difference of the point position is smaller than the negative value of the first pavement difference, determining that the point position is a pit-type obstacle.
A3, if the pavement difference of the point location is larger than the first pavement difference, determining that the point location is a convex obstacle.
For example, in the judgment strategy, the height S of an obstacle is judged from the value of H − Z. To reduce errors and the amount of calculation, a point with −30 mm ≤ H − Z ≤ 30 mm is not marked as an obstacle area; a point with H − Z < −30 mm is marked as a pit-type obstacle; and a point with H − Z > 30 mm is marked as a protruding obstacle.
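A minimal sketch of this judgment strategy, assuming the 30 mm first road surface difference used in the example above (all lengths in metres); the names are illustrative only.

```python
def classify_point(H, z, first_diff=0.030):
    """Classify a road point from its Z coordinate (second distance to the standard surface)."""
    road_diff = H - z                 # road surface difference of the point
    if road_diff < -first_diff:
        return "pit-type obstacle"    # road point lies below the ideal road surface
    if road_diff > first_diff:
        return "protruding obstacle"  # road point lies above the ideal road surface
    return "not an obstacle area"     # within the +/-30 mm band, ignored to reduce error
```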
S1034, determining the gradient of the point based on the Z coordinate and the X coordinate of the point, the Z coordinate and the X coordinate of the adjacent point of the point and the preset laser radar emission height.
For example, the grade P may be calculated as P = |(H − Z) / (XN+1 − XN)|.
S1035, determining the size of the obstacle at the point position based on the X coordinate and the Y coordinate of the point position and the X coordinate and the Y coordinate of the adjacent point position of the point position.
For example, the size of the obstacle may be calculated as |XN+1 − XN| × |YN+1 − YN|.
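Continuing the same illustrative sketch, the grade of step S1034 and the obstacle size of step S1035 can be computed from a point and its adjacent point; the grouping of the grade formula is an assumption based on the expression above.

```python
def grade_at_point(H, z_n, x_n, x_next):
    """Grade P at a point: |H - Z| divided by |X_{N+1} - X_N| of adjacent points."""
    return abs(H - z_n) / abs(x_next - x_n)

def obstacle_size(x_n, y_n, x_next, y_next):
    """Approximate obstacle footprint |X_{N+1} - X_N| * |Y_{N+1} - Y_N|."""
    return abs(x_next - x_n) * abs(y_next - y_n)
```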
And S104, planning a path based on the obstacle condition of the road section to be passed, and obtaining the vehicle path of the road section to be passed.
As a possible implementation, step S104 may be implemented specifically as steps S1041-S1045.
S1041, determining a plurality of paths on the road section to be passed based on the obstacle condition.
S1042, calculating the comfort level and the traffic efficiency of each path.
By way of example, the embodiment of the invention can calculate the influence coefficient of the obstacle according to the road surface difference and the size of a plurality of obstacles on each path; calculating a gradient influence coefficient based on the gradient of the obstacle; calculating a curvature influence coefficient based on the curvature of the path; and carrying out weighted summation based on the obstacle influence coefficient, the gradient influence coefficient and the curvature influence coefficient to obtain the comfort level of each path.
According to the embodiment of the invention, the obstacle is classified into the size grades based on the road surface difference and the size of the obstacle, different scores are set for the obstacles with different size grades, and the sum of the scores of a plurality of obstacles is calculated to obtain the influence coefficient of the obstacle.
According to the embodiment of the invention, the gradient grades of the obstacles can be classified based on the gradient of the obstacles, different gradient scores are set for different gradient grades, and the sum of the gradient scores of a plurality of obstacles is calculated to obtain a gradient influence coefficient;
The embodiment of the invention can divide the curvature grade of each turning part based on the curvature of the turning part in the path; different curvature grades are provided with different curvature scores, and the sum of the curvature scores of a plurality of turning positions on each path is calculated to obtain a curvature influence coefficient.
Finally, the embodiment of the invention can calculate the comfort level score of each path and evaluate the comfort level of each path based on the obstacle influence coefficient, the gradient influence coefficient, the curvature influence coefficient and the weight value of each path.
For any one path, the vehicle running speed on the path is calculated based on the obstacle condition of the path; a weighted summation of the path length, the vehicle running speed and the number of obstacles, using their respective weights, is then carried out to obtain the traffic efficiency of the path.
According to the embodiment of the invention, the influence coefficient of the obstacle can be calculated according to the road surface differences and the sizes of a plurality of obstacles on the path; the vehicle running speed is calculated based on the obstacle influence coefficient on the path and the correspondence between the obstacle influence coefficient and the vehicle running speed.
And S1043, determining the vehicle path of the road section to be passed in a plurality of paths based on the comfort level and the passing efficiency.
The embodiment of the invention can calculate the weighted sum of the comfort level and the traffic efficiency based on the comfort level and the traffic efficiency and the weight values of the comfort level and the traffic efficiency, and determine the path with the largest weighted sum as the vehicle path of the road section to be passed.
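As a hedged sketch of steps S1041-S1043, the path selection can be written as follows; the field names and default weight values are placeholders, since the embodiment does not fix their numerical values.

```python
def select_path(paths, w_comfort=0.5, w_efficiency=0.5):
    """Pick the path with the largest weighted sum of comfort and traffic efficiency.

    `paths` is assumed to be a list of dicts holding the precomputed
    'comfort' and 'efficiency' scores of each candidate path.
    """
    def weighted_sum(path):
        return w_comfort * path["comfort"] + w_efficiency * path["efficiency"]
    return max(paths, key=weighted_sum)  # vehicle path of the road section to be passed
```

The comfort and efficiency scores themselves would come from the obstacle, gradient and curvature influence coefficients described above.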
The invention provides a road condition identification and vehicle path planning method for an unstructured road. Detection data of a road section to be passed on the unstructured road, such as laser radar data, inertial data and camera data, are coordinate-fused to obtain the position coordinates of each point of the road section to be passed; roadblock judgment is carried out to determine the obstacle condition of the road section to be passed, such as the distance between an obstacle and the vehicle and the type, gradient and size of the obstacle; and path planning is carried out based on the obstacle condition to obtain the vehicle path of the road section to be passed. Vehicle navigation on the unstructured road is thereby realized, and the accident risk of the vehicle on the unstructured road is reduced.
Optionally, the method for identifying road conditions and planning vehicle paths of unstructured roads provided by the embodiment of the invention further includes steps S201-S204 before step S104.
S201, determining the road condition grade of the road section to be passed based on the obstacle condition of the road section to be passed.
As a possible implementation, step S201 may be implemented specifically as steps S2011-S2018.
S2011, determining the lowest road surface of the pit-type obstacle and the highest road surface of the convex-type obstacle on the road section to be passed based on the obstacle condition of the road section to be passed.
And S2012, determining the maximum value of the road surface difference of the road section to be passed based on the lowest road surface and the highest road surface.
The maximum value of the road surface difference is the maximum value of the distance between the lowest road surface and the ideal road surface and the distance between the highest road surface and the ideal road surface; the ideal road surface is the horizontal plane where the vehicle is located when the pitch angle and roll angle of the vehicle are zero.
S2013, determining the maximum gradient of the pit-shaped obstacle and the protruding-shaped obstacle on the road section to be passed based on the gradient of various obstacles in the obstacle condition of the road section to be passed.
S2014, if the maximum value of the road surface differences is smaller than or equal to the first road surface differences and the maximum gradient is smaller than or equal to the first gradient, determining the road section to be passed as the first road condition.
S2015, if the maximum value of the road surface difference is larger than the first road surface difference and smaller than or equal to the second road surface difference, and the maximum gradient is larger than the first gradient and smaller than or equal to the second gradient, determining that the road section to be passed is the second road condition.
S2016, if the maximum value of the road surface differences is larger than the second road surface differences and smaller than or equal to the third road surface differences, and the maximum gradient is larger than the second gradient and smaller than or equal to the third gradient, determining that the road section to be passed is the three-level road condition.
And S2017, if the maximum value of the road surface difference is larger than the third road surface difference and smaller than or equal to the fourth road surface difference, and the maximum gradient is larger than the third gradient and smaller than or equal to the fourth gradient, determining that the road section to be passed is the four-level road condition.
And S2018, if the maximum value of the road surface difference is larger than the fourth road surface difference and the maximum gradient is larger than the fourth gradient, determining that the road section to be passed is a five-level road condition.
The first road surface difference is smaller than the second road surface difference, the second road surface difference is smaller than the third road surface difference, the third road surface difference is smaller than the fourth road surface difference, the first gradient is smaller than the second gradient, the second gradient is smaller than the third gradient, and the third gradient is smaller than the fourth gradient.
The embodiment of the invention can express the grade of the road condition according to the value of the H-Z and the gradient P, wherein the smaller the value is, the better the road condition is. The ranking table is shown in table 1.
TABLE 1
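Because the concrete thresholds are given in Table 1 rather than in the text, the grading of steps S2014-S2018 can only be sketched with the thresholds left as parameters; the cascading comparison below resolves cases where the road surface difference and the grade fall into different bands by taking the worse of the two, which is an assumption.

```python
def road_condition_level(max_diff, max_grade, diff_thresholds, grade_thresholds):
    """Return road condition level 1 (best) to 5 (worst).

    diff_thresholds  : (d1, d2, d3, d4) road surface difference thresholds, d1 < d2 < d3 < d4
    grade_thresholds : (p1, p2, p3, p4) grade thresholds, p1 < p2 < p3 < p4
    """
    d1, d2, d3, d4 = diff_thresholds
    p1, p2, p3, p4 = grade_thresholds
    if max_diff <= d1 and max_grade <= p1:
        return 1
    if max_diff <= d2 and max_grade <= p2:
        return 2
    if max_diff <= d3 and max_grade <= p3:
        return 3
    if max_diff <= d4 and max_grade <= p4:
        return 4
    return 5
```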
S202, if the road condition grade is the primary road condition or the secondary road condition, determining that the road section to be passed can be passed.
S203, if the road condition grade is the three-level road condition or the four-level road condition, determining whether the road section to be passed can be passed based on the performance parameters of the target vehicle and the obstacle condition of the road section to be passed.
As a possible implementation, step S203 may be specifically implemented as steps S2031-S2033.
S2031, acquiring a performance parameter of the target vehicle.
In some embodiments, the performance parameters include chassis height, maximum grade, approach angle, departure angle, and minimum turning radius.
For example, the performance parameters of the vehicle may be acquired through communication between the road condition recognition navigation system and the control equipment of the vehicle; alternatively, the performance parameters of the vehicle may be entered manually by the user.
S2032, determining the traffic capacity of the target vehicle based on the performance parameters of the target vehicle;
S2033, determining whether the target vehicle has the capability of passing through the road section to be passed or not based on the traffic capability of the target vehicle and the road condition grade of the road section to be passed.
By way of example, the embodiment of the invention can determine whether the vehicle has safety risk according to the chassis height of the vehicle and the road condition grade on the road section to be passed.
In another example, the climbing limit of the vehicle may be calculated from the maximum grade, the approach angle and the departure angle of the vehicle, and whether the vehicle has a scratch risk may be determined based on the climbing limit of the vehicle and the road condition grade of the road section to be passed.
If the vehicle has no safety risk and no scratch risk, determining that the target vehicle has the capability of passing through the road section to be passed. Otherwise, determining that the target vehicle does not have the capability of passing through the road section to be passed.
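A minimal sketch of this passability decision, under the assumption that the safety risk is checked against the chassis height and the scratch risk against the climbing limit as described above; the comparisons are illustrative, not part of the claims.

```python
def can_pass(chassis_height, climbing_limit, max_diff, max_grade):
    """Return True when the target vehicle can pass a three- or four-level road section.

    chassis_height : ground clearance of the target vehicle (m)
    climbing_limit : limit grade derived from maximum grade, approach and departure angles
    max_diff       : maximum road surface difference on the road section (m)
    max_grade      : maximum grade of the obstacles on the road section
    """
    no_safety_risk = chassis_height > max_diff    # body does not strike the road surface
    no_scratch_risk = climbing_limit > max_grade  # vehicle can climb over the obstacles
    return no_safety_risk and no_scratch_risk
```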
S204, if the road condition grade is five road conditions, determining that the road section to be passed cannot be passed.
It should be noted that, if the road condition grade of the road section to be passed is the primary road condition or the secondary road condition, the user can drive the automobile through directly. Alternatively, the user can travel along the route planned in step S104, which improves comfort during driving, reduces the passing time, and allows the vehicle to pass through quickly.
If the road condition grade of the road section to be passed is the three-level road condition or the four-level road condition, the road condition recognition navigation system can evaluate the passing capability of the vehicle against the road condition grade of the road section to be passed; when the target vehicle is determined to have the capability of passing the road section, the user can drive through along the path planned in step S104, which improves comfort during driving, reduces the passing time, and allows the vehicle to pass through quickly.
If the road condition grade of the road section to be passed is the five-level road condition, the road condition of the road section to be passed is poor and there is a safety risk to the vehicle; the user is advised to detour, or to clear the obstacles before passing.
Optionally, the method for identifying road conditions and planning vehicle paths of unstructured roads provided by the embodiment of the invention further includes steps S301-S303 before step S103.
S301, extracting Z coordinates of each point on the road section to be passed to form a target data set.
S302, performing cluster analysis based on the target data set to obtain a cluster center of the target data set.
As a possible implementation manner, the embodiment of the present invention may calculate an average value of each data in the target data set, and determine the average value as a cluster center.
As another possible implementation manner, the embodiment of the present invention may calculate the mode in the target data set, and use the mode as the cluster center.
As another possible implementation manner, the embodiment of the present invention may determine the cluster center of the target data set based on the following steps.
Step one, randomly determining any data in a target data set as a clustering center.
And step two, calculating the sum of the distances between each data in the target data set and the clustering center, and determining the sum as the minimum distance sum.
And thirdly, updating the value of the clustering center again.
And step four, recalculating the sum of the distances between each data and the clustering center, and determining the sum as a second value.
Fifthly, if the second value is smaller than or equal to the minimum distance sum, determining the second value as the minimum distance sum; if the second value is greater than the minimum distance sum, the minimum distance sum is maintained.
Step six, if the current iteration number is smaller than the maximum iteration number, repeating the steps three to six; and if the current iteration number is greater than or equal to the maximum iteration number, exiting the iteration process, and executing the step seven.
Step seven, determining the cluster center corresponding to the minimum distance sum as the cluster center of the target data set.
S303, updating the initial value of the laser radar emission height based on the clustering center to obtain the updated laser radar emission height.
The laser radar emission height is the distance between the laser radar emission center of the vehicle and the ideal road surface when the vehicle is on the ideal road surface. When the vehicle is on unstructured roads such as bumpy roads, the vehicle is pitching back and forth and rolling left and right, so that the distance between the laser radar emission center of the vehicle and the road surface is changed, and acquisition of detection data is affected.
Therefore, on unstructured roads, the embodiment of the invention can update the laser radar emission height by calculating the average value and the mode of the Z coordinate and clustering, thereby ensuring the accuracy of the obstacle judging process and further reducing the probability of vehicle accidents.
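An illustrative sketch of the iterative clustering of steps one to seven and of the height update of step S303 follows; using a new random sample as the updated cluster centre in each iteration, and taking the resulting centre directly as the updated emission height, are assumptions, since the embodiment leaves both choices open.

```python
import random

def cluster_center(z_values, max_iter=100):
    """Single-centre clustering over the Z coordinates of the road section (steps one to seven)."""
    center = random.choice(z_values)                       # step one: random initial centre
    min_sum = sum(abs(z - center) for z in z_values)       # step two: minimum distance sum
    best_center = center
    for _ in range(max_iter):                              # step six: iterate up to the maximum
        center = random.choice(z_values)                   # step three: update the centre value
        second_value = sum(abs(z - center) for z in z_values)  # step four: second value
        if second_value <= min_sum:                        # step five: keep the smaller sum
            min_sum, best_center = second_value, center
    return best_center                                     # step seven: cluster centre of the data set

def update_emission_height(z_values):
    # Step S303 (assumed form): take the cluster centre as the updated lidar emission height.
    return cluster_center(z_values)
```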
As shown in fig. 2, the embodiment of the invention provides a laser radar detection schematic diagram when a vehicle is in an ideal plane. The vehicle is stopped at the initial position, calibration is carried out, and a plane formed by four grounding points of the wheels is taken as an ideal plane. The radar scans the rough road surface, and the approximate outline of the front obstacle or pit can be known through calculation.
The vehicle is calibrated on an initial road surface, and the distance between the laser radar emission center and an ideal road surface is H. The ideal plane M represents the vehicle stopping on an initial, absolutely level surface. In the figure, the curve on which B, C and D lie represents the actual road surface condition: positions lower than the ideal plane represent pits, and positions higher than the ideal plane represent protruding obstacles. The horizontal plane of the laser emission center is parallel to the ideal plane. Let ∠O1OA and ∠O3OC denote the included angles between the laser emission lines and the horizontal plane of the laser emission center; O1A = O2E = O3F = H. A level meter is installed on the plane of the vehicle body to determine the angle between the real-time emission angle of the radar and the horizontal plane, and coordinate conversion is performed.
As shown in fig. 2, the principle of obstacle recognition in an ideal state is as follows. According to the embodiment of the invention, the horizontal plane of the laser emission center is determined through calibration, and the height of the laser emission center from the ideal plane M is determined to be H;
the emission angles of the light beams shown in FIG. 2, denoted α1 and α2, are the included angles between the two laser beams and the horizontal line OO3, and the horizontal rotation angle of the lidar is k. OB and OC are the distances L1 and L2 measured at the lidar emission angles α1 and α2. From the data measured by the lidar, O2B = L1 × sin α1 and O3C = L2 × sin α2 are calculated, and the differences O2B − H and O3C − H are obtained; a difference larger than the road surface difference threshold is marked as a pit-type obstacle, and a difference smaller than the negative of the road surface difference threshold is marked as a protruding obstacle. The value of |O2B − H| (or |O3C − H|) is the height value of the pit or obstacle.
As shown in fig. 3, the embodiment of the invention provides a laser radar detection schematic diagram of the vehicle on a bumpy road. Fig. 4 is a schematic diagram of laser radar detection in three-dimensional rectangular coordinates according to an embodiment of the present invention; fig. 5 is a schematic diagram of laser radar detection in the YOZ plane according to an embodiment of the present invention. The point O is the laser emission center; for any point F in space, let ∠FOE be α and ∠BOE be ω, where α is the laser emission angle and ω is the laser horizontal rotation angle.
As shown in fig. 3, on an actual road surface in an unstructured road environment, the vehicle cannot travel on the ideal plane M, and jolting in the X and Y directions occurs during travel.
As shown in FIG. 3, when the vehicle jolts in the X direction, the pitch angle is θ and the height of the laser radar emission center above the measured point becomes H1 = L × sin(α + θ), where L is the laser measurement distance. As shown in FIG. 5, when the vehicle jolts in the Y direction, the roll angle is φ and the height of the laser radar emission center becomes H2 = H1 × cos φ.
Thus, the real-time laser radar measurement formula H = L × sin(α + θ) × cos φ is obtained.
As shown in fig. 3 to 5, on a bumpy road surface the obstacle point hit by a beam with emission angle α and horizontal rotation angle ω therefore has the coordinates (L × cos(α + θ) × cos ω, L × cos(α + θ) × sin ω, L × sin(α + θ) × cos φ).
Obstacle coordinate set: T = (Xi, Yi, Zi), where Xi = Li × cos(αi + θ) × cos ωi, Yi = Li × cos(αi + θ) × sin ωi, Zi = Li × sin(αi + θ) × cos φ, i = 1, 2, 3, …, t = 1, 2, 3, ….
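As a sketch only, the per-point coordinate fusion on a bumpy road could look as follows; the trigonometric form mirrors the reconstruction above and is an assumption rather than the literal formula of the embodiment.

```python
import math

def fuse_point(L, alpha, omega, pitch, roll):
    # L: laser detection distance, alpha: laser emission angle,
    # omega: laser horizontal rotation angle, pitch/roll: vehicle attitude
    # taken from the inertial data (all angles in radians).
    x = L * math.cos(alpha + pitch) * math.cos(omega)
    y = L * math.cos(alpha + pitch) * math.sin(omega)
    z = L * math.sin(alpha + pitch) * math.cos(roll)
    return x, y, z
```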
The scheme expands the application range of the vehicle, has important significance for automatic driving of the vehicle on outdoor unstructured roads such as mines, enables the vehicle to automatically select favorable roads for driving according to the power performance of the vehicle, and improves the driving safety of the vehicle.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
The following are device embodiments of the invention; for details not described therein, reference may be made to the corresponding method embodiments described above.
Fig. 6 is a schematic structural diagram of a road condition recognition navigation system for unstructured roads according to an embodiment of the present invention. The road condition recognition navigation system 400 includes a data acquisition module 401 and a data processing module 402.
The data acquisition module 401 is configured to acquire detection data of a road section to be passed on an unstructured road, where the detection data includes laser radar data, inertial data and camera shooting data;
The data processing module 402 is configured to coordinate-fuse the detection data to obtain road surface information of the road section to be passed, where the road surface information includes position coordinates of points on the road section to be passed; judging a roadblock based on road surface information of a road section to be passed, and determining the obstacle condition of the road section to be passed; obstacle conditions include distance of the obstacle from the vehicle, type of obstacle, grade, and size; and planning a path based on the obstacle condition of the road section to be passed to obtain the vehicle path of the road section to be passed.
In one possible implementation, the data processing module 402 is specifically configured to determine a radar heading angle, a radar pitch angle, and a radar roll angle of the lidar based on the inertial data and a correspondence between the inertial data and the lidar data; for any point on a road section to be passed, determining an X coordinate of the point based on a laser detection distance, a laser emission angle, a radar pitch angle and a laser horizontal rotation angle of the point; determining the Y coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the laser horizontal rotation angle of the point; and determining the Z coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the radar roll angle of the point.
In one possible implementation manner, the data processing module 402 is specifically configured to determine, for any point on the road segment to be travelled, a first distance between the point and the vehicle based on an X coordinate of the point; determining a second distance between the point location and the standard surface based on the Z coordinate of the point location; the standard surface is a horizontal surface where the transmitting center of the laser radar is positioned; determining the type of the obstacle at the point position based on the second distance of the point position and the preset laser radar emission height; the laser radar emission height is the distance between a standard surface and an ideal road surface, and the ideal road surface is the horizontal plane where the vehicle is positioned when the pitch angle and the roll angle of the vehicle are zero; determining the gradient of the point based on the Z coordinate and the X coordinate of the point, the Z coordinate and the X coordinate of the adjacent point of the point and the preset laser radar emission height; and determining the size of the obstacle at the point position based on the X coordinate and the Y coordinate of the point position and the X coordinate and the Y coordinate of the adjacent point position of the point position.
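A minimal sketch of how the gradient between adjacent points and the footprint of an obstacle could be derived from the fused coordinates is given below; the specific formulations (rise over run along the direction of travel, extent in X and Y) are assumptions consistent with the module description, not the exact equations of the embodiment.

```python
import math

def point_gradient(p, q):
    # Gradient between two adjacent fused points p and q, each (x, y, z):
    # rise over run along the X direction, returned in degrees.
    dx, dz = q[0] - p[0], q[2] - p[2]
    if dx == 0:
        return 90.0
    return math.degrees(math.atan(dz / abs(dx)))

def obstacle_size(points):
    # Approximate the obstacle footprint by its extent in X and Y.
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return max(xs) - min(xs), max(ys) - min(ys)
```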
In one possible implementation, the data processing module 402 is specifically configured to calculate a difference between the laser radar emission height and the second distance of the point location, and determine the difference as a road surface difference of the point location; if the pavement difference of the point location is smaller than the negative value of the first pavement difference, determining that the point location is a pit-type obstacle; if the road surface difference of the point location is larger than the first road surface difference, determining the point location as a convex obstacle.
In one possible implementation, the data processing module 402 is specifically configured to determine, based on the obstacle situation, a plurality of paths on the road section to be passed; calculate the comfort level and the passing efficiency of each path; and determine, based on the comfort level and the passing efficiency, the vehicle path of the road section to be passed among the plurality of paths.
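A minimal sketch of this selection step is shown below, assuming each candidate path already carries comfort and passing-efficiency scores normalised to [0, 1] and that a simple weighted sum is used; both the path structure and the weights are assumptions, since the embodiment leaves the scoring open.

```python
def select_path(paths, w_comfort=0.5, w_efficiency=0.5):
    # paths: list of dicts with 'comfort' and 'efficiency' in [0, 1]
    # (structure and weights are assumptions for this sketch).
    return max(paths, key=lambda p: w_comfort * p["comfort"]
                                    + w_efficiency * p["efficiency"])
```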
In a possible implementation manner, the data processing module 402 is further configured to determine a road condition grade of the road section to be passed based on the obstacle condition of the road section to be passed; if the road condition grade is a primary or secondary road condition, determining that the road section to be passed can be passed; if the road condition grade is a three-level or four-level road condition, determining whether the road section to be passed can be passed based on the performance parameters of the target vehicle and the obstacle condition of the road section to be passed; and if the road condition grade is a five-level road condition, determining that the road section to be passed cannot be passed.
In one possible implementation manner, the data processing module 402 is specifically configured to determine, based on the obstacle condition of the road section to be passed, a lowest road surface of the pit-type obstacle and a highest road surface of the protruding-type obstacle on the road section to be passed; determining the maximum value of the road surface difference of the road section to be passed based on the lowest road surface and the highest road surface; the maximum value of the road surface difference is the maximum value of the distance between the lowest road surface and the ideal road surface and the distance between the highest road surface and the ideal road surface; the ideal road surface is the horizontal plane where the vehicle is located when the pitch angle and the roll angle of the vehicle are zero; determining the maximum gradient of pit-type barriers and protruding-type barriers on a road section to be passed based on the gradient of various barriers in the barrier condition of the road section to be passed; if the maximum value of the road surface difference is smaller than or equal to the first road surface difference and the maximum gradient is smaller than or equal to the first gradient, determining the road section to be passed as a first-level road condition; if the maximum value of the road surface difference is larger than the first road surface difference and smaller than or equal to the second road surface difference, and the maximum gradient is larger than the first gradient and smaller than or equal to the second gradient, determining that the road section to be passed is a secondary road condition; if the maximum value of the road surface difference is larger than the second road surface difference and smaller than or equal to the third road surface difference, and the maximum gradient is larger than the second gradient and smaller than or equal to the third gradient, determining that the road section to be passed is a three-level road condition; if the maximum value of the road surface difference is larger than the third road surface difference and smaller than or equal to the fourth road surface difference, and the maximum gradient is larger than the third gradient and smaller than or equal to the fourth gradient, determining that the road section to be passed is a four-level road condition; if the maximum value of the road surface difference is larger than the fourth road surface difference and the maximum gradient is larger than the fourth gradient, determining that the road section to be passed is a five-level road condition; the first road surface difference is smaller than the second road surface difference, the second road surface difference is smaller than the third road surface difference, the third road surface difference is smaller than the fourth road surface difference, the first gradient is smaller than the second gradient, the second gradient is smaller than the third gradient, and the third gradient is smaller than the fourth gradient.
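The five-level grading described above can be written compactly as in the sketch below; the cascade falls back to the worse of the two criteria where the text leaves a combination undefined, and the threshold values themselves are left to the caller.

```python
def road_condition_grade(max_diff, max_grad, diff_thresholds, grad_thresholds):
    # diff_thresholds = (d1, d2, d3, d4) and grad_thresholds = (g1, g2, g3, g4),
    # each strictly increasing; the numeric values are left to the caller.
    d1, d2, d3, d4 = diff_thresholds
    g1, g2, g3, g4 = grad_thresholds
    if max_diff <= d1 and max_grad <= g1:
        return 1  # primary road condition
    if max_diff <= d2 and max_grad <= g2:
        return 2  # secondary road condition
    if max_diff <= d3 and max_grad <= g3:
        return 3
    if max_diff <= d4 and max_grad <= g4:
        return 4
    return 5  # five-level road condition: not passable
```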
In one possible implementation, the data processing module 402 is specifically configured to obtain a performance parameter of the target vehicle; the performance parameters include chassis height, maximum grade, approach angle, departure angle, and minimum turning radius; the data processing module 402 is specifically configured to determine a traffic capacity of the target vehicle based on the performance parameter of the target vehicle; and determining whether the target vehicle has the capability of passing through the road section to be passed or not based on the passing capability of the target vehicle and the obstacle grade of the road section to be passed.
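A rough sketch of the passability check for three-level and four-level road conditions follows; which performance parameters are compared against which obstacle quantities, and how, is an assumption made for the example only.

```python
def can_pass(vehicle, section):
    # vehicle: dict with 'chassis_height', 'max_climbing_grade',
    #          'min_turning_radius' (performance parameters).
    # section: dict with the worst obstacle height, gradient and tightest
    #          turn radius found on the road section (assumed keys).
    return (vehicle["chassis_height"] >= section["max_obstacle_height"]
            and vehicle["max_climbing_grade"] >= section["max_gradient"]
            and vehicle["min_turning_radius"] <= section["min_turn_radius"])
```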
In one possible implementation manner, the data processing module 402 is specifically configured to extract a Z coordinate of each point on the road section to be travelled, so as to form a target data set; based on the target data set, performing cluster analysis to obtain a cluster center of the target data set; and updating the initial value of the laser radar emission height based on the clustering center to obtain the updated laser radar emission height.
Fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention. As shown in fig. 7, the terminal device 500 includes a processor 501, a memory 502 and a computer program 503 stored in the memory 502 and executable on the processor 501. When the processor 501 executes the computer program 503, the steps of the method embodiments described above, such as steps S101-S104 shown in fig. 1, are implemented. Alternatively, when executing the computer program 503, the processor 501 implements the functions of the modules/units in the above-described device embodiments, for example the functions of the data acquisition module 401 and the data processing module 402 shown in fig. 6.
Illustratively, the computer program 503 may be divided into one or more modules/units that are stored in the memory 502 and executed by the processor 501 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specified functions, which are used to describe the execution process of the computer program 503 in the terminal device 500. For example, the computer program 503 may be divided into the data acquisition module 401 and the data processing module 402 shown in fig. 6.
The processor 501 may be a Central Processing Unit (CPU), or may be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 502 may be an internal storage unit of the terminal device 500, for example, a hard disk or a memory of the terminal device 500. The memory 502 may also be an external storage device of the terminal device 500, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash memory card provided on the terminal device 500. Further, the memory 502 may include both an internal storage unit and an external storage device of the terminal device 500. The memory 502 is used for storing the computer program and other programs and data required by the terminal device 500. The memory 502 may also be used to temporarily store data that has been output or is to be output.
The above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included in the scope of the present invention.
Claims (10)
1. A method for identifying road conditions and planning vehicle paths for unstructured roads, the method comprising:
Acquiring detection data of a road section to be passed on an unstructured road, wherein the detection data comprises laser radar data, inertia data and camera shooting data;
Coordinate fusion is carried out on the detection data to obtain pavement information of the road section to be passed, wherein the pavement information comprises position coordinates of each point on the road section to be passed;
Judging a roadblock based on road surface information of a road section to be passed, and determining the obstacle condition of the road section to be passed; the obstacle conditions include the distance of the obstacle from the vehicle, the type of obstacle, the grade and the size;
And carrying out path planning based on the obstacle condition of the road section to be passed to obtain the vehicle path of the road section to be passed.
2. The method for identifying road conditions and planning vehicle paths for unstructured roads according to claim 1, wherein the method is applied to a road condition recognition navigation system comprising a laser radar, an inertial sensor and a camera; the laser radar, the inertial sensor and the camera are rigidly connected; and the centroid of the inertial sensor is coaxial with the centroid of the laser radar in the vertical direction;
the lidar data includes: laser emission angle, laser horizontal rotation angle and laser detection distance;
The inertial data includes: inertial course angle, inertial pitch angle, inertial roll angle;
the coordinate fusion is carried out on the detection data to obtain the road surface information of the road section to be passed, which comprises the following steps:
Determining a radar course angle, a radar pitch angle and a radar roll angle of the laser radar based on the inertial data and the corresponding relation between the inertial data and the laser radar data;
For any point on the road section to be passed, determining the X coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the laser horizontal rotation angle of the point;
Determining the Y coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the laser horizontal rotation angle of the point;
and determining the Z coordinate of the point based on the laser detection distance, the laser emission angle, the radar pitch angle and the radar roll angle of the point.
3. The method for identifying road conditions and planning vehicle paths according to claim 1, wherein the performing the roadblock judgment based on the road surface information of the road section to be passed, determining the obstacle condition of the road section to be passed comprises:
for any point on the road section to be passed, determining a first distance between the point and the vehicle based on the X coordinate of the point;
determining a second distance between the point location and the standard surface based on the Z coordinate of the point location; the standard surface is a horizontal surface where the transmitting center of the laser radar is located;
Determining the type of the obstacle at the point position based on the second distance of the point position and the preset laser radar emission height; the laser radar emission height is the distance between the standard surface and an ideal road surface, and the ideal road surface is the horizontal plane where the vehicle is located when the pitch angle and the roll angle of the vehicle are zero;
Determining the gradient of the point based on the Z coordinate and the X coordinate of the point, the Z coordinate and the X coordinate of the adjacent point of the point and the preset laser radar emission height;
and determining the size of the obstacle at the point position based on the X coordinate and the Y coordinate of the point position and the X coordinate and the Y coordinate of the adjacent point position of the point position.
4. The method for identifying the road condition and planning the vehicle path of the unstructured road according to claim 3, wherein the determining the type of the obstacle at the point location based on the second distance of the point location and the preset laser radar emission height comprises:
calculating the difference between the laser radar emission height and the second distance of the point location, and determining the difference as the road surface difference of the point location;
If the pavement difference of the point location is smaller than the negative value of the first pavement difference, determining that the point location is a pit-type obstacle;
If the road surface difference of the point location is larger than the first road surface difference, determining the point location as a convex obstacle.
5. The method for identifying road conditions and planning vehicle paths on unstructured roads according to claim 1, wherein the step of planning paths based on the obstacle condition of the road section to be travelled to obtain the vehicle paths of the road section to be travelled comprises the steps of:
Determining a plurality of paths on the road section to be passed based on the obstacle condition;
Calculating the comfort level and the passing efficiency of each path;
And determining the vehicle path of the road section to be passed in the plurality of paths based on the comfort level and the passing efficiency.
6. The method for identifying road conditions and planning vehicle paths according to claim 1, wherein the step of planning paths based on the obstacle condition of the to-be-passed road section, before obtaining the vehicle paths of the to-be-passed road section, further comprises:
determining the road condition grade of the road section to be passed based on the obstacle condition of the road section to be passed;
If the road condition grade is a primary road condition or a secondary road condition, determining that the road section to be passed can be passed;
If the road condition grade is a three-level road condition or a four-level road condition, determining whether the road section to be passed can be passed based on the performance parameters of the target vehicle and the obstacle condition of the road section to be passed;
and if the road condition grade is a five-level road condition, determining that the road section to be passed cannot be passed.
7. The method for recognizing road conditions and planning vehicle paths on unstructured roads according to claim 6, wherein the determining the road condition level of the road section to be passed based on the obstacle condition of the road section to be passed comprises:
determining the lowest road surface of the pit-type obstacle and the highest road surface of the convex-type obstacle on the road section to be passed based on the obstacle condition of the road section to be passed;
Determining the maximum value of the road surface difference of the road section to be passed based on the lowest road surface and the highest road surface; the maximum value of the road surface difference is the maximum value of the distance between the lowest road surface and the ideal road surface and the distance between the highest road surface and the ideal road surface; the ideal road surface is a horizontal plane where the vehicle is located when the pitch angle and the roll angle of the vehicle are zero;
determining the maximum gradient of pit-type barriers and protruding-type barriers on a road section to be passed based on the gradient of various barriers in the barrier condition of the road section to be passed;
If the maximum value of the road surface difference is smaller than or equal to the first road surface difference and the maximum gradient is smaller than or equal to the first gradient, determining the road section to be passed as a first-level road condition;
If the maximum value of the road surface difference is larger than the first road surface difference and smaller than or equal to the second road surface difference, and the maximum gradient is larger than the first gradient and smaller than or equal to the second gradient, determining that the road section to be passed is a secondary road condition;
if the maximum value of the road surface difference is larger than the second road surface difference and smaller than or equal to the third road surface difference, and the maximum gradient is larger than the second gradient and smaller than or equal to the third gradient, determining that the road section to be passed is a three-level road condition;
If the maximum value of the road surface difference is larger than the third road surface difference and smaller than or equal to the fourth road surface difference, and the maximum gradient is larger than the third gradient and smaller than or equal to the fourth gradient, determining that the road section to be passed is a four-level road condition;
If the maximum value of the road surface difference is larger than the fourth road surface difference and the maximum gradient is larger than the fourth gradient, determining that the road section to be passed is a five-level road condition;
the first road surface difference is smaller than the second road surface difference, the second road surface difference is smaller than the third road surface difference, the third road surface difference is smaller than the fourth road surface difference, the first gradient is smaller than the second gradient, the second gradient is smaller than the third gradient, and the third gradient is smaller than the fourth gradient.
8. The method for recognizing road conditions and planning vehicle paths on unstructured roads according to claim 6, wherein the determining whether the road section to be passed is passable based on the passing ability of the target vehicle and the obstacle condition of the road section to be passed comprises:
acquiring performance parameters of a target vehicle; the performance parameters comprise chassis height, maximum climbing grade, approach angle, departure angle and minimum turning radius;
determining the traffic capacity of the target vehicle based on the performance parameters of the target vehicle;
and determining whether the target vehicle has the capability of passing through the road section to be passed or not based on the passing capability of the target vehicle and the obstacle grade of the road section to be passed.
9. The method for identifying road conditions and planning vehicle paths according to claim 1, wherein the steps of performing the roadblock judgment based on the road surface information of the road section to be passed, and before determining the obstacle condition of the road section to be passed, further comprise:
Extracting Z coordinates of each point on the road section to be passed to form a target data set;
Based on the target data set, performing cluster analysis to obtain a cluster center of the target data set;
and updating the initial value of the laser radar emission height based on the clustering center to obtain the updated laser radar emission height.
10. A road condition recognition navigation system for unstructured roads, the road condition recognition navigation system comprising:
the data acquisition module is used for acquiring detection data of a road section to be passed on an unstructured road, wherein the detection data comprises laser radar data, inertia data and camera shooting data;
The data processing module is used for carrying out coordinate fusion on the detection data to obtain the road surface information of the road section to be passed, wherein the road surface information comprises the position coordinates of each point on the road section to be passed; judging a roadblock based on road surface information of a road section to be passed, and determining the obstacle condition of the road section to be passed; the obstacle conditions include the distance of the obstacle from the vehicle, the type of obstacle, the grade and the size; and carrying out path planning based on the obstacle condition of the road section to be passed to obtain the vehicle path of the road section to be passed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410895619.7A CN118424320A (en) | 2024-07-05 | 2024-07-05 | Road condition identification and vehicle path planning method for unstructured road |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118424320A true CN118424320A (en) | 2024-08-02 |
Family
ID=92335674
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410895619.7A Pending CN118424320A (en) | 2024-07-05 | 2024-07-05 | Road condition identification and vehicle path planning method for unstructured road |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118424320A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |