CN111433780A - Lane line detection method, lane line detection apparatus, and computer-readable storage medium

Info

Publication number
CN111433780A
CN111433780A
Authority
CN
China
Prior art keywords
lane line
candidate lane
candidate
determining
lines
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880068401.7A
Other languages
Chinese (zh)
Inventor
崔健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN111433780A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)

Abstract

A lane line detection method, a lane line detection apparatus (500), and a computer-readable storage medium. The method comprises: determining a plurality of candidate lane lines to be clustered in an image (S100); determining a position parameter of an end point of each candidate lane line in the image (S200); clustering the candidate lane lines according to the relationships between the end point position parameters of the candidate lane lines (S300); and determining the clustered candidate lane lines as the lane lines detected from the image (S400).

Description

Lane line detection method, lane line detection apparatus, and computer-readable storage medium

Technical Field
The present invention relates to the field of electronic technologies, and in particular, to a lane line detection method, a lane line detection device, and a computer-readable storage medium.
Background
In some scenarios, such as automatic driving systems and ADAS (advanced driver assistance systems), lane line detection is of great significance, and the accuracy of the detection result directly affects the performance and reliability of the system.
In related lane line detection methods, a lane line in an image is detected by feature extraction or by straight-line or curve detection. In practice, however, there may be dashed lane lines (including dashed lane lines caused by wear, variable lane lines, etc.), and one dashed lane line may comprise two or more lane line segments. With the above detection modes, one dashed lane line is detected as several different lane lines, so the detection result has low accuracy.
Disclosure of Invention
The invention provides a lane line detection method, a lane line detection apparatus, and a computer-readable storage medium, which can prevent a dashed lane line from being detected as a plurality of different lane lines and thereby help improve detection accuracy.
In a first aspect of the embodiments of the present invention, a lane line detection method is provided, where the method includes:
determining a plurality of candidate lane lines to be clustered in the image;
determining the position parameters of the end points of the candidate lane lines in the image;
clustering the candidate lane lines according to the relation between the end point position parameters of the candidate lane lines;
and determining the clustered candidate lane lines as the lane lines detected from the images.
In a second aspect of the embodiments of the present invention, there is provided an electronic device, including: a memory and a processor;
the memory for storing program code;
the processor, configured to invoke the program code, when the program code is executed, is configured to perform the following:
determining a plurality of candidate lane lines to be clustered in the image;
determining the position parameters of the end points of the candidate lane lines in the image;
clustering the candidate lane lines according to the relation between the end point position parameters of the candidate lane lines;
and determining the clustered candidate lane lines as the lane lines detected from the images.
In a third aspect of embodiments of the present invention, there is provided a computer-readable storage medium,
the computer readable storage medium stores thereon computer instructions that, when executed, implement the lane line detection method described in the foregoing embodiments.
Based on the above technical solution, in the embodiments of the invention, the candidate lane lines are clustered according to the relationships between the end point position parameters of the candidate lane lines, so candidate lane lines belonging to the same dashed lane line can be aggregated into one category. This prevents one dashed lane line from being detected as several different lane lines and improves detection accuracy. In addition, because the position parameters used for clustering are those of the end points of the candidate lane lines, two candidate lane lines are not clustered together merely because their middle portions are close to each other, which prevents erroneous clustering.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings used in the embodiments are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them.
Fig. 1 is a schematic flow chart of a lane line detection method according to an embodiment of the present invention;
FIG. 2 is a schematic illustration of a lane line candidate determined from an image according to an embodiment of the invention;
FIG. 3 is a schematic diagram illustrating a process of clustering lane line candidates according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a current lane line candidate and a first lane line candidate found according to an embodiment of the invention;
fig. 5 is a schematic diagram illustrating a process of calculating a specified endpoint location parameter of a current candidate lane line and a target endpoint location parameter of a found first candidate lane line according to an embodiment of the present invention;
fig. 6 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention. In addition, the features in the embodiments and the examples described below may be combined with each other without conflict.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein and in the claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be understood that the term "and/or" as used herein is meant to encompass any and all possible combinations of one or more of the associated listed items.
Although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information may also be referred to as first information, without departing from the scope of the present invention. Moreover, depending on the context, the word "if" as used herein may be interpreted as "upon", "when", or "in response to determining".
The following describes the lane line detection method according to the embodiment of the present invention, but the present invention is not limited thereto. In one embodiment, referring to fig. 1, a lane line detection method may include the steps of:
S100: determining a plurality of candidate lane lines to be clustered in the image;
S200: determining the position parameters of the end points of the candidate lane lines in the image;
S300: clustering the candidate lane lines according to the relation between the endpoint position parameters of the candidate lane lines;
S400: and determining the clustered candidate lane lines as the lane lines detected from the images.
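In outline, the four steps form a simple pipeline. Below is a minimal Python sketch, where detect_candidates, endpoint_params, and cluster_by_endpoints are hypothetical helpers standing in for the operations detailed in the embodiments that follow:

```python
# Minimal sketch of steps S100-S400. detect_candidates, endpoint_params,
# and cluster_by_endpoints are hypothetical helpers standing in for the
# operations detailed in the embodiments below.
def detect_lane_lines(image):
    candidates = detect_candidates(image)                 # S100
    params = [endpoint_params(c) for c in candidates]     # S200
    clustered = cluster_by_endpoints(candidates, params)  # S300
    return clustered                                      # S400: detected lane lines
```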
The lane line detection method according to the embodiments of the present invention may be executed by an electronic device, and more specifically by a processor of the electronic device. The electronic device may be an imaging device that performs the corresponding processing on the images it captures; alternatively, the electronic device may be a mobile device equipped with an imaging device, where the mobile device acquires images captured by the imaging device and performs the corresponding processing. The mobile device is, for example, a ground robot, an unmanned aerial vehicle, a vehicle, or the like. Of course, the electronic device is not limited to a particular type, as long as it has image processing capability.
Specifically, the lane line detection method of the embodiment of the invention can be applied to vehicles equipped with an automatic driving system and an ADAS, and the lane line in the acquired image is detected in the driving process of the vehicle, so that the driving control, planning and the like are realized.
In step S100, a plurality of candidate lane lines to be clustered in the image are determined.
The image may be a road image captured by the electronic device or acquired from an imaging device. The candidate lane lines can be initially detected from the image by a relevant lane line detection method, such as by feature extraction, straight line or curve detection methods.
However, since related lane line detection manners may detect one dashed lane line as several different lane lines, the candidate lane lines determined in step S100 may contain several such fragments. Referring to fig. 2, the candidate lane lines determined from the image include L1-L5, but L1-L3 actually belong to the same lane line. Therefore, the candidate lane lines determined in step S100 need to be clustered so that L1-L3 are grouped into the same category.
In step S200, the position parameters of the end points of the respective candidate lane lines in the image are determined.
Since the candidate lane lines are determined from the image, the positions of the candidate lane lines in the image can be determined, and accordingly, the position parameters of the end points of the candidate lane lines in the image can be determined.
The position parameter is a parameter that can characterize the position of the endpoint of the candidate lane line in the image, and the position parameter may include a vector parameter, and/or a scalar parameter, and is not limited in particular. For example, the position parameters may include tangent vectors and normal vectors of the end points on the candidate lane lines; alternatively, the position parameter may include coordinates of the end point in a coordinate system to which the image is applied, and the distance from the imaging device that acquired the image to the ground may be used as a scale when calculating the coordinates of the end point in the coordinate system to which the image is applied.
It is understood that, in the embodiment of the present invention, the lane line candidate may refer to a linear region having a certain width in the image. When the end point position parameter is calculated, a skeleton line can be extracted from the candidate lane lines, and the end point of the skeleton line is determined as the end point of the candidate lane line; alternatively, the end point of the candidate lane line may be determined from the end of the candidate lane line in a preset manner (for example, the middle point of the end is determined as the end point), and is not limited specifically.
In step S300, the candidate lane lines are clustered according to the relationship between the end point position parameters of the candidate lane lines.
The relationship between the end point position parameters of each candidate lane line can be obtained through vector operation and/or scalar operation, and the position relationship between the end points of the candidate lane lines can be represented. Taking two candidate lane lines as an example, whether the two candidate lane lines need to be aggregated into one category can be determined according to the relationship between the end point position parameters of the two candidate lane lines. After pairwise relations of all the candidate lane lines are judged, clustering can be completed on all the candidate lane lines.
Candidate lane lines whose end point position parameters satisfy a certain relationship may be grouped into one category. After the candidate lane lines are clustered according to the relationships between the end point position parameters, the broken segments of a dashed lane line can be grouped together.
For example, in fig. 2, L1 and L2 are grouped into one category, L2 and L3 are grouped into one category, L4 belongs to one category, L5 belongs to one category, and L1-L3 belongs to one category after clustering is completed, that is, L1-L3 are candidate lane lines after clustering.
In step S400, the clustered lane line candidates are determined as the lane lines detected from the image.
Continuing with FIG. 2, L4 is a lane line detected from the image, L5 is a lane line detected from the image, and L1-L3 are lane lines detected from the image.
In a related clustering mode, first, each object (a pixel point or region in the image) is taken as a category and the minimum distances between categories are calculated; second, the two categories whose minimum distance is smaller than a threshold are merged into one category; third, the minimum distances between the merged categories are recalculated, and the second step is repeated until no categories can be merged. In this clustering mode, when the shortest distance between a first lane line segment and a second lane line segment of a dashed lane line is greater than the shortest distance between the first lane line segment and a lane line at its side, the first lane line segment may be grouped with the side lane line instead of with the second lane line segment of the same dashed lane line, resulting in erroneous clustering.
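For reference, this related mode is essentially single-linkage agglomerative clustering. A minimal Python sketch (not the method of the invention; `dist` is an assumed callable returning the minimum distance between two clusters) illustrates how a segment is pulled into whichever cluster happens to be closest, including a neighboring lane's:

```python
def agglomerate(objects, dist, threshold):
    # Related single-linkage clustering: repeatedly merge the closest
    # pair of clusters until no pair is closer than the threshold.
    clusters = [[o] for o in objects]
    while len(clusters) > 1:
        i, j = min(((a, b) for a in range(len(clusters))
                    for b in range(a + 1, len(clusters))),
                   key=lambda ab: dist(clusters[ab[0]], clusters[ab[1]]))
        if dist(clusters[i], clusters[j]) >= threshold:
            break                       # no pair is close enough to merge
        clusters[i] += clusters.pop(j)  # merge the two closest clusters
    return clusters
```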
In the embodiments of the invention, the candidate lane lines are clustered according to the relationships between the end point position parameters of the candidate lane lines, so candidate lane lines belonging to the same dashed lane line can be aggregated into one category. This prevents one dashed lane line from being detected as several different lane lines and improves detection accuracy. Moreover, because the position parameters used for clustering are those of the end points, two candidate lane lines are not clustered together merely because their middle portions are close to each other, which prevents erroneous clustering. The amount of calculation is also small, so power consumption and cost are low.
In one embodiment, in step S200, the determining the position parameter of the end point of each candidate lane line in the image includes:
performing curve fitting processing on each candidate lane line in the image by using a preset first curve model, and calculating position parameters of end points of the fitted candidate lane lines;
and the position parameters of the end points comprise tangent vectors and normal vectors of the end points on the corresponding fitted candidate lane lines.
Curve fitting processing is performed on each candidate lane line to be clustered, which is determined in step S100. The curve fitting process may be, for example, a least square curve fitting method, but may also be other methods, such as a method of approximating discrete data by an analytical expression.
The first curve model used in the curve fitting process is, for example, a polynomial curve model, and accordingly, the fitted curve is a polynomial curve, but the first curve model is not limited to this, and may be other curve models, such as a logarithmic function model, a piecewise function model, and the like.
After the curve fitting process, the tangent vector and the normal vector at the end point of each candidate lane line can be calculated from the first curve model and the fitted parameters; the standard mathematical definitions of the tangent and normal vectors of a curve apply and are not repeated here.
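As an illustration, if the first curve model is a polynomial x = f(y), the tangent at an end point follows from the fitted derivative and the normal is its 90-degree rotation. A minimal numpy sketch (the model form and degree are assumptions for illustration, not prescribed by the patent):

```python
import numpy as np

def endpoint_tangent_normal(xs, ys, degree=2):
    # Least-squares fit of the assumed first curve model x = f(y).
    coeffs = np.polyfit(ys, xs, degree)
    y_end = ys[-1]                                   # an end point of the candidate
    slope = np.polyval(np.polyder(coeffs), y_end)    # dx/dy at the end point
    tangent = np.array([slope, 1.0])
    tangent /= np.linalg.norm(tangent)               # unit tangent vector
    normal = np.array([-tangent[1], tangent[0]])     # unit normal (90-degree rotation)
    return tangent, normal
```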
In the embodiments of the invention, curve fitting is performed on each candidate lane line, and the tangent vector and the normal vector at the end points of the fitted candidate lane line are used as its end point position parameters, so the direction of the candidate lane line is taken into account in the relationships between the end point position parameters. The embodiments of the invention are therefore suitable for detecting lane lines at curves and can avoid mis-grouping across lanes at a curve; they are also applicable to the detection of straight lane lines.
In one embodiment, referring to fig. 3, in step S300, the clustering the candidate lane lines according to the relationship between the end point position parameters of the candidate lane lines includes the following steps:
S301: traversing each candidate lane line according to a specified sequence, and judging whether a target candidate lane line needing to be clustered and merged with the current candidate lane line exists in the candidate lane lines which are not traversed according to the relation between the end point position parameters of the current candidate lane line which is traversed and the candidate lane lines which are not traversed;
S302: if yes, determining the current candidate lane line and the target candidate lane line as belonging to the same category, performing curve fitting processing on the current candidate lane line and the target candidate lane line in the image by using a preset second curve model to obtain a fitted candidate lane line, calculating end point position parameters of the fitted candidate lane line, and returning to the step of traversing each candidate lane line according to the specified sequence.
In step S301, each candidate lane line is traversed in the designated order, and the subsequent steps are executed each time one candidate lane line is traversed. The designated order is not particularly limited, and may be determined according to the position of the candidate lane line in the image, or may be determined according to the length of the candidate lane line, for example.
Preferably, the designated sequence is a sequence of lengths of the candidate lane lines from long to short. That is, during traversal, the candidate lane lines with longer lengths are traversed first, and all the candidate lane lines may be sorted according to length before traversal.
Referring to fig. 2, for example, L4 is traversed first; since no target candidate lane line to be merged with L4 exists among the non-traversed candidate lane lines, no processing is performed. L5 is traversed next, and likewise no target candidate lane line to be merged with L5 exists. Then L1 is traversed; L2 is a target candidate lane line to be merged with L1, so step S302 is executed.
Continuing with fig. 2, when, in a full traversal, every traversed current candidate lane line has no target candidate lane line to be clustered and merged with it among the non-traversed candidate lane lines (for example, after L1-L3 have been merged), the clustering is complete.
When traversing the candidate lane lines in the specified order, it may first be judged whether the number of candidate lane lines is greater than 1. If so, it is judged, according to the relationships between the end point position parameters of the traversed current candidate lane line and of the non-traversed candidate lane lines, whether a target candidate lane line to be clustered and merged with the current candidate lane line exists among the non-traversed candidate lane lines; if not, the clustering is finished.
In step S302, when a target candidate lane line to be clustered and merged with a current candidate lane line exists in the candidate lane lines that are not traversed, the current candidate lane line and the target candidate lane line are determined to belong to the same category, curve fitting processing is performed on the current candidate lane line and the target candidate lane line in the image by using a preset second curve model to obtain a fitted candidate lane line, and an endpoint position parameter of the fitted candidate lane line is calculated.
Referring to fig. 2, for example, L2 is a target candidate lane line to be merged with L1, so L1 and L2 are determined to belong to the same category (this may be a new category, the category of L1, or the category of L2; it is not limited, as long as it is distinguished from the other categories), and curve fitting is performed on L1 and L2 using the second curve model.
The curve fitting process here can also adopt a least square method curve fitting mode and the like, and is not described herein again. The second curve model may be, for example, a polynomial curve model, a logarithmic function model, a piecewise function model, or the like.
In the embodiment, while the current candidate lane line and the target candidate lane line are determined to belong to the same category, curve fitting processing is performed on the current candidate lane line and the target candidate lane line, that is, a method of clustering and fitting is adopted, so that the candidate lane line and the endpoint position parameters thereof can be adjusted in the aggregation process, and the clustering accuracy is improved.
As an alternative, the curve fitting process may not be performed during the clustering process, and after the clustering process is completed, the curve fitting process may be performed on the candidate lane lines belonging to the same category.
In step S302, the flow returns to the step of traversing the candidate lane lines in the specified order, that is, the traversal is restarted from the beginning. The new traversal still follows the specified order; of course, the clustering and curve fitting change the length ordering and the number of candidate lane lines, since the fitted candidate lane line is longer and the total number of candidate lane lines is reduced.
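A compact sketch of this traverse-merge-refit loop (S301/S302), assuming hypothetical helpers `find_target` (the tests of S3011/S3012 below) and `fit_merge` (the refit with the second curve model); candidates are assumed to be lists of points whose length reflects the lane line's length:

```python
def cluster_lanes(candidates):
    # Repeat full traversals until one pass completes with no merge.
    merged = True
    while merged and len(candidates) > 1:
        merged = False
        candidates.sort(key=len, reverse=True)   # specified order: longest first
        for i, current in enumerate(candidates):
            target = find_target(current, candidates[i + 1:])   # S301
            if target is not None:
                candidates.remove(target)
                candidates[i] = fit_merge(current, target)      # S302
                merged = True
                break        # restart the traversal in the specified order
    return candidates
```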
In one embodiment, in step S301, the determining whether there is a target candidate lane line to be clustered and merged with the current candidate lane line in the candidate lane lines not traversed according to the relationship between the end point position parameters of the current candidate lane line traversed and the candidate lane lines not traversed includes the following steps:
S3011: judging whether at least one first candidate lane line meeting specified conditions exists in the candidate lane lines which are not traversed according to the specified endpoint position parameters of the current candidate lane line;
S3012: if yes, calculating the relation between the specified endpoint position parameter and the target endpoint position parameter of the first candidate lane line aiming at each first candidate lane line, judging whether the relation meets the set relation, and if yes, the first candidate lane line is the target candidate lane line.
Each candidate lane line has two end points, but performing the corresponding processing on only one end point of each candidate lane line is sufficient to determine the target candidate lane line, and it reduces the amount of calculation.
In step S3011, a first candidate lane line satisfying a specified condition is found from candidate lane lines that are not traversed by the specified endpoint position parameter, and step S3012 is performed for the first candidate lane line, so that the amount of calculation required for performing step S3012 can be reduced. Of course, the determination process may not be executed, and all of the candidate lane lines that have not been traversed may be determined as the first candidate lane line.
In step S3012, the relationship between the target endpoint position parameter and the designated endpoint position parameter of each first lane line candidate is calculated, and if the calculated relationship satisfies the set relationship, it indicates that the first lane line candidate is the target lane line candidate.
The designated endpoint of the current candidate lane line may be any one of two endpoints of the current candidate lane line, and when the designated endpoint is determined, the target endpoint position parameter of the first candidate lane line is also correspondingly determined.
Optionally, the designated end point is an end point with a smaller coordinate value in the designated direction of the current candidate lane line, and the target end point is an end point with a larger coordinate value in the designated direction of the target candidate lane line; or,
the specified end point is an end point with a larger coordinate value in the specified direction of the current candidate lane line, and the target end point is an end point with a smaller coordinate value in the specified direction of the target candidate lane line.
In one embodiment, the specified direction is a vertical direction or a horizontal direction in a coordinate system applied to the image. Of course, if the image is subjected to a transformation process (such as an inverse perspective transformation process) before step S300, the designated direction is a vertical direction or a horizontal direction in the coordinate system applied to the transformed image.
Referring to fig. 2 and 4, for example, L1 is the current candidate lane line and L2 is a first candidate lane line that has been found. P1 is the specified end point of L1 (the end point of the current candidate lane line with the larger coordinate value in the vertical direction of the coordinate system applied to the image), and correspondingly P2 is the target end point of L2 (the end point of the target candidate lane line with the smaller coordinate value in the vertical direction of that coordinate system). The relationship between the position parameters of P1 and P2 is calculated, and when the relationship satisfies the set relationship, L2 is the target candidate lane line.
In one embodiment, the position parameters of the end points comprise tangent vectors and normal vectors of the end points on the corresponding candidate lane lines;
in step S3012, the calculating a relationship between the specified endpoint location parameter and the target endpoint location parameter of the first candidate lane line includes:
S30121: calculating a first tangential distance obtained by projecting the first vector on a tangent vector of the specified end point, a first normal distance obtained by projecting the first vector on a normal vector of the specified end point, a second tangential distance obtained by projecting the second vector on a tangent vector of the target end point, and a second normal distance obtained by projecting the second vector on a normal vector of the target end point; the first vector is a vector from the specified end point to a target end point, and the second vector is a vector from the target end point to a specified end point;
S30122: determining the larger of the first tangential distance and the second tangential distance as a target tangential distance, and determining the larger of the first normal distance and the second normal distance as a target normal distance;
S30123: determining the target tangential distance and the target normal distance as a relationship between the specified endpoint location parameter and the target endpoint location parameter.
Referring to fig. 4 and 5, take the second tangential distance, obtained by projecting the second vector onto the tangent vector of the target end point, as an example: let v2 denote the second vector (i.e., the vector from the target end point P2 to the specified end point P1) and let n denote the tangent vector at the target end point; then P2Q1 is the second tangential distance obtained by projecting v2 onto n, where P1Q1 is perpendicular to P2Q1. The other distances are obtained similarly and are not described here again.
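In vector form, each of these projections is an absolute dot product with the corresponding unit tangent or normal vector, so the check of S30121-S30124 reduces to a few lines. A minimal numpy sketch (the threshold values are placeholders, not values from the patent):

```python
import numpy as np

def satisfies_set_relationship(p_spec, tan_spec, nor_spec,
                               p_tgt, tan_tgt, nor_tgt,
                               tan_thresh=20.0, nor_thresh=5.0):
    v1 = p_tgt - p_spec   # first vector: specified end point -> target end point
    v2 = -v1              # second vector: target end point -> specified end point
    target_tangential = max(abs(v1 @ tan_spec), abs(v2 @ tan_tgt))  # S30122
    target_normal = max(abs(v1 @ nor_spec), abs(v2 @ nor_tgt))
    # S30124: both target distances must be below their set thresholds.
    return target_tangential < tan_thresh and target_normal < nor_thresh
```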
In one embodiment, in step S3012, the determining whether the relationship satisfies the set relationship includes:
S30124: and when the target tangential distance is smaller than a set tangential threshold value and the target normal distance is smaller than a set normal threshold value, determining that the relationship meets the set relationship.
The target tangential distance is the larger of the first and second tangential distances, and the target normal distance is the larger of the first and second normal distances; when the larger one satisfies the set relationship, the smaller one inevitably satisfies it as well.
The specific values of the set tangential threshold and the set normal threshold are not limited, and can be determined according to the specific lane line condition.
Under normal conditions, only one target candidate lane line is determined from all the first candidate lane lines, that is, the relationship between the target endpoint position parameter and the designated endpoint position parameter of only one first candidate lane line meets the set relationship. Therefore, after the relationship between the target endpoint position parameter and the designated endpoint position parameter of each first candidate lane line is calculated, the target tangential distances or the target normal distances of all the first candidate lane lines can be ranked, and the relationship with the minimum target tangential distance or target normal distance is selected to be compared with the set relationship. Of course, the relationship corresponding to each first lane line candidate may be compared with the set relationship.
In one embodiment, in step S3011, determining whether there is at least one first candidate lane line meeting a specified condition in the candidate lane lines that are not traversed according to the specified endpoint location parameter of the current candidate lane line, includes the following steps:
S30111: determining a boundary for determining a search range according to the specified endpoint position parameter;
S30112: determining a search range required for searching for a first candidate lane line in the image according to the boundary;
S30113: and searching the candidate lane line in the search range in the image, and if the candidate lane line is searched, determining the searched candidate lane line as the first candidate lane line.
A boundary is determined according to the specified end point position parameter, so the search range determined from the boundary is more appropriate and first candidate lane lines are less likely to be missed; any candidate lane line found within the search range is determined as a first candidate lane line.
In one embodiment, the boundary is a normal to the specified endpoint on the current candidate lane line;
in step S30112, determining a search range required for searching for a first candidate lane line in the image according to the boundary, including the following steps:
determining a first region and a second region on both sides of the boundary in the image, wherein the current lane line candidate is located in the first region;
and determining the second area as the search range.
Continuing with fig. 2, take L1 as the current candidate lane line and P1 as the specified end point of L1. The normal of L1 at P1 is determined as the boundary, which divides the image into the first region where L1 is located and the second region where L1 is not located. The second region is determined as the search range, and candidate lane lines are searched for within it; whether a candidate lane line is located in the search range can be judged from the end points at its two ends. For example, the end points at both ends of L2 and of L3 are located in the search range, so L2 and L3 are found in the search range and are determined as first candidate lane lines.
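Geometrically, the normal at the specified end point splits the image plane into two half-planes, and a candidate lies in the search range when both of its end points fall on the far side of that line. A small sketch, under the assumption that the unit tangent at the specified end point is oriented away from the body of the current candidate lane line:

```python
import numpy as np

def in_search_range(candidate_endpoints, p_spec, t_out):
    # Boundary: the normal line through p_spec. A point q lies in the second
    # region (the search range) if it is on the outward side of that line,
    # i.e. the side the outward unit tangent t_out points to.
    return all((q - p_spec) @ t_out > 0 for q in candidate_endpoints)
```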
In one embodiment, after the clustering the candidate lane lines according to the relationship between the end point position parameters of the candidate lane lines in step S300, the method further includes the following steps:
S310: performing curve fitting processing on each clustered candidate lane line by using a preset third curve model; wherein the highest term order of the third curve model is greater than the highest term order of the second curve model;
in step S400, the determining the clustered lane line candidates as the lane lines detected from the image includes:
and determining each lane line candidate after performing curve fitting processing using the third curve model as a lane line detected from the image.
And executing curve fitting processing on the candidate lane lines belonging to the same category, wherein the curve fitting processing can also adopt a least square method curve fitting mode and the like, and the details are not repeated here. The third curve model may be, for example, a polynomial curve model, a logarithmic function model, a piecewise function model, or the like.
Specifically, the second curve model may be, for example: x = a·y² + b·y + c; the third curve model may be, for example: x = a·y³ + b·y² + c·y + d.
The highest term order of the second curve model is lower than that of the third curve model, which prevents clustering errors caused by excessively curved fits arising during the clustering iterations.
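With numpy, the two example models correspond to degree-2 and degree-3 least-squares fits of x on y. A brief sketch (the sample coordinates are made up for illustration):

```python
import numpy as np

# Made-up skeleton pixel coordinates of one clustered candidate lane line.
ys = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
xs = np.array([5.0, 5.8, 7.1, 9.0, 11.4])

# Second curve model, used during the clustering iterations: x = a*y^2 + b*y + c
a2, b2, c2 = np.polyfit(ys, xs, 2)

# Third curve model, used for the final fit: x = a*y^3 + b*y^2 + c*y + d
a3, b3, c3, d3 = np.polyfit(ys, xs, 3)
```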
In one embodiment, in step S100, the determining several candidate lane lines to be clustered in the image may include the following steps:
S101: detecting a plurality of candidate lane lines to be classified from the image according to a preset lane line detection mode;
S102: determining a corresponding category for each candidate lane line to be classified;
S103: and determining each candidate lane line after the category is determined as a plurality of candidate lane lines to be clustered.
In step S101, the preset lane line detection mode may be, for example, a lane line detection mode such as edge detection, feature extraction, and the like, and the candidate lane lines in the image are detected as candidate lane lines to be classified.
In step S102, a corresponding category is determined for each candidate lane line to be classified. For example, different candidate lane lines may be marked with different colors, each color representing one category; during the subsequent clustering, if two categories are merged into one, their colors are unified, for example by changing one of the two colors into the other.
It is to be understood that color marking is only one way of assigning categories to the candidate lane lines, and it is not limited thereto; marking in the image may not be required at all. For example, a category identifier (characters, numbers, and the like) may be set for each candidate lane line, and the candidate lane lines and their category identifiers stored correspondingly in a memory, so that the category identifier of a candidate lane line can be looked up from the candidate lane line itself. This likewise implements the determination of a corresponding category for each candidate lane line to be classified.
In S103, each candidate lane line after determining the category is determined as the plurality of candidate lane lines to be clustered, and then the subsequent steps S200 to S400 are performed.
In one embodiment, after the step S102 of determining the corresponding category for each candidate lane line to be classified, the method further includes the following steps:
S112: performing skeleton extraction processing on each candidate lane line to refine each candidate lane line;
in step S103, the determining each candidate lane line after determining the category as the plurality of candidate lane lines to be clustered includes:
and determining each refined candidate lane line as the plurality of candidate lane lines to be clustered.
Since the candidate lane lines detected in step S101 usually have a certain width, they contain many pixels, which would make the amount of computation in the subsequent fitting and other steps too large. Therefore, the skeleton extraction processing is performed on each candidate lane line in step S112, which greatly reduces the subsequent processing load.
The skeleton extraction processing may be performed on each candidate lane line, for example, by only retaining a single pixel point located in the middle of the candidate lane line in the width direction to obtain a refined candidate lane line. Of course, the present invention is not particularly limited to this, and for example, an edge in the longitudinal direction of the lane line candidate may be extracted. After the skeleton extraction process is performed, certain image processing, such as smoothing, may be performed on the candidate lane lines.
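One common way to obtain such a skeleton is morphological thinning, for example with scikit-image (an illustrative library choice; the patent does not prescribe one):

```python
import numpy as np
from skimage.morphology import skeletonize

# Boolean mask in which True marks the pixels of one candidate lane line.
mask = np.zeros((60, 60), dtype=bool)
mask[10:50, 20:26] = True              # a thick stripe as a toy candidate

skeleton = skeletonize(mask)           # one-pixel-wide skeleton line
ys, xs = np.nonzero(skeleton)          # skeleton coordinates for later fitting
```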
In one embodiment, after the skeleton extraction process is performed on each candidate lane line in step S112, the method further includes the following steps:
S122: performing inverse perspective transformation processing on each thinned candidate lane line so that the candidate lane line is under a target view angle in the image, wherein the target view angle is a view angle for overlooking the candidate lane line when the candidate lane line is collected;
in step S103, the determining each refined candidate lane line as the plurality of candidate lane lines to be clustered includes:
and determining each candidate lane line after the inverse perspective transformation processing as a plurality of candidate lane lines to be clustered.
After the refined candidate lane lines are subjected to the inverse perspective transformation processing, they are all at the target view angle, their size proportions accord with the real proportions at that view angle, no perspective distortion of the candidate lane lines occurs, and robustness is improved.
The inverse perspective transformation processing method is not limited. For example, a coordinate transformation relationship that transforms the candidate lane lines from the current view angle to the target view angle may be pre-established and stored in the electronic device; during calculation, the electronic device can directly call this coordinate transformation relationship to perform the inverse perspective transformation processing on each refined candidate lane line.
The coordinate transformation relationship for transforming from the current view angle to the target view angle may be established according to the focal length of the imaging device, the optical axis position parameter, the angle of the imaging device relative to the ground, the height parameter, and the like, and is not limited in particular.
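With a calibrated camera, such a view-angle change is typically realized as a homography. A minimal OpenCV sketch, where the four point correspondences are placeholders standing in for values derived from the focal length, optical axis position, camera angle, and height mentioned above:

```python
import numpy as np
import cv2

# Placeholder correspondences between the camera view and the top-down view.
src = np.float32([[300, 400], [340, 400], [100, 600], [540, 600]])
dst = np.float32([[200, 0], [440, 0], [200, 600], [440, 600]])

M = cv2.getPerspectiveTransform(src, dst)

# Map skeleton pixel coordinates (shape N x 1 x 2, float32) to the target view.
pts = np.float32([[[310, 450]], [[320, 500]]])
birdseye = cv2.perspectiveTransform(pts, M)
```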
In one embodiment, before the determining the corresponding category for each candidate lane line to be classified in step S102, the method further includes:
S111: executing expansion processing on each candidate lane line to be classified, so that invalid pixel values in the candidate lane lines are modified into valid pixel values;
S121: performing erosion processing on the candidate lane lines after the expansion processing so that the eroded candidate lane lines have the same size as the corresponding candidate lane lines before the expansion processing.
Some holes may exist in the candidate lane lines detected in step S101 (a hole is a pixel point or pixel block inside a candidate lane line whose pixel value differs from that of its neighboring pixels; the neighborhood may be 4-connected, 8-connected, etc.). The pixel value at a hole is an invalid pixel value and needs to be modified into a valid pixel value. However, detecting holes requires a large amount of processing; therefore, to reduce the processing load, in this embodiment the expansion processing is performed directly on each candidate lane line to be classified, without judging whether holes are present.
The way of the expansion treatment may be, for example: for each pixel in each candidate lane line, the pixel is expanded by a first radius in four directions, namely, the upper direction, the lower direction, the left direction and the right direction by taking the pixel as the center (namely, the pixel values of the pixels in the range of the first radius are all modified into the pixel value of the expanded pixel). Overall, the lane line candidates become larger in both length and width by two first radii.
In order to restore the original length and width of the candidate lane lines, erosion processing is performed on the expanded candidate lane lines in step S121. The erosion processing may, for example, erode the edge of each expanded candidate lane line inward by the first radius (i.e., modify the pixel values of pixels within the first radius of the edge to the pixel value of the pixels outside the candidate lane line adjacent to its edge), so that the eroded candidate lane line has the same size as the corresponding candidate lane line before expansion.
After the expansion and erosion processing, holes in the candidate lane lines have been filled with the pixel values of their neighboring pixels.
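This expansion-then-erosion sequence is the standard morphological closing operation. A minimal OpenCV sketch (the kernel size, i.e. the first radius, is a placeholder):

```python
import numpy as np
import cv2

# Binary mask of a candidate lane line (255 = lane pixel) with a small hole.
mask = np.zeros((40, 40), dtype=np.uint8)
mask[10:30, 18:24] = 255
mask[19, 20] = 0                        # an invalid pixel (hole) inside the line

radius = 2                              # the "first radius"; placeholder value
kernel = np.ones((2 * radius + 1, 2 * radius + 1), dtype=np.uint8)

dilated = cv2.dilate(mask, kernel)      # expansion: fills the hole
restored = cv2.erode(dilated, kernel)   # erosion: restores the original size
```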
In one embodiment, after the clustering the candidate lane lines according to the relationship between the end point position parameters of the candidate lane lines in step S300, the method further includes the following steps:
S320: performing curve fitting processing on each clustered candidate lane line by using a preset curve model;
in step S400, the determining the clustered lane line candidates as the lane lines detected from the image includes:
and determining each candidate lane line subjected to curve fitting processing by using the preset curve model as the lane line detected from the image.
In the clustering process of step S300, the curve fitting process may not be performed, but after the clustering is finished, the curve fitting process may be performed on the candidate lane lines belonging to the same category.
The curve fitting process in step S320 may also adopt a least square method curve fitting or the like, which is not described herein again. The predetermined curve model may be, for example, a polynomial curve model, a logarithmic function model, a piecewise function model, or the like.
Specifically, the preset curve model may be, for example: x = a·y³ + b·y² + c·y + d.
In one embodiment, after determining the clustered candidate lane lines as the lane lines detected from the image in step S400, the method further comprises the following steps:
S501: calculating a designated characteristic value of each detected lane line;
S502: judging whether the specified characteristic value is in a set value range or not;
S503: if not, deleting the lane line from all the detected lane lines.
Preferably, the specified characteristic value includes at least one of the following parameters:
curvature of the lane line;
slope of the lane line;
the width of the lane line.
It is to be understood that the specified feature values are not limited to the above three types, and may be other parameters as long as the shape features of the lane lines can be characterized. The set value range can also be determined according to prior knowledge, and is not particularly limited, and the set value ranges corresponding to different characteristic values can also be different.
By deleting detected lane lines whose specified feature values, such as curvature, slope, and width, do not conform to reality, robustness is improved.
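For a fitted polynomial model x = f(y), the slope and curvature follow from the first and second derivatives (curvature = |f''| / (1 + f'²)^(3/2)), so the check can be written directly. A sketch with placeholder value ranges:

```python
import numpy as np

def keep_lane_line(coeffs, y_ref, max_curvature=0.01, max_slope=2.0):
    # coeffs: polynomial coefficients of x = f(y); y_ref: evaluation point.
    d1 = np.polyval(np.polyder(coeffs, 1), y_ref)   # slope dx/dy
    d2 = np.polyval(np.polyder(coeffs, 2), y_ref)   # second derivative
    curvature = abs(d2) / (1.0 + d1 ** 2) ** 1.5
    # Delete the lane line if any specified feature value is out of range.
    return curvature < max_curvature and abs(d1) < max_slope
```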
Based on the same concept as the above lane line detection method, referring to fig. 6, an electronic device 500 includes: a memory 501 and a processor 502 (e.g., one or more processors). The specific type of electronic device is not limited, and the electronic device may be, but is not limited to, an imaging device. The electronic device may also be, for example, a device electrically connected to the imaging device, and may acquire an image captured by the imaging device and perform the corresponding method.
In one embodiment, the memory is to store program code;
the processor, configured to invoke the program code, when the program code is executed, is configured to perform the following:
determining a plurality of candidate lane lines to be clustered in the image;
determining the position parameters of the end points of the candidate lane lines in the image;
clustering the candidate lane lines according to the relation between the end point position parameters of the candidate lane lines;
and determining the clustered candidate lane lines as the lane lines detected from the images.
Preferably, the processor is specifically configured to, when determining the position parameter of the end point of each candidate lane line in the image:
performing curve fitting processing on each candidate lane line in the image by using a preset first curve model, and calculating position parameters of end points of the fitted candidate lane lines;
and the position parameters of the end points comprise tangent vectors and normal vectors of the end points on the corresponding fitted candidate lane lines.
Preferably, the processor is specifically configured to, when clustering the candidate lane lines according to the relationship between the end point position parameters of the candidate lane lines:
traversing each candidate lane line according to a specified sequence, and judging whether a target candidate lane line needing to be clustered and merged with the current candidate lane line exists in the candidate lane lines which are not traversed according to the relation between the end point position parameters of the current candidate lane line which is traversed and the candidate lane lines which are not traversed;
if yes, determining the current candidate lane line and the target candidate lane line as belonging to the same category, performing curve fitting processing on the current candidate lane line and the target candidate lane line in the image by using a preset second curve model to obtain a fitted candidate lane line, calculating end point position parameters of the fitted candidate lane line, and returning to the step of traversing each candidate lane line according to the specified sequence.
Preferably, the processor, when determining whether there is a target candidate lane line to be clustered and merged with the current candidate lane line in the candidate lane lines not traversed according to the relationship between the end point position parameters of the current candidate lane line traversed and the candidate lane lines not traversed, is specifically configured to:
judging whether at least one first candidate lane line meeting specified conditions exists in the candidate lane lines which are not traversed according to the specified endpoint position parameters of the current candidate lane line;
if yes, calculating the relation between the specified endpoint position parameter and the target endpoint position parameter of the first candidate lane line aiming at each first candidate lane line, judging whether the relation meets the set relation, and if yes, the first candidate lane line is the target candidate lane line.
Preferably, the position parameters of the end points include tangent vectors and normal vectors of the end points on the corresponding candidate lane lines;
the processor, when calculating the relationship between the specified endpoint location parameter and the target endpoint location parameter of the first candidate lane line, is specifically configured to:
calculating a first tangential distance obtained by projecting the first vector on a tangent vector of the specified end point, a first normal distance obtained by projecting the first vector on a normal vector of the specified end point, a second tangential distance obtained by projecting the second vector on a tangent vector of the target end point, and a second normal distance obtained by projecting the second vector on a normal vector of the target end point; the first vector is a vector from the specified end point to a target end point, and the second vector is a vector from the target end point to a specified end point;
determining the larger of the first tangential distance and the second tangential distance as a target tangential distance, and determining the larger of the first normal distance and the second normal distance as a target normal distance;
determining the target tangential distance and the target normal distance as a relationship between the specified endpoint location parameter and the target endpoint location parameter.
Preferably, the processor is specifically configured to, when determining whether the relationship satisfies the set relationship:
and when the target tangential distance is smaller than a set tangential threshold value and the target normal distance is smaller than a set normal threshold value, determining that the relationship meets the set relationship.
Preferably, the processor is specifically configured to, when determining, according to the specified endpoint location parameter of the current candidate lane line, whether at least one first candidate lane line meeting a specified condition exists in the candidate lane lines that are not traversed:
determining a boundary for determining a search range according to the specified endpoint position parameter;
determining a search range required for searching for a first candidate lane line in the image according to the boundary;
and searching the candidate lane line in the search range in the image, and if the candidate lane line is searched, determining the searched candidate lane line as the first candidate lane line.
Preferably, the boundary is a normal of the designated endpoint on the current candidate lane line;
when determining a search range required for searching for a first candidate lane line in the image according to the boundary, the method is specifically configured to:
determining a first region and a second region on both sides of the boundary in the image, wherein the current lane line candidate is located in the first region;
and determining the second area as the search range.
Preferably,
the specified end point is an end point with a smaller coordinate value in the specified direction of the current candidate lane line, and the target end point is an end point with a larger coordinate value in the specified direction of the target candidate lane line; or,
the specified end point is an end point with a larger coordinate value in the specified direction of the current candidate lane line, and the target end point is an end point with a smaller coordinate value in the specified direction of the target candidate lane line.
Preferably,
the specified direction is a vertical direction or a horizontal direction in a coordinate system applied to the image.
Preferably, after clustering the candidate lane lines according to the relationship between the end point position parameters of the candidate lane lines, the processor is further configured to:
performing curve fitting processing on each clustered candidate lane line by using a preset third curve model; wherein the highest term order of the third curve model is greater than the highest term order of the second curve model;
the processor, when determining the clustered lane line candidates as the lane lines detected from the image, is specifically configured to:
and determining each lane line candidate after performing curve fitting processing using the third curve model as a lane line detected from the image.
Preferably,
the designated sequence is the sequence of the lengths of the candidate lane lines from long to short.
Preferably, the processor is specifically configured to, when determining a plurality of candidate lane lines to be clustered in the image:
detecting a plurality of candidate lane lines to be classified from the image according to a preset lane line detection mode;
determining a corresponding category for each candidate lane line to be classified;
and determining each candidate lane line after the category is determined as a plurality of candidate lane lines to be clustered.
Preferably, after determining the corresponding category for each candidate lane line to be classified, the processor is further configured to:
performing skeleton extraction processing on each candidate lane line to refine each candidate lane line;
the processor is specifically configured to, when determining each candidate lane line after determining the category as the plurality of candidate lane lines to be clustered:
and determining each refined candidate lane line as the plurality of candidate lane lines to be clustered.
Preferably, after the processor performs the skeleton extraction process on each candidate lane line, the processor is further configured to:
performing inverse perspective transformation processing on each refined candidate lane line so that the candidate lane line is under a target view angle in the image, wherein the target view angle is the view angle of looking down on the candidate lane line when the candidate lane line is captured;
the processor is specifically configured to, when determining each refined candidate lane line as the plurality of candidate lane lines to be clustered:
and determining each candidate lane line after the inverse perspective transformation processing as a plurality of candidate lane lines to be clustered.
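A sketch of the inverse perspective transformation using OpenCV; the four source/destination point pairs depend on the camera mounting and calibration and are placeholders here:

```python
import cv2
import numpy as np

def to_birds_eye(img, src_pts, dst_pts, out_size):
    """Warp the image so the lane lines appear as if viewed from directly
    above. src_pts: four road-plane points in the camera image; dst_pts:
    their positions in the top-down view; out_size: (width, height)."""
    matrix = cv2.getPerspectiveTransform(np.float32(src_pts), np.float32(dst_pts))
    return cv2.warpPerspective(img, matrix, out_size)
```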
Preferably, before the processor determines the corresponding category for each candidate lane line to be classified, the processor is further configured to:
performing dilation processing on each candidate lane line to be classified, so that invalid pixel values in the candidate lane lines are modified into valid pixel values;
performing erosion processing on the candidate lane lines after the dilation processing, so that the eroded candidate lane lines have the same size as the corresponding candidate lane lines before the dilation processing.
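This dilation-followed-by-erosion is a morphological closing; a sketch with an assumed 5x5 kernel:

```python
import cv2
import numpy as np

def fill_holes(mask, ksize=5):
    """Dilate then erode a binary mask with the same kernel: dilation turns
    invalid pixels inside a candidate into valid ones, and the matching
    erosion restores the candidate to its original size."""
    kernel = np.ones((ksize, ksize), np.uint8)
    return cv2.erode(cv2.dilate(mask, kernel), kernel)
```

The single call cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel) is equivalent.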
Preferably, after clustering the candidate lane lines according to the relationship between the end point position parameters of the candidate lane lines, the processor is further configured to:
performing curve fitting processing on each clustered candidate lane line by using a preset curve model;
the processor, when determining the clustered lane line candidates as the lane lines detected from the image, is specifically configured to:
and determining each candidate lane line subjected to curve fitting processing by using the preset curve model as the lane line detected from the image.
Preferably, after the processor determines the clustered lane line candidates as the lane lines detected from the image, the processor is further configured to:
calculating a specified characteristic value of each detected lane line;
judging whether the specified characteristic value is within a set value range;
if not, deleting the lane line from all the detected lane lines.
Preferably, the specified characteristic value includes at least one of the following parameters:
curvature of the lane line;
slope of the lane line;
the width of the lane line.
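As an illustration of this post-filtering, here is a sketch that checks the curvature and slope of a fitted polynomial at a chosen x (the value ranges are application-specific placeholders; a width check would operate on the pre-thinning mask and is omitted):

```python
import numpy as np

def keep_lane_line(coeffs, x_eval, curv_range, slope_range):
    """Keep a fitted lane line y = np.polyval(coeffs, x) only if its
    curvature and slope at x_eval fall within the set value ranges."""
    d1 = np.polyval(np.polyder(coeffs, 1), x_eval)   # slope y'(x)
    d2 = np.polyval(np.polyder(coeffs, 2), x_eval)   # y''(x)
    curvature = abs(d2) / (1.0 + d1 * d1) ** 1.5     # |y''| / (1 + y'^2)^(3/2)
    return (curv_range[0] <= curvature <= curv_range[1]
            and slope_range[0] <= d1 <= slope_range[1])
```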
Based on the same inventive concept as the above method, the present invention further provides a computer-readable storage medium, wherein computer instructions are stored on the computer-readable storage medium, and when the computer instructions are executed, the lane line detection method according to the foregoing embodiment is implemented.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by an article of manufacture with certain functionality. A typical implementation device is a computer, which may take the form of a personal computer, laptop computer, cellular telephone, camera phone, smart phone, personal digital assistant, media player, navigation device, email messaging device, game console, tablet computer, wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, each described separately. Of course, when implementing the invention, the functions of the units may be implemented in one or more pieces of software and/or hardware.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Furthermore, these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only an example of the present invention, and is not intended to limit the present invention. Various modifications and alterations to this invention will become apparent to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (39)

  1. A lane line detection method, characterized by comprising the following steps:
    determining a plurality of candidate lane lines to be clustered in the image;
    determining the position parameters of the end points of the candidate lane lines in the image;
    clustering the candidate lane lines according to the relation between the end point position parameters of the candidate lane lines;
    and determining the clustered candidate lane lines as the lane lines detected from the images.
  2. The lane line detection method of claim 1, wherein said determining a location parameter of an end point of each candidate lane line in the image comprises:
    performing curve fitting processing on each candidate lane line in the image by using a preset first curve model, and calculating position parameters of end points of the fitted candidate lane lines;
    and the position parameters of the end points comprise tangent vectors and normal vectors of the end points on the corresponding fitted candidate lane lines.
  3. The lane line detection method according to claim 1, wherein clustering the lane line candidates according to the relationship between the end point position parameters of the respective lane line candidates comprises:
    traversing each candidate lane line according to a specified sequence, and judging whether a target candidate lane line needing to be clustered and merged with the current candidate lane line exists in the candidate lane lines which are not traversed according to the relation between the end point position parameters of the current candidate lane line which is traversed and the candidate lane lines which are not traversed;
    if yes, determining the current candidate lane line and the target candidate lane line as belonging to the same category, performing curve fitting processing on the current candidate lane line and the target candidate lane line in the image by using a preset second curve model to obtain a fitted candidate lane line, calculating end point position parameters of the fitted candidate lane line, and returning to the step of traversing each candidate lane line according to the specified sequence.
  4. The lane line detection method according to claim 3, wherein the determining whether there is a target candidate lane line to be clustered and merged with the current candidate lane line in the non-traversed candidate lane lines according to a relationship between end point position parameters of the traversed current candidate lane line and the non-traversed candidate lane lines comprises:
    judging whether at least one first candidate lane line meeting specified conditions exists in the candidate lane lines which are not traversed according to the specified endpoint position parameters of the current candidate lane line;
    if yes, calculating, for each first candidate lane line, the relation between the specified endpoint position parameter and the target endpoint position parameter of the first candidate lane line, and judging whether the relation meets the set relation; if so, the first candidate lane line is the target candidate lane line.
  5. The lane line detection method according to claim 4, wherein the position parameters of the end points include tangent vectors and normal vectors of the end points on the corresponding candidate lane lines;
    the calculating a relationship between the specified endpoint location parameter and the target endpoint location parameter of the first candidate lane line includes:
    calculating a first tangential distance obtained by projecting the first vector on a tangent vector of the specified end point, a first normal distance obtained by projecting the first vector on a normal vector of the specified end point, a second tangential distance obtained by projecting the second vector on a tangent vector of the target end point, and a second normal distance obtained by projecting the second vector on a normal vector of the target end point; the first vector is a vector from the specified end point to the target end point, and the second vector is a vector from the target end point to the specified end point;
    determining the larger of the first tangential distance and the second tangential distance as a target tangential distance, and determining the larger of the first normal distance and the second normal distance as a target normal distance;
    determining the target tangential distance and the target normal distance as a relationship between the specified endpoint location parameter and the target endpoint location parameter.
  6. The lane line detection method of claim 5, wherein the judging whether the relationship meets the set relationship comprises:
    and when the target tangential distance is smaller than a set tangential threshold value and the target normal distance is smaller than a set normal threshold value, determining that the relationship meets the set relationship.
  7. The lane line detection method of claim 4, wherein determining whether at least one first candidate lane line satisfying a specified condition exists in the candidate lane lines that are not traversed according to the specified endpoint location parameter of the current candidate lane line comprises:
    determining a boundary for determining a search range according to the specified endpoint position parameter;
    determining a search range required for searching for a first candidate lane line in the image according to the boundary;
    and searching for a candidate lane line within the search range in the image, and if a candidate lane line is found, determining the found candidate lane line as the first candidate lane line.
  8. The lane line detection method of claim 7, wherein the boundary is a normal to the specified endpoint on the current candidate lane line;
    the determining a search range required for searching for a first candidate lane line in the image according to the boundary comprises:
    determining a first region and a second region on both sides of the boundary in the image, wherein the current candidate lane line is located in the first region;
    and determining the second area as the search range.
  9. The lane line detection method according to claim 4,
    the specified end point is an end point with a smaller coordinate value in the specified direction of the current candidate lane line, and the target end point is an end point with a larger coordinate value in the specified direction of the target candidate lane line; or,
    the specified end point is an end point with a larger coordinate value in the specified direction of the current candidate lane line, and the target end point is an end point with a smaller coordinate value in the specified direction of the target candidate lane line.
  10. The lane line detecting method according to claim 9,
    the specified direction is a vertical direction or a horizontal direction in a coordinate system applied to the image.
  11. The lane line detection method according to claim 3, wherein after clustering the lane line candidates according to the relationship between the end point position parameters of the respective lane line candidates, the method further comprises:
    performing curve fitting processing on each clustered candidate lane line by using a preset third curve model; wherein the highest term order of the third curve model is greater than the highest term order of the second curve model;
    the determining the clustered candidate lane lines as the lane lines detected from the image includes:
    and determining each lane line candidate after performing curve fitting processing using the third curve model as a lane line detected from the image.
  12. The lane line detecting method according to claim 3,
    the specified sequence is the order of the candidate lane lines from longest to shortest.
  13. The lane line detection method of any one of claims 1-12, wherein the determining a plurality of candidate lane lines to be clustered in the image comprises:
    detecting a plurality of candidate lane lines to be classified from the image according to a preset lane line detection method;
    determining a corresponding category for each candidate lane line to be classified;
    and determining each candidate lane line after the category is determined as a plurality of candidate lane lines to be clustered.
  14. The lane line detection method of claim 13, wherein after determining the corresponding category for each lane line candidate to be classified, the method further comprises:
    performing skeleton extraction processing on each candidate lane line to refine each candidate lane line;
    the determining, as the plurality of candidate lane lines to be clustered, each candidate lane line after the category is determined includes:
    and determining each refined candidate lane line as the plurality of candidate lane lines to be clustered.
  15. The lane line detection method of claim 14, wherein after performing the skeleton extraction process on each lane line candidate, the method further comprises:
    performing inverse perspective transformation processing on each refined candidate lane line so that the candidate lane line is under a target view angle in the image, wherein the target view angle is the view angle of looking down on the candidate lane line when the candidate lane line is captured;
    the determining each refined candidate lane line as the plurality of candidate lane lines to be clustered includes:
    and determining each candidate lane line after the inverse perspective transformation processing as a plurality of candidate lane lines to be clustered.
  16. The lane line detection method of claim 13, wherein before determining the corresponding category for each lane line candidate to be classified, the method further comprises:
    performing dilation processing on each candidate lane line to be classified, so that invalid pixel values in the candidate lane lines are modified into valid pixel values;
    performing erosion processing on the candidate lane lines after the dilation processing, so that the eroded candidate lane lines have the same size as the corresponding candidate lane lines before the dilation processing.
  17. The lane line detection method according to claim 1, wherein after clustering the lane line candidates according to the relationship between the end point position parameters of the respective lane line candidates, the method further comprises:
    performing curve fitting processing on each clustered candidate lane line by using a preset curve model;
    the determining the clustered candidate lane lines as the lane lines detected from the image includes:
    and determining each candidate lane line subjected to curve fitting processing by using the preset curve model as the lane line detected from the image.
  18. The lane line detection method according to claim 1, wherein after determining the clustered lane line candidates as the lane lines detected from the image, the method further comprises:
    calculating a specified characteristic value of each detected lane line;
    judging whether the specified characteristic value is within a set value range;
    if not, deleting the lane line from all the detected lane lines.
  19. The lane line detection method of claim 18, wherein the specified characteristic value comprises at least one of the following parameters:
    curvature of the lane line;
    slope of the lane line;
    the width of the lane line.
  20. An electronic device, comprising: a memory and a processor;
    the memory being configured to store program code;
    the processor being configured to invoke the program code and, when the program code is executed, to perform the following operations:
    determining a plurality of candidate lane lines to be clustered in the image;
    determining the position parameters of the end points of the candidate lane lines in the image;
    clustering the candidate lane lines according to the relation between the end point position parameters of the candidate lane lines;
    and determining the clustered candidate lane lines as the lane lines detected from the images.
  21. The device of claim 20, wherein the processor, in determining the location parameters of the end points of each candidate lane line in the image, is specifically configured to:
    performing curve fitting processing on each candidate lane line in the image by using a preset first curve model, and calculating position parameters of end points of the fitted candidate lane lines;
    and the position parameters of the end points comprise tangent vectors and normal vectors of the end points on the corresponding fitted candidate lane lines.
  22. The device of claim 20, wherein the processor, when clustering the candidate lane lines according to the relation between the end point position parameters of the candidate lane lines, is specifically configured to:
    traversing each candidate lane line according to a specified sequence, and judging whether a target candidate lane line needing to be clustered and merged with the current candidate lane line exists in the candidate lane lines which are not traversed according to the relation between the end point position parameters of the current candidate lane line which is traversed and the candidate lane lines which are not traversed;
    if yes, determining the current candidate lane line and the target candidate lane line as belonging to the same category, performing curve fitting processing on the current candidate lane line and the target candidate lane line in the image by using a preset second curve model to obtain a fitted candidate lane line, calculating end point position parameters of the fitted candidate lane line, and returning to the step of traversing each candidate lane line according to the specified sequence.
  23. The device according to claim 22, wherein the processor, when determining whether there is a target candidate lane line to be clustered and merged with the current candidate lane line in the non-traversed candidate lane lines according to a relationship between end point position parameters of the traversed current candidate lane line and the non-traversed candidate lane lines, is specifically configured to:
    judging whether at least one first candidate lane line meeting specified conditions exists in the candidate lane lines which are not traversed according to the specified endpoint position parameters of the current candidate lane line;
    if yes, calculating, for each first candidate lane line, the relation between the specified endpoint position parameter and the target endpoint position parameter of the first candidate lane line, and judging whether the relation meets the set relation; if so, the first candidate lane line is the target candidate lane line.
  24. The device of claim 23, wherein the position parameters of the end points comprise tangent vectors and normal vectors of the end points on the corresponding candidate lane lines;
    the processor, when calculating the relationship between the specified endpoint location parameter and the target endpoint location parameter of the first candidate lane line, is specifically configured to:
    calculating a first tangential distance obtained by projecting the first vector on a tangent vector of the specified end point, a first normal distance obtained by projecting the first vector on a normal vector of the specified end point, a second tangential distance obtained by projecting the second vector on a tangent vector of the target end point, and a second normal distance obtained by projecting the second vector on a normal vector of the target end point; the first vector is a vector from the specified end point to the target end point, and the second vector is a vector from the target end point to the specified end point;
    determining the larger of the first tangential distance and the second tangential distance as a target tangential distance, and determining the larger of the first normal distance and the second normal distance as a target normal distance;
    determining the target tangential distance and the target normal distance as a relationship between the specified endpoint location parameter and the target endpoint location parameter.
  25. The device of claim 24, wherein the processor, when judging whether the relationship meets the set relationship, is specifically configured to:
    and when the target tangential distance is smaller than a set tangential threshold value and the target normal distance is smaller than a set normal threshold value, determining that the relationship meets the set relationship.
  26. The device according to claim 23, wherein the processor, when determining whether there is at least one first candidate lane line satisfying a specified condition in the candidate lane lines that are not traversed according to the specified endpoint location parameter of the current candidate lane line, is specifically configured to:
    determining a boundary for determining a search range according to the specified endpoint position parameter;
    determining a search range required for searching for a first candidate lane line in the image according to the boundary;
    and searching for a candidate lane line within the search range in the image, and if a candidate lane line is found, determining the found candidate lane line as the first candidate lane line.
  27. The device of claim 26, wherein the boundary is a normal to the specified endpoint on the current candidate lane line;
    the processor, when determining a search range required for searching for the first candidate lane line in the image according to the boundary, is specifically configured to:
    determining a first region and a second region on both sides of the boundary in the image, wherein the current candidate lane line is located in the first region;
    and determining the second area as the search range.
  28. The device of claim 23,
    the specified end point is an end point with a smaller coordinate value in the specified direction of the current candidate lane line, and the target end point is an end point with a larger coordinate value in the specified direction of the target candidate lane line; or,
    the specified end point is an end point with a larger coordinate value in the specified direction of the current candidate lane line, and the target end point is an end point with a smaller coordinate value in the specified direction of the target candidate lane line.
  29. The device of claim 28,
    the specified direction is a vertical direction or a horizontal direction in a coordinate system applied to the image.
  30. The device of claim 22, wherein the processor, after clustering the candidate lane lines according to the relationship between the endpoint location parameters of the respective candidate lane lines, is further configured to:
    performing curve fitting processing on each clustered candidate lane line by using a preset third curve model; wherein the highest term order of the third curve model is greater than the highest term order of the second curve model;
    the processor, when determining the clustered lane line candidates as the lane lines detected from the image, is specifically configured to:
    and determining each lane line candidate after performing curve fitting processing using the third curve model as a lane line detected from the image.
  31. The device of claim 22,
    the specified sequence is the order of the candidate lane lines from longest to shortest.
  32. The device according to any one of claims 20-31, wherein the processor is specifically configured to, when determining a plurality of candidate lane lines to be clustered in the image:
    detecting a plurality of candidate lane lines to be classified from the image according to a preset lane line detection method;
    determining a corresponding category for each candidate lane line to be classified;
    and determining each candidate lane line after the category is determined as a plurality of candidate lane lines to be clustered.
  33. The device of claim 32, wherein after the processor determines a corresponding category for each candidate lane line to be classified, the processor is further configured to:
    performing skeleton extraction processing on each candidate lane line to refine each candidate lane line;
    the processor is specifically configured to, when determining each candidate lane line after determining the category as the plurality of candidate lane lines to be clustered:
    and determining each refined candidate lane line as the plurality of candidate lane lines to be clustered.
  34. The device of claim 33, wherein after the processor performs the skeleton extraction process on each candidate lane line, the processor is further configured to:
    performing inverse perspective transformation processing on each refined candidate lane line so that the candidate lane line is under a target view angle in the image, wherein the target view angle is the view angle of looking down on the candidate lane line when the candidate lane line is captured;
    the processor is specifically configured to, when determining each refined candidate lane line as the plurality of candidate lane lines to be clustered:
    and determining each candidate lane line after the inverse perspective transformation processing as a plurality of candidate lane lines to be clustered.
  35. The device of claim 32, wherein the processor, prior to determining the corresponding category for each candidate lane line to be classified, is further configured to:
    performing dilation processing on each candidate lane line to be classified, so that invalid pixel values in the candidate lane lines are modified into valid pixel values;
    performing erosion processing on the candidate lane lines after the dilation processing, so that the eroded candidate lane lines have the same size as the corresponding candidate lane lines before the dilation processing.
  36. The device of claim 20, wherein the processor, after clustering the candidate lane lines according to the relationship between the endpoint location parameters of the respective candidate lane lines, is further configured to:
    performing curve fitting processing on each clustered candidate lane line by using a preset curve model;
    the processor, when determining the clustered lane line candidates as the lane lines detected from the image, is specifically configured to:
    and determining each candidate lane line subjected to curve fitting processing by using the preset curve model as the lane line detected from the image.
  37. The device of claim 20, wherein after the processor determines the clustered lane line candidates as the lane lines detected from the image, the processor is further configured to:
    calculating a specified characteristic value of each detected lane line;
    judging whether the specified characteristic value is within a set value range;
    if not, deleting the lane line from all the detected lane lines.
  38. The device of claim 37, wherein the specified characteristic value comprises at least one of the following parameters:
    curvature of the lane line;
    slope of the lane line;
    the width of the lane line.
  39. A computer-readable storage medium, characterized in that,
    the computer-readable storage medium having stored thereon computer instructions that, when executed, implement the lane line detection method of any of claims 1-19.
CN201880068401.7A 2018-11-29 2018-11-29 Lane line detection method, lane line detection apparatus, and computer-readable storage medium Pending CN111433780A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/118186 WO2020107326A1 (en) 2018-11-29 2018-11-29 Lane line detection method, device and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN111433780A true CN111433780A (en) 2020-07-17

Family

ID=70852683

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880068401.7A Pending CN111433780A (en) 2018-11-29 2018-11-29 Lane line detection method, lane line detection apparatus, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN111433780A (en)
WO (1) WO2020107326A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113223113B (en) * 2021-04-30 2024-04-19 阿波罗智联(北京)科技有限公司 Lane line processing method and device, electronic equipment and cloud control platform
CN113673438A (en) * 2021-08-23 2021-11-19 上海商汤临港智能科技有限公司 Collision early warning method and device, electronic equipment and storage medium
CN116563814A (en) * 2022-01-28 2023-08-08 灵动科技(北京)有限公司 Autonomous mobile robot and method for detecting lane lines by using same
CN114625823A (en) * 2022-03-02 2022-06-14 阿波罗智联(北京)科技有限公司 Lane line data processing method, device, equipment and storage medium
CN115797896B (en) * 2023-01-30 2023-05-09 智道网联科技(北京)有限公司 Lane line clustering method, equipment and computer readable storage medium
CN116030286B (en) * 2023-03-29 2023-06-16 高德软件有限公司 Boundary lane line matching method and device, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017041396A1 (en) * 2015-09-10 2017-03-16 百度在线网络技术(北京)有限公司 Driving lane data processing method, device, storage medium and apparatus
CN108052880A (en) * 2017-11-29 2018-05-18 南京大学 Traffic monitoring scene actual situation method for detecting lane lines

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4744537B2 (en) * 2008-02-05 2011-08-10 日立オートモティブシステムズ株式会社 Driving lane detector
CN104008387B (en) * 2014-05-19 2017-02-15 山东科技大学 Lane line detection method based on feature point piecewise linear fitting
CN105320927B (en) * 2015-03-25 2018-11-23 中科院微电子研究所昆山分所 Method for detecting lane lines and system
CN105740782B (en) * 2016-01-25 2019-02-22 北京航空航天大学 A kind of driver's lane-change course quantization method based on monocular vision


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113449629A (en) * 2021-06-25 2021-09-28 重庆卡佐科技有限公司 Lane line false and true identification device, method, equipment and medium based on driving video
CN113449629B (en) * 2021-06-25 2022-10-28 重庆卡佐科技有限公司 Lane line false and true identification device, method, equipment and medium based on driving video
CN114708569A (en) * 2022-02-22 2022-07-05 广州文远知行科技有限公司 Road curve detection method, device, equipment and storage medium
CN116304142A (en) * 2023-05-12 2023-06-23 智道网联科技(北京)有限公司 Point cloud data acquisition method, device, equipment and storage medium
CN116304142B (en) * 2023-05-12 2023-08-08 智道网联科技(北京)有限公司 Point cloud data acquisition method, device, equipment and storage medium

Also Published As

Publication number Publication date
WO2020107326A1 (en) 2020-06-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200717