
CN108416044B - Scene thumbnail generation method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN108416044B
CN108416044B (application CN201810213001.2A)
Authority
CN
China
Prior art keywords
clustering
point
points
fitting
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810213001.2A
Other languages
Chinese (zh)
Other versions
CN108416044A (en)
Inventor
刘青
胡祝青
卢彦斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zebred Network Technology Co Ltd
Original Assignee
Zebred Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zebred Network Technology Co Ltd filed Critical Zebred Network Technology Co Ltd
Priority to CN201810213001.2A priority Critical patent/CN108416044B/en
Publication of CN108416044A publication Critical patent/CN108416044A/en
Application granted granted Critical
Publication of CN108416044B publication Critical patent/CN108416044B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G06F 18/232 Non-hierarchical techniques
    • G06F 18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/23213 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Databases & Information Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention discloses a method and a device for generating a scene thumbnail, electronic equipment and a storage medium. The method comprises the following steps: determining three-dimensional track points of each acquisition device in a target scene according to image data and motion data acquired by each acquisition device while moving in the target scene; acquiring two-dimensional coordinate points and direction angles of the three-dimensional track points; clustering the two-dimensional coordinate points according to the distances between the two-dimensional coordinate points and their direction angles to obtain clustering points; and performing curve fitting on the clustering points to generate a thumbnail of the target scene. In this way, a thumbnail can be generated for a target scene that is inconvenient to photograph from the air or is not covered by GPS; the method is simple, the computational load is small, and the thumbnail can be generated rapidly.

Description

Scene thumbnail generation method and device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of map acquisition, in particular to a method and a device for generating a scene thumbnail, electronic equipment and a storage medium.
Background
With the rapid development of network technology, communication technology and geographic information system technology, acquiring a thumbnail of a target scene in real time has become very important. For example, when a vehicle enters a parking lot or a residential community, the layout of the parking lot or community can be grasped quickly from its thumbnail, which helps the vehicle run safely and helps the user find the destination quickly.
In the prior art, the main methods for acquiring a target scene thumbnail are: acquiring an image of the target scene through remote sensing or aerial photography and extracting the thumbnail from the photographed image; or collecting GPS (Global Positioning System) positioning data of a large number of moving objects in the target scene and generating the thumbnail by processing the GPS positioning data.
However, these techniques are costly, and they cannot obtain thumbnails of places that are inconvenient to photograph from the air, such as underground parking lots, or places not covered by GPS.
Disclosure of Invention
The embodiment of the invention provides a method and a device for generating a scene thumbnail, electronic equipment and a storage medium, aiming to solve the prior-art problems of high cost and the inability to obtain thumbnails of places that are inconvenient to photograph from the air, such as underground parking lots, or places not covered by GPS (Global Positioning System).
In a first aspect, an embodiment of the present invention provides a method for generating a scene thumbnail, including:
determining three-dimensional track points of each acquisition device in a target scene according to image data and motion data acquired by each acquisition device while moving in the target scene;
acquiring two-dimensional coordinate points and direction angles of the three-dimensional track points;
clustering the two-dimensional coordinate points according to the distance between the two-dimensional coordinate points and the direction angle of the two-dimensional coordinate points to obtain clustering points;
and performing curve fitting on each clustering point to generate a thumbnail of the target scene.
In a possible implementation manner of the first aspect, the acquiring the two-dimensional coordinate point and the direction angle of each three-dimensional track point specifically includes:
taking the projection point of the three-dimensional track point on a two-dimensional plane as a two-dimensional coordinate point of the three-dimensional track point;
and taking the direction of the line connecting the two-dimensional coordinate point and the two-dimensional coordinate point of the adjacent sampling moment as the direction angle of the two-dimensional coordinate point.
In another possible implementation manner of the first aspect, the two-dimensional plane is a plane closest to each of the three-dimensional track points.
In another possible implementation manner of the first aspect, the clustering, according to a distance between each two-dimensional coordinate point and a direction angle of each two-dimensional coordinate point, each two-dimensional coordinate point to obtain each clustering point specifically includes:
determining the distance between two adjacent two-dimensional coordinate points;
taking adjacent two-dimensional coordinate points whose cumulative distance is smaller than a preset clustering distance and whose direction angles are within a preset direction angle range as a two-dimensional point set;
and determining clustering points corresponding to the two-dimensional point sets and direction angles of the clustering points.
In another possible implementation manner of the first aspect, the determining the clustering points corresponding to each two-dimensional point set and the direction angles of the clustering points specifically includes:
and taking the central point of the two-dimensional point set as the clustering point, and taking the average value of the direction angles of all the two-dimensional coordinate points in the two-dimensional point set as the direction angle of the clustering point, wherein the distance between two adjacent clustering points is the preset clustering distance.
In another possible implementation manner of the first aspect, the determining the clustering points corresponding to each two-dimensional point set and the direction angles of the clustering points specifically includes:
and taking a first two-dimensional coordinate point in the two-dimensional point set as the clustering point, and taking the direction angle of the first two-dimensional coordinate point as the direction angle of the clustering point, wherein the distance between two adjacent clustering points is the preset clustering distance.
In another possible implementation manner of the first aspect, the performing curve fitting on each of the clustering points to generate a thumbnail of the target scene specifically includes:
grouping clustering points between two adjacent turning points into a fitting set, wherein the turning points are clustering points of which the difference value between the direction angle and the direction angle of the adjacent clustering points exceeds a preset value;
performing curve fitting on each fitting set to obtain a fitting curve corresponding to each fitting set;
and connecting the fitted curves to generate a thumbnail of the target scene.
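The grouping step in this implementation can be sketched as follows. This is only an illustration; the function name is hypothetical, the angles are assumed to be in radians, and the simplification that a turning point starts the next group is not taken from the patent text:

```python
def split_at_turning_points(cluster_angles, threshold):
    """Split a sequence of cluster-point direction angles into fitting
    sets: a new set starts at each turning point, i.e. at each point
    whose angle differs from its predecessor's by more than threshold."""
    groups, current = [], [0]
    for i in range(1, len(cluster_angles)):
        if abs(cluster_angles[i] - cluster_angles[i - 1]) > threshold:
            groups.append(current)  # close the set at the turning point
            current = [i]
        else:
            current.append(i)
    groups.append(current)
    return groups

# Two nearly straight runs joined by a sharp turn between index 2 and 3:
print(split_at_turning_points([0.0, 0.1, 0.05, 1.6, 1.7], 1.0))
# [[0, 1, 2], [3, 4]]
```

Each returned index group would then be curve-fitted separately, and the resulting fitted curves connected as described above.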
In another possible implementation manner of the first aspect, the performing curve fitting on each fitting set specifically includes:
performing straight line fitting on each fitting set;
or performing quadratic fitting on each fitting set.
In another possible implementation manner of the first aspect, before the connecting the fitted curves and generating the thumbnail of the target scene, the method further includes:
and combining fitted curves that are less than a preset distance apart and opposite in direction into one fitted curve.
In another possible implementation manner of the first aspect, the connecting the fitted curves to generate a thumbnail of the target scene specifically includes:
if the next clustering point of the target clustering points on the first fitting curve is not on the first fitting curve, determining a second fitting curve where the next clustering point is, wherein the first fitting curve is any one of the fitting curves;
determining a triangle formed by the target clustering point, the next clustering point and the intersection point of the first fitted curve and the second fitted curve;
and if no fitting curve exists in the triangle, performing fitting connection on the first fitting curve and the second fitting curve to generate the thumbnail of the target scene.
In a second aspect, an embodiment of the present invention provides an apparatus for generating a scene thumbnail, including:
the determining module is used for determining three-dimensional track points of each acquisition device in the target scene according to image data and motion data acquired by each acquisition device while moving in the target scene;
the acquisition module is used for acquiring two-dimensional coordinate points and direction angles of the three-dimensional track points;
the clustering module is used for clustering the two-dimensional coordinate points according to the distance between the two-dimensional coordinate points and the direction angle of the two-dimensional coordinate points to obtain clustering points;
and the generating module is used for performing curve fitting on each clustering point to generate a thumbnail of the target scene.
In a possible implementation manner of the second aspect, the obtaining module is specifically configured to take the projection point of the three-dimensional track point on the two-dimensional plane as the two-dimensional coordinate point of the three-dimensional track point, and take the direction of the line connecting the two-dimensional coordinate point and the two-dimensional coordinate point of the adjacent sampling moment as the direction angle of the two-dimensional coordinate point.
In another possible implementation manner of the second aspect, the two-dimensional plane is a plane closest to each of the three-dimensional track points.
In another possible implementation manner of the second aspect, the clustering module includes:
a determining unit configured to determine a distance between two adjacent two-dimensional coordinate points;
the clustering unit is used for taking adjacent two-dimensional coordinate points whose cumulative distance is smaller than a preset clustering distance and whose direction angles are within a preset direction angle range as a two-dimensional point set;
the determining unit is further configured to determine a clustering point corresponding to each two-dimensional point set and a direction angle of each clustering point.
In another possible implementation manner of the second aspect, the determining unit is specifically configured to use a central point of the two-dimensional point set as the clustering point, and use an average value of direction angles of each two-dimensional coordinate point in the two-dimensional point set as a direction angle of the clustering point, where a distance between two adjacent clustering points is the preset clustering distance.
In another possible implementation manner of the second aspect, the determining unit is further specifically configured to use a first two-dimensional coordinate point in the two-dimensional point set as the clustering point, and use an orientation angle of the first two-dimensional coordinate point as an orientation angle of the clustering point, where a distance between two adjacent clustering points is the preset clustering distance.
In another possible implementation manner of the second aspect, the generating module includes:
the classifying unit is used for grouping the clustering points between two adjacent turning points into a fitting set, wherein a turning point is a clustering point whose direction angle differs from that of an adjacent clustering point by more than a preset value;
the fitting unit is used for performing curve fitting on each fitting set to obtain a fitting curve corresponding to each fitting set;
and the connecting unit is used for connecting the fitting curves to generate a thumbnail of the target scene.
In another possible implementation manner of the second aspect, the fitting unit is specifically configured to perform straight line fitting on each fitting set;
or performing quadratic fitting on each fitting set.
In another possible implementation manner of the second aspect, the fitting unit is further configured to combine fitting curves that are less than a preset distance apart and have opposite directions into one fitting curve.
In another possible implementation manner of the second aspect, the connection unit is specifically configured to determine, if a next clustering point of the target clustering points on the first fitted curve is not on the first fitted curve, a second fitted curve where the next clustering point is located, and determine a triangle formed by the target clustering point, the next clustering point, and an intersection point of the first fitted curve and the second fitted curve; and if no fitting curve exists in the triangle, performing fitting connection on the first fitting curve and the second fitting curve to generate the thumbnail of the target scene, wherein the first fitting curve is any one of the fitting curves.
In a third aspect, an embodiment of the present invention provides an electronic device, including:
a memory for storing a computer program;
a processor configured to execute the computer program to implement the method for generating a scene thumbnail described in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer storage medium, where a computer program is stored, and the computer program, when executed, implements the method for generating a scene thumbnail described in the first aspect.
The embodiment of the invention has the following beneficial effects:
determining three-dimensional track points of each acquisition device in a target scene according to image data and motion data acquired by each acquisition device while moving in the target scene; acquiring two-dimensional coordinate points and direction angles of the three-dimensional track points; clustering the two-dimensional coordinate points according to the distances between them and their direction angles to obtain clustering points; and performing curve fitting on the clustering points to generate a thumbnail of the target scene. Therefore, for target scenes that are inconvenient to photograph from the air or are not covered by GPS, a thumbnail can be generated with the method of this embodiment; the method is simple, the computational load is small, and the thumbnail can be generated rapidly.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart of a method for generating a scene thumbnail according to an embodiment of the present invention;
fig. 2 is a two-dimensional coordinate point distribution diagram according to an embodiment of the present invention;
FIG. 3 is a diagram of a distribution of cluster points according to an embodiment of the present invention;
FIG. 4 is a thumbnail of a target scene according to an embodiment of the present invention;
fig. 5 is a flowchart of a method for generating a scene thumbnail according to a second embodiment of the present invention;
FIG. 6 is a distribution diagram of clustering points according to a second embodiment of the present invention;
fig. 7 is a flowchart of a method for generating a scene thumbnail according to a third embodiment of the present invention;
FIG. 8 is a schematic diagram of a fitting curve according to a third embodiment of the present invention;
FIG. 9 is another schematic diagram of a fitting curve according to a third embodiment of the present invention;
FIG. 10 is a thumbnail of a target scene according to a third embodiment of the present invention;
fig. 11 is a schematic structural diagram of a scene thumbnail generation apparatus according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of a scene thumbnail generation apparatus according to a second embodiment of the present invention;
fig. 13 is a schematic structural diagram of a scene thumbnail generation apparatus according to a third embodiment of the present invention;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The method is suitable for the fields needing to acquire the thumbnail, such as map planning, road topology, object outline simplification and the like.
According to the technical scheme, image data and motion data are acquired with the camera and sensors of the acquisition device; three-dimensional track points of each acquisition device in the target scene are generated from the acquired image data and motion data; the three-dimensional track points are converted into two-dimensional coordinate points; and the two-dimensional coordinate points are clustered and curve-fitted to generate the thumbnail of the target scene. Therefore, for target scenes that are inconvenient to photograph from the air or are not covered by GPS, the thumbnail can be generated with this method; the method is simple, the computational load is small, and the thumbnail can be generated rapidly.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 1 is a flowchart of a method for generating a scene thumbnail according to an embodiment of the present invention, as shown in fig. 1, the method of this embodiment may include:
s101, determining three-dimensional track points of each sampling device in a target scene according to image data acquired by each acquisition device when the target scene moves.
The execution subject of the embodiment is any electronic device with data processing capability, and the electronic device at least comprises a memory and a processor. The electronic device may be a separate device, for example, a thumbnail image generation apparatus, and optionally, the electronic device may be integrated in other products, for example, a vehicle, as a part of the vehicle, for example, an on-board apparatus on the vehicle.
Optionally, the electronic device may further have a display function, and may display a thumbnail of the finally generated target scene. For example, the electronic device is a vehicle-mounted device with a display function, so that a user in a target scene can know the thumbnail of the target scene through the vehicle-mounted device, the user can drive the vehicle conveniently, and the driving experience of the user is improved. Meanwhile, the method of the embodiment is convenient for realizing unmanned driving.
Optionally, the electronic device of this embodiment may also be a network processor, such as a cloud processor.
The acquisition device of this embodiment may be a vehicle, a robot, or the like, on which a camera and sensors are installed. The sensors mainly include an inertial navigation device, an odometer, and the like, and the camera may be a monocular camera, a binocular camera, or a multi-camera rig.
The acquisition device of this embodiment may be a vehicle or robot that has signed a crowdsourcing agreement with a reward mechanism: when the acquisition device completes an acquisition task, it receives a corresponding reward. This encourages more acquisition devices to join the acquisition task.
In the same target area, a plurality of acquisition devices can be controlled to acquire images, so that a large amount of image data and motion data can be formed.
In the actual acquisition process, the acquisition equipment shoots a target scene by using a camera installed on the acquisition equipment, generates image data of the target scene, and acquires the motion data of the acquisition equipment by using a speed sensor and an inertial navigation sensor installed on the acquisition equipment. The motion data includes mileage information of the acquisition device, speed information and steering information of the acquisition device, and optionally, the motion data further includes acceleration information, angular velocity information, compass information and the like provided by the inertial navigation device.
The image data and the motion data use the same time stamp, or a hardware trigger mechanism is used to ensure the time synchronization of the image data and the motion data.
The acquisition device of this embodiment may be connected to the electronic device in a Wireless communication manner, for example, through a Wireless communication connection such as 3G/4G cellular communication, WIFI (Wireless Fidelity) device, or bluetooth device. Optionally, the acquisition device may be in wired connection with the electronic device, for example, the electronic device is a vehicle-mounted device, and the vehicle-mounted device is connected with a camera and a sensor on the vehicle through a vehicle bus.
In this way, the acquisition device may send the acquired image data and the acquired motion data to the electronic device, for example, the acquisition device transmits the acquired image data and the acquired motion data to the electronic device in real time, or the acquisition device caches the acquired image data and the acquired motion data locally, and sends the cached image data and the cached motion data to the electronic device after the acquisition is finished.
Next, using an existing mapping method, for example, SLAM (Simultaneous Localization and Mapping), SfM (Structure from Motion), or multi-sensor fusion, the image data and motion data acquired by each acquisition device are processed to generate the position points (i.e., three-dimensional coordinate points) of each acquisition device in the target scene at different times, and these position points are used as the three-dimensional track points of each acquisition device in the target scene.
In this method, the image data of the target scene are acquired with the camera and sensors of the acquisition device; compared with acquiring image data by remote sensing or aerial photography in the prior art, the cost is low. Meanwhile, for places where remote sensing or aerial photography is inconvenient, such as underground parking lots, or places not covered by GPS, the method of this embodiment can still obtain thumbnails effectively; that is, the method has a wide application range and a simple process.
And S102, acquiring two-dimensional coordinate points and direction angles of the three-dimensional track points.
The thumbnail of the present embodiment is a two-dimensional projection of the target scene; therefore, each three-dimensional track point obtained in the above steps needs to be converted into a two-dimensional coordinate point. For example, for a three-dimensional track point a(x1, y1, z1), its two-dimensional coordinate point on the XY plane is a1(x1, y1), on the XZ plane a2(x1, z1), and on the YZ plane a3(y1, z1).
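The axis-plane projection described here can be illustrated with a short sketch (the function name is hypothetical; only the coordinate selection follows the text):

```python
def project_to_plane(point_3d, plane="xy"):
    """Project a 3D track point onto one of the axis-aligned 2D planes
    by dropping the coordinate orthogonal to that plane."""
    x, y, z = point_3d
    if plane == "xy":
        return (x, y)
    if plane == "xz":
        return (x, z)
    if plane == "yz":
        return (y, z)
    raise ValueError(f"unknown plane: {plane}")

a = (1.0, 2.0, 3.0)
print(project_to_plane(a, "xy"))  # (1.0, 2.0)
print(project_to_plane(a, "xz"))  # (1.0, 3.0)
print(project_to_plane(a, "yz"))  # (2.0, 3.0)
```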
The three-dimensional track points have sequence, namely the three-dimensional track points not only contain position information, but also contain direction information. For example, in a vehicle-mounted forward-looking system, each calculated three-dimensional track point represents the position of the vehicle in the target scene at different moments, and the sequence of the three-dimensional track points represents the movement direction of the vehicle in the target scene.
In one example, a projection point of each three-dimensional track point on a two-dimensional plane is used as a two-dimensional coordinate point of the three-dimensional track point;
and the direction of the line connecting the two-dimensional coordinate point and the two-dimensional coordinate point of the adjacent sampling moment is taken as the direction angle of the two-dimensional coordinate point.
The two-dimensional plane of the present embodiment may be any plane.
Optionally, the two-dimensional plane is a horizontal plane, such as the ground.
Optionally, the two-dimensional plane is the plane on which the three-dimensional track points are most densely distributed, that is, the plane closest to the three-dimensional track points.
Generally, most of the three-dimensional track points are distributed near a horizontal plane such as the ground or a desktop; therefore, preferably, the two-dimensional plane of this embodiment is the horizontal plane closest to the three-dimensional track points.
The embodiment does not limit the position of the two-dimensional plane, and the position is determined according to actual needs.
After the two-dimensional plane is determined as described above, the three-dimensional track points are projected onto it, so that they form a two-dimensional projection on the plane; the projection of each three-dimensional track point on the two-dimensional plane is taken as its two-dimensional coordinate point, as shown in fig. 2.
In this embodiment, the electronic device can obtain the sampling time of each three-dimensional track point; thus, the direction of the line connecting the current two-dimensional coordinate point and the two-dimensional coordinate point of the adjacent sampling time can be taken as the direction angle of the current two-dimensional coordinate point.
For example, if the current two-dimensional coordinate point is b1 and the two-dimensional coordinate point corresponding to the next sampling time is b2, the direction of the line from b1 to b2 is taken as the direction angle of the two-dimensional coordinate point b1.
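A minimal sketch of the direction-angle computation (the function name is hypothetical, and the angle is assumed to be measured in radians via `math.atan2`):

```python
import math

def direction_angle(p_current, p_next):
    """Direction angle (radians) of the line from the current 2D point
    to the 2D point at the adjacent (next) sampling time."""
    dx = p_next[0] - p_current[0]
    dy = p_next[1] - p_current[1]
    return math.atan2(dy, dx)

b1, b2 = (0.0, 0.0), (1.0, 1.0)
print(direction_angle(b1, b2))  # π/4
```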
S103, clustering the two-dimensional coordinate points according to the distance between the two-dimensional coordinate points and the direction angle of the two-dimensional coordinate points to obtain the clustering points.
Specifically, the two-dimensional coordinate points generated in the above steps are clustered to generate clustering points as shown in fig. 3.
In one example, the two-dimensional coordinate points are sorted according to the distances and directions between them, and every preset number of adjacent two-dimensional coordinate points is merged into one clustering point.
Optionally, according to a preset distance (or preset interval), one of the sorted two-dimensional coordinate points may be taken as a clustering point every preset distance (or interval).
In another example, two-dimensional coordinate points that are within a preset distance of each other and whose directions are the same or 180 degrees opposite are merged into one clustering point.
In yet another example, the two-dimensional coordinate points are divided into groups using a preset window length, and the two-dimensional coordinate points in each group are merged into one clustering point.
Optionally, other methods may also be adopted in this embodiment to cluster the two-dimensional coordinate points to obtain the cluster points.
In this embodiment, because a large number of three-dimensional track points are collected, the data volume of their projected two-dimensional coordinate points is also large. Therefore, in order to reduce the computational burden and improve the data processing speed, the two-dimensional coordinate points are clustered and multiple two-dimensional coordinate points are simplified into one clustering point, which greatly reduces the data volume and facilitates the subsequent curve fitting.
And S104, performing curve fitting on each clustering point to generate a thumbnail of the target scene.
Specifically, after the clustering points shown in fig. 3 are obtained according to the above steps, curve fitting is performed on each clustering point to generate a thumbnail of the target scene shown in fig. 4.
Alternatively, a straight line fitting method, such as a unary linear regression algorithm or a least square method, may be used to fit each clustering point to obtain a fitting curve of each clustering point, and these fitting curves constitute the thumbnail of the target scene.
Optionally, a curve fitting method, such as a least square method, a moving least square method, a cubic spline method, or a lagrange interpolation method, may also be used to fit each clustering point to obtain a fitting curve of each clustering point, and these fitting curves constitute a thumbnail of the target scene.
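A least-squares straight-line fit of the kind listed above can be written in closed form. The sketch below is illustrative only; the patent does not fix a particular implementation, and the function name is an assumption:

```python
def fit_line(points):
    """Ordinary least-squares fit of y = a*x + b over one set of
    cluster points, using the closed-form normal equations."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    # Slope and intercept from the normal equations.
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b
```

Fitting the cluster points of each road segment with such a line (or, for curved segments, a low-order polynomial) yields the fitted curves that together form the thumbnail.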
For example, the target scene in this embodiment is a parking lot, and the thumbnails formed in this embodiment may be roads, parking spaces, and the like of the parking lot.
According to the method, curve fitting is carried out on each clustering point to generate the thumbnail of the target scene, the whole process is simple, the processing speed is high, and the calculation result is accurate.
According to the method for generating a scene thumbnail provided by the embodiment of the invention, three-dimensional track points of each sampling device in a target scene are determined according to image data and motion data acquired by each acquisition device while moving in the target scene; two-dimensional coordinate points and direction angles of the three-dimensional track points are acquired; the two-dimensional coordinate points are clustered according to the distances between them and their direction angles to obtain the clustering points; and curve fitting is performed on each clustering point to generate the thumbnail of the target scene. Therefore, for target scenes that are inconvenient to photograph from the air or are not covered by GPS, a thumbnail can still be generated using the method of this embodiment; the method is simple, the amount of computation is small, and the thumbnail can be generated rapidly.
Fig. 5 is a flowchart of a method for generating a scene thumbnail according to a second embodiment of the present invention. Based on the foregoing embodiment, the present embodiment relates to a specific process of clustering each two-dimensional coordinate point according to a distance between each two-dimensional coordinate point and a direction angle of each two-dimensional coordinate point to obtain each clustering point. As shown in fig. 5, the present embodiment may include:
s201, determining the distance between two adjacent two-dimensional coordinate points.
The present embodiment can calculate the distance between two adjacent two-dimensional coordinate points by the euclidean distance.
For example, if two adjacent two-dimensional coordinate points are b1 = (x1, y1) and b2 = (x2, y2), the distance between b1 and b2 is d = √((x2 − x1)² + (y2 − y1)²).
Thus, the distance between adjacent two-dimensional coordinate points can be obtained according to the above method.
S202, taking each adjacent two-dimensional coordinate point with the distance sum smaller than the preset clustering distance and the direction angle error within the preset direction angle error range as a two-dimensional point set.
S203, determining clustering points corresponding to the two-dimensional point sets and direction angles of the clustering points.
For example, assume that the preset clustering distance is dk, the preset direction angle error range is ±1°, and the two-dimensional coordinate points are arranged in order of sampling time: b1, b2, b3 … bn. Starting from the first two-dimensional coordinate point b1: the distance between b1 and the second two-dimensional sampling point b2 is d1, where d1 < dk, and the direction angle is 80°; the distance between b2 and the third two-dimensional sampling point b3 is d2, where (d1 + d2) < dk, and the direction angle is 80.5°, with 80.5° − 80° = 0.5° < 1°; the distance between b3 and the fourth two-dimensional sampling point b4 is d3, where (d1 + d2 + d3) > dk, and the direction angle is 80.4°, with |80.4° − 80.5°| = 0.1° < 1°.
As can be seen from the above, the sum (d1+ d2+ d3) of the distances between the adjacent two-dimensional coordinate points B1, B2 and B3 is smaller than the preset clustering distance dk, and the errors of the direction angles of B1, B2 and B3 are within ± 1 ° of the preset direction angle error, so that the two-dimensional coordinate points B1, B2 and B3 can be regarded as a two-dimensional point set B.
Then, the clustering points of the two-dimensional point set and the direction angles of the clustering points are determined.
Then, the clustering points are used as starting points, the preset clustering distance dk is used as a distance, and another clustering point is inserted into the subsequent two-dimensional coordinate points, so that the distance between two adjacent clustering points is the same and is equal to the preset clustering distance dk.
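The walkthrough above (accumulating inter-point distances until the preset clustering distance dk would be exceeded, while the direction angles stay within the tolerance, then emitting one cluster point per group) can be sketched as follows. The function names are assumptions, and taking the centroid with the mean direction angle corresponds to only one of the variants described below:

```python
import math

def cluster_points(points, angles, dk, angle_tol):
    """Group consecutive 2-D points whose accumulated spacing stays below
    dk and whose direction angles stay within angle_tol of the group's
    first angle; each group becomes one cluster point."""
    clusters = []
    group = [0]   # indices in the current two-dimensional point set
    acc = 0.0     # accumulated distance within the current group
    for i in range(1, len(points)):
        step = math.dist(points[i - 1], points[i])
        if acc + step < dk and abs(angles[i] - angles[group[0]]) <= angle_tol:
            acc += step
            group.append(i)
        else:
            clusters.append(_emit(points, angles, group))
            group, acc = [i], 0.0
    clusters.append(_emit(points, angles, group))
    return clusters

def _emit(points, angles, group):
    """Collapse a point set into (centroid, mean direction angle)."""
    cx = sum(points[i][0] for i in group) / len(group)
    cy = sum(points[i][1] for i in group) / len(group)
    ca = sum(angles[i] for i in group) / len(group)
    return (cx, cy), ca
```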
Wherein, the above S203 may also be implemented according to the following manner:
in one example, any one of the two-dimensional coordinate points in the two-dimensional point set is taken as a clustering point of the two-dimensional point set, and meanwhile, a direction angle corresponding to the two-dimensional coordinate point is taken as a direction angle of the clustering point.
For example, with continued reference to the above example, any one of the two-dimensional coordinate points B3 in the two-dimensional point set B (B1, B2, B3) is taken as a cluster point of the two-dimensional point set B, and the direction angle (i.e., 80.4 °) corresponding to B3 is taken as the direction angle of the cluster point.
In another example, a central point of the two-dimensional point set is taken as the clustering point, and an average value of direction angles of each two-dimensional coordinate point in the two-dimensional point set is taken as the direction angle of the clustering point, wherein a distance between two adjacent clustering points is the preset clustering distance.
For example, with continued reference to the above example, the coordinate midpoint bm of three points in the two-dimensional point set B (B1, B2, B3) is defined as the clustering point of the two-dimensional point set B, and the average value of the direction angles of the three points B1, B2, B3 is defined as the direction angle of the clustering point.
Alternatively, the center point B2 in the two-dimensional point set B (B1, B2, B3) is set as the clustering point of the two-dimensional point set B, and the direction angle (i.e., 80.5 °) corresponding to B2 is set as the direction angle of the clustering point.
In another example, a first two-dimensional coordinate point in the two-dimensional point set is taken as the clustering point, and a direction angle of the first two-dimensional coordinate point is taken as a direction angle of the clustering point, wherein a distance between two adjacent clustering points is the preset clustering distance.
For example, with continued reference to the above example, the first two-dimensional coordinate point B1 in the two-dimensional point set B (B1, B2, B3) is taken as a cluster point of the two-dimensional point set B, and the direction angle (i.e., 80 °) corresponding to B1 is taken as the direction angle of the cluster point.
That is, the present embodiment can obtain the clustering point distribution map as shown in fig. 6 according to the above method.
According to the method for generating the scene thumbnail, provided by the embodiment of the invention, the distance between two adjacent two-dimensional coordinate points is determined, and the sum of the distances is smaller than the preset clustering distance, and each adjacent two-dimensional coordinate point of which the direction angle is within the preset direction angle range is used as a two-dimensional point set; and determining clustering points corresponding to the two-dimensional point sets and direction angles of the clustering points, and further realizing accurate clustering of the two-dimensional coordinate points.
Fig. 7 is a flowchart of a method for generating a scene thumbnail according to a third embodiment of the present invention. On the basis of the foregoing embodiment, the present embodiment relates to a specific process of performing curve fitting on each of the clustering points to generate a thumbnail of the target scene. As shown in fig. 7, the present embodiment may include:
s301, grouping the clustering points between two adjacent turning points into one fitting set, wherein a turning point is a clustering point whose direction angle differs from the direction angle of an adjacent clustering point by more than a preset value.
Specifically, as shown in fig. 6, the direction angle of a cluster point located at a corner differs greatly from the direction angles of its adjacent cluster points. Therefore, the cluster points can be traversed and their direction angles compared; if the direction angle of a certain cluster point changes greatly relative to the next adjacent cluster point, for example, if the difference between the direction angle of cluster point bk and that of the adjacent next cluster point bk+1 exceeds a preset value c, the cluster point bk can be determined as a turning point.
According to the method, the turning points in the clustering points can be obtained, and the turning points can be marked for convenience of identification.
At this time, directions of the clustering points between two adjacent turning points are substantially the same, and the clustering points between two adjacent turning points can be used as a fitting set.
Thus, according to the above method, each cluster point is divided into each fitting set, and a plurality of fitting sets are obtained.
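The division into fitting sets at turning points can be sketched as follows. This is illustrative only; the function name and the use of the shortest angular difference (so that, for example, 359° and 1° are treated as 2° apart) are assumptions:

```python
def split_at_turns(cluster_angles, threshold):
    """Split a sequence of cluster-point direction angles into fitting
    sets, cutting wherever the angle jumps by more than `threshold`
    degrees (the jump marks a turning point)."""
    sets, current = [], [0]
    for i in range(1, len(cluster_angles)):
        diff = abs(cluster_angles[i] - cluster_angles[i - 1]) % 360.0
        diff = min(diff, 360.0 - diff)  # shortest angular difference
        if diff > threshold:
            sets.append(current)
            current = [i]
        else:
            current.append(i)
    sets.append(current)
    return sets
```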
And S302, performing curve fitting on each fitting set to obtain a fitting curve corresponding to each fitting set.
Specifically, a curve fitting, for example, a straight line fitting or a curve fitting is performed on each clustering point in each fitting set by using one fitting set as a unit, so as to generate a fitting curve corresponding to each fitting set.
Optionally, if the error of the fitted curve is too large, the curve may be segmented and then fitted separately.
Fig. 8 is a plurality of fitting curves generated by fitting straight lines to the cluster points shown in fig. 6.
In a possible implementation manner of this embodiment, fitted curves that are less than a preset distance apart and have opposite directions are combined into one fitted curve.
Specifically, as shown in fig. 8, since the range of the direction angle of a point is [0°, 360°), cluster points with direction angles a (a < 180°) and a + 180° are treated as two different directions during curve fitting. On an actual road, a and a + 180° can be regarded as the two driving directions of the same road, so fitted curves that are close to each other and whose directions differ by 180° can be combined into one curve.
As shown in fig. 8, the distance between the fitting curve 1 and the fitting curve 2 is smaller than the preset distance, and the direction angles of the fitting curve 1 and the fitting curve 2 are opposite, so that the fitting curve 1 and the fitting curve 2 can be combined to generate the fitting curve shown in fig. 9.
The fitting curve 1 and the fitting curve 2 are combined, specifically, each clustering point on the fitting curve 1 and each clustering point on the fitting curve 2 are re-fitted to generate a fitting curve.
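The merge criterion described above (curves closer than a preset distance, with direction angles about 180° apart) can be sketched as a small predicate. The function name and the ±5° default tolerance are assumptions for illustration:

```python
def should_merge(angle1, angle2, gap, max_gap, angle_tol=5.0):
    """Two fitted curves are merge candidates when they are closer than
    max_gap and their direction angles differ by roughly 180 degrees
    (the two driving directions of the same road)."""
    diff = abs(angle1 - angle2) % 360.0
    return gap < max_gap and abs(diff - 180.0) <= angle_tol
```

When the predicate holds, the cluster points of both curves are pooled and refitted into a single curve, as described above.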
That is, the method of the present embodiment may be used to detect bidirectional lane information.
And S303, connecting the fitting curves to generate a thumbnail of the target scene.
Specifically, according to the above steps, fitting curves as shown in fig. 9 can be obtained, and then, these fitting curves are connected to generate a thumbnail of the target scene as shown in fig. 10.
For example, the fitted curves in fig. 9 are extended and intersected to form the thumbnail image shown in fig. 10.
In one example, the curve connection may be made according to the following:
s3031, if the next clustering point of the target clustering points on the first fitted curve is not on the first fitted curve, determining a second fitted curve where the next clustering point is located, wherein the first fitted curve is any one of the fitted curves.
S3032, determining a triangle formed by the target clustering point, the next clustering point and the intersection point of the first fitted curve and the second fitted curve.
And S3033, if no fitting curve exists in the triangle, performing fitting connection on the first fitting curve and the second fitting curve to generate the thumbnail of the target scene.
For example, one of the fitted curves shown in fig. 9 is selected as the first fitted curve L1, and each clustering point on L1 is traversed in turn. If the next clustering point p2 of a certain clustering point p1 on L1 (p1 is designated as the target clustering point) is not on L1, the fitted curve on which p2 is located is queried and taken as the second fitted curve L2.
Next, the intersection point p of the first fitted curve L1 and the second fitted curve L2 that lies near p1 and p2 is determined.
The triangle Δp1p2p formed by p1, p2 and p is then determined, and it is judged whether any other fitted curve exists within Δp1p2p. If not, L1 and L2 are connected by curve fitting based on p1, p2 and the directions of p1 and p2. If another fitted curve exists within Δp1p2p, a connecting curve already exists between p1 and p2 and no new connection is needed.
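The emptiness test on the triangle Δp1p2p can be sketched with a standard point-in-triangle sign test. This is an illustrative sketch, not the patent's implementation; the function names are assumptions, and `other_points` stands for the cluster points of all fitted curves other than L1 and L2:

```python
def _cross(o, a, b):
    """2-D cross product of vectors o->a and o->b (signed area x2)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def triangle_is_clear(p1, p2, p, other_points):
    """True when no point of any other fitted curve lies inside the
    triangle p1-p2-p, i.e. the two curves may be joined directly."""
    for q in other_points:
        d1 = _cross(p1, p2, q)
        d2 = _cross(p2, p, q)
        d3 = _cross(p, p1, q)
        has_neg = d1 < 0 or d2 < 0 or d3 < 0
        has_pos = d1 > 0 or d2 > 0 or d3 > 0
        # If q never changes sign, it is inside or on the triangle.
        if not (has_neg and has_pos):
            return False
    return True
```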
According to the method for generating the scene thumbnail, provided by the embodiment of the invention, each clustering point between two adjacent turning points is used as a fitting set, curve fitting is carried out on each fitting set to obtain a fitting curve corresponding to each fitting set, each fitting curve is connected to generate the thumbnail of the target scene, and then the accurate generation of the thumbnail of the target scene is realized.
Fig. 11 is a schematic structural diagram of a scene thumbnail generation apparatus according to a first embodiment of the present invention, and as shown in fig. 11, the scene thumbnail generation apparatus 100 according to the present embodiment may include:
the determining module 110 is configured to determine three-dimensional track points of each sampling device in a target scene according to image data and motion data acquired by each acquisition device when the target scene moves;
the obtaining module 120 is configured to obtain a two-dimensional coordinate point and a direction angle of each three-dimensional track point;
a clustering module 130, configured to cluster the two-dimensional coordinate points according to a distance between the two-dimensional coordinate points and a direction angle of each two-dimensional coordinate point, so as to obtain each cluster point;
and a generating module 140, configured to perform curve fitting on each clustering point to generate a thumbnail of the target scene.
In a possible implementation manner of this embodiment, the acquiring the two-dimensional coordinate point and the direction angle of each three-dimensional track point specifically includes:
the obtaining module 120 is specifically configured to use a projection point of the three-dimensional track point on the two-dimensional plane as a two-dimensional coordinate point of the three-dimensional track point; and taking the connecting line direction of the two-dimensional coordinate point and the two-dimensional coordinate point close to the sampling moment as the direction angle of the two-dimensional coordinate point.
In another possible implementation manner of this embodiment, the two-dimensional plane is a plane closest to each of the three-dimensional track points.
Fig. 12 is a schematic structural diagram of a scene thumbnail generation apparatus according to a second embodiment of the present invention, as shown in fig. 12, based on the second embodiment, the clustering module 130 includes:
a determining unit 131 for determining a distance between two adjacent two-dimensional coordinate points;
a clustering unit 132, configured to use each adjacent two-dimensional coordinate point, where the sum of the distances is smaller than a preset clustering distance and the direction angle is within a preset direction angle range, as a two-dimensional point set;
the determining unit 132 is further configured to determine a clustering point corresponding to each two-dimensional point set and a direction angle of each clustering point.
In a possible implementation manner of this embodiment, the determining unit 131 is specifically configured to use a central point of the two-dimensional point set as the clustering point, and use an average value of direction angles of each two-dimensional coordinate point in the two-dimensional point set as the direction angle of the clustering point, where a distance between two adjacent clustering points is the preset clustering distance.
In another possible implementation manner of this embodiment, the determining unit 131 is further specifically configured to use a first two-dimensional coordinate point in the two-dimensional point set as the clustering point, and use a direction angle of the first two-dimensional coordinate point as a direction angle of the clustering point, where a distance between two adjacent clustering points is the preset clustering distance.
Fig. 13 is a schematic structural diagram of a scene thumbnail generation apparatus according to a third embodiment of the present invention, and as shown in fig. 13, on the basis of the third embodiment, the generation module 140 includes:
the classifying unit 141 is configured to classify each clustering point between two adjacent turning points into a fitting set, where the turning point is a clustering point in each clustering point, where a difference between the direction angle and a direction angle of an adjacent clustering point exceeds a preset value;
a fitting unit 142, configured to perform curve fitting on each fitting set to obtain a fitting curve corresponding to each fitting set;
a connecting unit 143, configured to connect the fitted curves to generate a thumbnail of the target scene.
In a possible implementation manner of this embodiment, the fitting unit 142 is specifically configured to perform straight line fitting on each fitting set; or performing quadratic fitting on each fitting set.
In another possible implementation manner of this embodiment, the fitting unit 142 is further configured to combine fitting curves that are less than a preset distance apart and have opposite directions into one fitting curve.
In another possible implementation manner of this embodiment, the connecting unit 143 is specifically configured to determine, if a next clustering point of the target clustering points on the first fitted curve is not on the first fitted curve, a second fitted curve where the next clustering point is located, and determine a triangle formed by the target clustering point, the next clustering point, and an intersection point of the first fitted curve and the second fitted curve; and if no fitting curve exists in the triangle, performing fitting connection on the first fitting curve and the second fitting curve to generate the thumbnail of the target scene, wherein the first fitting curve is any one of the fitting curves.
It should be noted that: the scene thumbnail generation device provided in the above embodiments is only illustrated by the division of the above functional modules when performing the scene thumbnail generation process, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. In addition, the scene thumbnail generation apparatus provided in the foregoing embodiment and the scene thumbnail generation apparatus embodiment belong to the same concept, and a specific implementation process thereof is detailed in the method embodiment and is not described herein again.
Fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, and as shown in fig. 14, an electronic device 200 according to the embodiment includes:
a memory 210 for storing a computer program;
the processor 220 is configured to execute the computer program to implement the method for generating a scene thumbnail, which has similar implementation principles and technical effects and is not described herein again.
Further, when at least a part of the functions of the method for generating a scene thumbnail in the embodiment of the present invention are implemented by software, the embodiment of the present invention further provides a computer storage medium for storing computer software instructions for generating the scene thumbnail, which, when executed on a computer, enable the computer to execute various possible methods for generating a scene thumbnail in the embodiment of the method. The processes or functions described in accordance with the embodiments of the present invention may be generated in whole or in part when the computer-executable instructions are loaded and executed on a computer. The computer instructions may be stored on a computer storage medium or transmitted from one computer storage medium to another via wireless (e.g., cellular, infrared, short-range wireless, microwave, etc.) to another website site, computer, server, or data center. The computer storage media may be any available media that can be accessed by a computer or a data storage device, such as a server, data center, etc., that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., SSD), among others.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A method for generating a scene thumbnail is characterized by comprising the following steps:
determining three-dimensional track points of each sampling device in a target scene according to image data and motion data acquired by each acquisition device while moving in the target scene;
acquiring two-dimensional coordinate points and direction angles of the three-dimensional track points;
clustering the two-dimensional coordinate points according to the distance between the two-dimensional coordinate points and the direction angle of the two-dimensional coordinate points to obtain clustering points;
grouping clustering points between two adjacent turning points into a fitting set, wherein the turning points are clustering points of which the difference value between a direction angle and a direction angle of an adjacent clustering point exceeds a preset value;
performing curve fitting on each fitting set to obtain a fitting curve corresponding to each fitting set;
if the next clustering point of the target clustering points on the first fitting curve is not on the first fitting curve, determining a second fitting curve where the next clustering point is, wherein the first fitting curve is any one of the fitting curves;
determining a triangle formed by the target clustering point, the next clustering point and the intersection point of the first fitted curve and the second fitted curve;
and if no fitting curve exists in the triangle, performing fitting connection on the first fitting curve and the second fitting curve to generate the thumbnail of the target scene.
2. The method according to claim 1, wherein the obtaining of the two-dimensional coordinate points and the direction angles of the three-dimensional track points specifically comprises:
taking the projection point of the three-dimensional track point on a two-dimensional plane as a two-dimensional coordinate point of the three-dimensional track point;
and taking the connecting line direction of the two-dimensional coordinate point and the two-dimensional coordinate point adjacent to the sampling moment as the direction angle of the two-dimensional coordinate point.
3. The method of claim 2, wherein the two-dimensional plane is the plane closest to each of the three-dimensional trajectory points.
4. The method according to claim 3, wherein the clustering the two-dimensional coordinate points according to the distance between the two-dimensional coordinate points and the direction angle of the two-dimensional coordinate points to obtain the clustering points specifically comprises:
determining the distance between two adjacent two-dimensional coordinate points;
taking each adjacent two-dimensional coordinate point with the distance sum smaller than the preset clustering distance and the direction angle within the preset direction angle range as a two-dimensional point set;
and determining the clustering points corresponding to the two-dimensional point sets and the direction angles of the clustering points.
5. The method according to claim 4, wherein the determining of the clustering points corresponding to each two-dimensional point set and the direction angles of each clustering point specifically comprises:
and taking the central point of the two-dimensional point set as the clustering point, and taking the average value of the direction angles of all two-dimensional coordinate points in the two-dimensional point set as the direction angle of the clustering point, wherein the distance between two adjacent clustering points is the preset clustering distance.
6. The method according to claim 1, wherein the curve fitting each of the fitting sets specifically comprises:
performing straight line fitting on each fitting set;
or performing quadratic fitting on each fitting set.
7. The method of claim 6, wherein before connecting the fitted curves to generate the thumbnail of the target scene, the method further comprises:
and combining the fitted curves which are less than the preset distance and opposite in direction into one fitted curve.
8. An apparatus for generating a scene thumbnail, comprising:
the determining module is used for determining three-dimensional track points of each sampling device in the target scene according to image data and motion data acquired by each acquisition device while moving in the target scene;
the acquisition module is used for acquiring two-dimensional coordinate points and direction angles of the three-dimensional track points;
the clustering module is used for clustering the two-dimensional coordinate points according to the distance between the two-dimensional coordinate points and the direction angle of the two-dimensional coordinate points to obtain clustering points;
a generation module comprising:
the classifying unit is used for classifying all clustering points between two adjacent turning points into a fitting set, wherein the turning points are clustering points of all clustering points, and the difference value between the direction angle and the direction angle of the adjacent clustering points exceeds a preset value;
the fitting unit is used for performing curve fitting on each fitting set to obtain a fitting curve corresponding to each fitting set;
the connecting unit is used for determining a second fitted curve where a next clustering point of a target clustering point on a first fitted curve is located and determining a triangle formed by the target clustering point, the next clustering point and an intersection point of the first fitted curve and the second fitted curve if the next clustering point of the target clustering point on the first fitted curve is not located on the first fitted curve; and if no fitting curve exists in the triangle, performing fitting connection on the first fitting curve and the second fitting curve to generate the thumbnail of the target scene, wherein the first fitting curve is any one of the fitting curves.
9. An electronic device, comprising:
a memory for storing a computer program;
a processor for executing the computer program for implementing the method of generating a scene thumbnail as claimed in any of claims 1-7.
10. A computer storage medium, characterized in that the storage medium has stored therein a computer program which, when executed, implements a method of generating a scene thumbnail as recited in any of claims 1-7.
CN201810213001.2A 2018-03-15 2018-03-15 Scene thumbnail generation method and device, electronic equipment and storage medium Active CN108416044B (en)

Publications (2)

Publication Number Publication Date
CN108416044A CN108416044A (en) 2018-08-17
CN108416044B true CN108416044B (en) 2021-11-09

Family

ID=63131609

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810213001.2A Active CN108416044B (en) 2018-03-15 2018-03-15 Scene thumbnail generation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN108416044B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110060343B (en) * 2019-04-24 2023-06-20 阿波罗智能技术(北京)有限公司 Map construction method and system, server and computer readable medium
CN114972828A (en) * 2022-05-14 2022-08-30 上海贝特威自动化科技有限公司 Clustering-based two-dimensional coordinate row and column sorting method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105404844A (en) * 2014-09-12 2016-03-16 广州汽车集团股份有限公司 Road boundary detection method based on multi-line laser radar
CN105741355A (en) * 2016-02-01 2016-07-06 华侨大学 Block segmentation method for triangular grid model
CN105825510A (en) * 2016-03-17 2016-08-03 中南大学 Automatic matching method between point of interest and road network
CN105993034A (en) * 2014-02-13 2016-10-05 微软技术许可有限责任公司 Contour completion for augmenting surface reconstructions

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7764819B2 (en) * 2006-01-25 2010-07-27 Siemens Medical Solutions Usa, Inc. System and method for local pulmonary structure classification for computer-aided nodule detection
US10621214B2 (en) * 2015-10-15 2020-04-14 Verizon Patent And Licensing Inc. Systems and methods for database geocoding


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Boundary detection with a road model for occupancy grids in the curvilinear coordinate system using a downward-looking lidar sensor; Kim, Je, Seok, et al.; Proceedings of the Institution of Mechanical Engineers, Part D: Journal of Automobile Engineering; 2016-12-31 (No. 10); pp. 1351-1363 *
Research on contour extraction methods for equipment on overhead transmission lines; Li Hui et al.; Electronic Measurement Technology; 2018-02-28 (No. 4); pp. 87-92 *
Research on line feature extraction algorithms for low-contrast suburban images; Ke Xinqi et al.; Laser & Infrared; 2017-08-31; Vol. 47 (No. 8); pp. 1046-1050 *


Similar Documents

Publication Publication Date Title
CN108089572B (en) Method and device for vehicle positioning
CN110497901B (en) Parking space automatic searching method and system based on robot VSLAM technology
CN112567201B (en) Distance measuring method and device
CN111108342B (en) Visual range method and pair alignment for high definition map creation
EP3361278B1 (en) Autonomous vehicle localization based on walsh kernel projection technique
CN112640417B (en) Matching relation determining method and related device
CN110386142A (en) Pitch angle calibration method for automatic driving vehicle
WO2021184218A1 (en) Relative pose calibration method and related apparatus
CN110388931A (en) The two-dimentional bounding box of object is converted into the method for the three-dimensional position of automatic driving vehicle
US20210373161A1 (en) Lidar localization using 3d cnn network for solution inference in autonomous driving vehicles
WO2022104774A1 (en) Target detection method and apparatus
CN110345955A (en) Perception and planning cooperation frame for automatic Pilot
CN110073362A (en) System and method for lane markings detection
CN108387241A (en) Update the method and system of the positioning map of automatic driving vehicle
JP2019527832A (en) System and method for accurate localization and mapping
CN110096053A (en) Driving locus generation method, system and machine readable media for automatic driving vehicle
JP2021515178A (en) LIDAR positioning for time smoothing using RNN and LSTM in self-driving vehicles
WO2020264060A1 (en) Determining weights of points of a point cloud based on geometric features
CN113865580A (en) Map construction method and device, electronic equipment and computer readable storage medium
CN111652072A (en) Track acquisition method, track acquisition device, storage medium and electronic equipment
WO2022062480A1 (en) Positioning method and positioning apparatus of mobile device
EP4206731A1 (en) Target tracking method and device
CN113591518A (en) Image processing method, network training method and related equipment
CN108416044B (en) Scene thumbnail generation method and device, electronic equipment and storage medium
CN111833443A (en) Landmark position reconstruction in autonomous machine applications

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant