CN110672074A - Method and device for measuring distance of target object - Google Patents
- Publication number
- CN110672074A (publication) · CN201911015686.0A (application)
- Authority
- CN
- China
- Prior art keywords
- distance
- target object
- obtaining
- relative
- relative distance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
Abstract
The invention discloses a method for measuring the distance of a target object. The method obtains a first width of the target object in an image acquired by an image acquisition device; it then obtains the relevant state parameters of the target object at the previous moment and derives a first relative distance from them. Because the relevant state parameters truly reflect the state of the target object, the real distance between the two movable devices can be obtained. A second width of the target object is then obtained from the first relative distance, a relative difference value is obtained from the first and second widths, and the first relative distance is updated to a second relative distance according to that difference. In this way the width of the target object is obtained in two different ways, and the comparison of the two widths is used to constrain the relative distance between the two movable devices, making the measured distance more accurate and improving ranging accuracy.
Description
Technical Field
The application relates to the technical field of automatic driving, in particular to a method and a device for measuring a target object distance.
Background
With the continuous development of science and technology, automatic driving has advanced rapidly. An autonomous vehicle needs no driver on board; the whole driving process is controlled automatically by a computer.
One of the major concerns of autonomous-driving research is ranging, i.e., measuring the distance between a preceding autonomous device and the current autonomous device. Measured distances directly affect driving safety and driving efficiency. For example, if the distance between a front vehicle and a rear vehicle is measured inaccurately, the rear vehicle can easily collide with the front vehicle, causing a traffic accident and seriously compromising driving safety. Likewise, inaccurate distance measurements between leading and trailing unmanned aerial vehicles may cause accidents such as collisions and crashes.
Therefore, how to improve the ranging accuracy is a problem that needs to be solved urgently at present.
Disclosure of Invention
The present application is proposed to solve the above-mentioned technical problems.
According to an aspect of the present application, there is provided a method of measuring a target object distance, the method including: according to an image acquired by an image acquisition device, acquiring a first width of the target object in the image; acquiring relevant state parameters of the target object at the previous moment; obtaining a first relative distance according to the relevant state parameter of the target object at the previous moment, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment; obtaining a second width of the target object in the image according to the first relative distance; obtaining a relative difference value according to the first width and the second width; and updating the first relative distance to be a second relative distance according to the relative difference.
According to another aspect of the present application, there is provided an apparatus for measuring a target object distance, including:
the first obtaining module is used for obtaining a first width of the target object in the image according to the image obtained by the image acquisition device; the second obtaining module is used for obtaining the relevant state parameters of the target object at the previous moment; the first processing module is used for obtaining a first relative distance according to the relevant state parameter of the target object at the previous moment, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment; the second processing module is used for obtaining a second width of the target object in the image according to the first relative distance; the comparison module is used for obtaining a relative difference value according to the first width and the second width; and the adjusting module is used for updating the first relative distance into a second relative distance according to the relative difference.
According to still another aspect of the present application, there is provided an electronic apparatus including: a processor; and a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the method as described above.
According to yet another aspect of the application, there is provided a computer readable medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the method as described above.
Compared with the prior art, the method of the application obtains a first width of the target object in an image acquired by an image acquisition device; it then obtains the relevant state parameters of the target object at the previous moment and derives a first relative distance from them. Because the relevant state parameters truly reflect the state of the target object, the real distance between the two movable devices can be obtained. A second width of the target object in the image is then obtained from the first relative distance, a relative difference value is obtained from the first and second widths, and the first relative distance is updated to a second relative distance according to that difference. In this way the width of the target object is obtained in two different ways, and the comparison of the two widths is used to constrain the relative distance between the two movable devices, making the measured distance more accurate and improving ranging accuracy.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 is a flowchart illustrating a method for measuring a target object distance according to an exemplary embodiment of the present application.
Fig. 2 is a schematic view of an unmanned vehicle provided in another exemplary embodiment of the present application.
Fig. 3 is a schematic flow chart of obtaining a first relative distance according to an exemplary embodiment of the present application.
Fig. 4 is a diagram for determining a projection relationship in a current frame image according to an exemplary embodiment of the present application.
Fig. 5 is a flowchart of a method for adjusting a first relative distance to a second relative distance according to a relative difference according to an exemplary embodiment of the present application.
Fig. 6 is a schematic diagram of an apparatus for measuring a target object distance according to an exemplary embodiment of the present application.
Fig. 7 is an exemplary block diagram of the first processing module 630 according to an exemplary embodiment of the present application.
Fig. 8 is a block diagram of an example of an adjustment module 660 provided in an example embodiment of the present application.
Fig. 9 is a block diagram of an electronic device provided in an exemplary embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein.
Summary of the application
Existing ranging approaches are generally classified by the number of image acquisition devices used into monocular ranging, binocular ranging, trinocular ranging, and so on. For example, in monocular ranging a single image acquisition device mounted on the rear movable device measures the distance between that rear device and the preceding movable device (also referred to herein as the "target object").
In monocular ranging, the accuracy of the measured distance directly affects driving safety and driving efficiency; an inaccurate measurement seriously compromises driving safety.
In view of the above problems, the present application studies how to improve the accuracy of monocular ranging. To this end, it proposes a method for measuring the distance of a target object: a first width of the target object is obtained from an image acquired by an image acquisition device; the relevant state parameters of the target object at the previous moment are then obtained, and a first relative distance is derived from them. Because the relevant state parameters truly reflect the state of the target object, the real distance between the two movable devices can be obtained. A second width of the target object in the image is then obtained from the first relative distance, a relative difference value is obtained from the first and second widths, and the first relative distance is updated to a second relative distance according to that difference. In this way the width of the target object is obtained in two different ways, and the comparison of the two widths is used to constrain the relative distance between the two movable devices, making the measured distance more accurate and improving ranging accuracy.
Exemplary method
Fig. 1 is a flowchart illustrating a method for measuring a distance to a target object according to an exemplary embodiment of the present disclosure. The embodiment can be applied to movable devices, which here include any device capable of autonomous movement, such as unmanned vehicles, unmanned aerial vehicles, robotic arms, and mobile robots.
The present embodiment applies to monocular ranging: while two movable devices travel one in front of the other, the following device measures its distance to the preceding device (the target object). Note that the following device performs ranging on images taken with its image acquisition device, so the distance between the image acquisition device and the target object is equivalent to the distance between the following device and the target object.
A ranging method of a target object described in one or more embodiments of the present application is shown in fig. 1, and includes the following steps:
Step 101, obtaining a first width of a target object in an image according to the image obtained by an image acquisition device.
The image capturing device of this embodiment may be a camera, an infrared camera, or other devices, and of course, the specific type is not limited, and any device having an image capturing function should be included in the scope of this embodiment.
The image obtained in this embodiment contains the following information: the width of the target object in the image (which may be regarded as a virtual width) and the morphology of the target object (shape, state, appearance, etc.). In addition, the image acquisition device has a focal length, which is set in advance.
The first width of the target object refers to its virtual width in the image. Since the captured image contains the target object, this virtual width can be obtained by recognizing the image. Taking an unmanned vehicle as an example (see fig. 2), the rear vehicle photographs the rear end of the front vehicle, and the virtual width of the front vehicle in the image is obtained by recognizing that image. Notably, the virtual width of the preceding vehicle can be obtained from the image even under extreme conditions (e.g., the preceding vehicle turning or driving straight in a lane).
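As a non-authoritative sketch (the patent does not specify the recognition method), the first width could be taken as the pixel width of a bounding box produced by some object detector; the detector and the box coordinates below are assumptions for illustration only.

```python
# Sketch: obtain the "first width" of the target object as the pixel width of
# its detected bounding box. Any detector returning a box
# (x_min, y_min, x_max, y_max) in pixel coordinates would serve here.

def first_width_from_bbox(bbox):
    """Return the virtual (in-image) width of the target object in pixels."""
    x_min, y_min, x_max, y_max = bbox
    return x_max - x_min

# Hypothetical example: a preceding vehicle detected between pixels x=410..790
print(first_width_from_bbox((410, 220, 790, 560)))  # 380
```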
Step 102, acquiring relevant state parameters of the target object at the previous moment.
Specifically, the relevant state parameters reflect the actual state that the target object presents during driving. Since these parameters may differ from moment to moment, the relevant state parameters at each moment include: a relevant distance parameter, a relevant speed parameter, and process noise; the process noise in turn comprises speed noise and distance noise. Furthermore, "relevant state parameters at the previous moment" in this embodiment means the parameters of the target object at the moment immediately before the current moment. By the time the target object reaches the current moment, its parameters at the previous moment have already been presented, so they can be obtained through a specific implementation that is described in detail later and not repeated here.
Wherein, the related speed parameter refers to the driving speed of the target object, and the related speed parameter at each moment is related to the related state parameter at the previous moment. The correlation velocity at a previous time may affect the correlation velocity at a later time. Taking the current time as an example, the relevant speed parameter at the current time is related to the relevant speed parameter at the previous time, the speed noise at the current time, and the like.
The relevant distance parameter refers to the distance between the image acquisition device and the target object, and the relevant distance parameter at each moment is related to the relevant state parameter at the previous moment. Taking the current time as an example, the relevant distance parameter of the current time is related to the relevant distance parameter of the previous time, the relevant speed parameter of the previous time, the time difference between the previous time and the current time, the distance noise and the like.
The velocity noise at each time is used to affect the velocity at each time. The distance noise at each time instant is used to influence the distance at each time instant, both of which are adjustable parameters.
Further, during the driving process of the target object, the relevant state parameters presented at each moment may be different, and the relevant state parameters of the target object at the previous moment may affect the relevant state parameters of the target object at the current moment. Therefore, the related state parameter at the previous time is required to be obtained as the basic parameter for obtaining the first relative distance. The specific implementation process will be described later, and will not be described herein again.
Step 103, obtaining a first relative distance according to the relevant state parameter of the target object at the previous moment.
Specifically, the first relative distance is a distance between the image acquisition device and the target object at the present time.
The obtained related state parameters can reflect the actual state of the target object in the driving process, so that the first relative distance obtained by the related state parameters can truly and accurately reflect the relative distance between the image acquisition device and the target object.
Step 104, acquiring a second width of the target object according to the first relative distance.
The second width characterizes the width of the target object in the image and is obtained from the first relative distance. The first width is obtained by recognizing the target object directly in the image, whereas the second width is obtained by mapping the real state parameters of the target object into the image; the two widths therefore have different origins.
Step 105, obtaining a relative difference value according to the first width and the second width. Comparing the two widths yields the relative difference. If the difference is small (e.g., below a preset threshold), both widths are relatively accurate; if it is large (e.g., above the preset threshold), the measured first relative distance deviates and needs further adjustment.
Step 106, updating the first relative distance to the second relative distance according to the relative difference.
Wherein, the adjustment mode is different according to the difference of the relative difference. The embodiment adopts the relative difference value as the adjustment standard, and can further optimize the relative distance with the target object.
As the above analysis shows, this embodiment determines a first width of the target object from the image acquired by the image acquisition device; then obtains the relevant state parameters of the target object at the previous moment and derives a first relative distance from them. Because the relevant state parameters truly reflect the actual state of the target object during driving, the distance between the image acquisition device and the target object at the current moment can be obtained. A second width of the target object in the image is then obtained from the first relative distance, a relative difference value is obtained from the first and second widths, and the first relative distance is adjusted to a second relative distance according to that difference. Obtaining the width of the target object in two different ways and using the comparison of the widths to constrain and adjust the relative distance makes that distance more accurate, which improves ranging accuracy and safeguards driving safety.
On the basis of the embodiment shown in fig. 1, as an optional implementation manner of this embodiment, in the process of step 102, a relevant distance parameter of the target object at a previous time, a relevant speed parameter at a previous time, and a noise error at a previous time are obtained.
This is done because, during driving, the relevant state parameters of the target object at the previous moment influence those at the current moment. Taking an unmanned vehicle as an example: with the rear vehicle's driving unchanged, if the front vehicle decelerated at the previous moment, the distance between the two vehicles decreases at the current moment. Taking the actual influence of the previous moment's state parameters into account, and obtaining the first relative distance at the current moment on that basis, improves the accuracy of the first relative distance.
On the basis of obtaining the relevant state parameter, as an optional implementation manner of this embodiment, in the process of step 103, the following operations are implemented: and obtaining a first relative distance according to the relevant distance parameter at the previous moment and the relevant speed parameter at the previous moment.
More specifically, with reference to fig. 3, the above implementation includes the following specific operation steps:
step 301, obtaining a time difference between a current time and a previous time.
The time difference between the previous time and the current time may be in units of "milliseconds", and the time difference may be any value, for example, 2ms, 5ms, and the like.
And step 302, obtaining the relative movement distance of the current moment according to the time difference and the related speed parameter of the previous moment.
The process noise can be divided into: distance noise and velocity noise. The distance noise at each moment can be used for regulating and controlling the distance precision at each moment, and the speed noise at each moment can be used for regulating and controlling the speed precision at each moment.
In addition, the relevant state parameters at each moment influence those at the immediately following moment. Therefore, calculating the relevant speed parameter at the previous moment requires the speed noise at the previous moment and the relevant speed parameter at the moment before the previous moment. Specifically, these two are summed: Vk-1 = Vk-2 + Wvk-1, where k-1 denotes the previous moment, Vk-1 the relevant speed parameter at the previous moment, Wvk-1 the speed noise at the previous moment, k-2 the moment before the previous moment (also called the previous moment of the previous moment), and Vk-2 the relevant speed parameter at that moment.
To obtain the relative movement distance at the current moment, the time difference is multiplied by the relevant speed parameter of the previous moment. Continuing the example above, the relative movement distance at the current moment is Vk-1 · Δt, where Δt denotes the time difference between the current moment and the previous moment.
In the above operation, the relative movement distance is obtained by combining the relevant state parameters at the previous time (the relevant speed parameter, the time difference, and the like at the previous time), so that the influence of the change of the relevant state parameters at the previous time on the relative movement distance can be comprehensively considered, and the accuracy of the relative movement distance at the current time can be improved.
Step 303, obtaining a first relative distance based on the relative distance parameter at the previous time, the relative movement distance at the current time, and the distance noise at the current time.
Specifically, the relevant distance parameter at the previous moment characterizes the distance between the target object and the image acquisition device at that moment, and the relative movement distance at the current moment characterizes how far the target object has moved relative to the image acquisition device during the time difference. The first relative distance is therefore obtained by summing the relevant distance parameter at the previous moment, the relative movement distance at the current moment, and the distance noise at the current moment.
Further, the distance noise at the current moment is obtained as follows: the relevant speed parameter at the previous moment is obtained, and the distance noise at the current moment is computed from that speed parameter, the time difference, and a scale coefficient. In symbols: Wsk1 = Vk-1 · Δt · s, where Wsk1 denotes the distance noise at the current moment, Vk-1 the relevant speed parameter at the previous moment, Δt the time difference between the current and previous moments, and s a constant scale coefficient.
For ease of understanding, with the notation above, the first relative distance is obtained as: Sk = Sk-1 + Vk-1 · Δt + Wsk1, where k denotes the current moment, Sk the first relative distance at the current moment, Sk-1 the relevant distance parameter at the previous moment, Vk-1 · Δt the relative movement distance at the current moment, and Wsk1 the distance noise at the current moment.
As can be seen, in the above operation, the first relative distance is obtained by combining the relevant state parameter at the previous time (the relative distance parameter, the relative movement distance, and the like at the previous time) and the distance noise at the current time, so that the influence of the change in the relevant state parameter at the previous time on the relative movement distance can be comprehensively considered, and the accuracy of the first relative distance at the current time can be improved.
Through the analysis, the relevant state parameters at the previous moment and various parameters at the current moment are comprehensively considered in the implementation process of the first relative distance, and the parameters can comprehensively reflect the real-time driving state of the target object, so that the first relative distance is obtained by taking the parameters as the basis, the relative position relation between the target object and the image acquisition device can be accurately reflected, the distance measurement accuracy can be realized, and the driving safety is further ensured.
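The prediction in steps 301–303 can be sketched in Python directly from the document's own formulas (Vk-1 = Vk-2 + Wvk-1, Wsk1 = Vk-1 · Δt · s, Sk = Sk-1 + Vk-1 · Δt + Wsk1); the numeric values in the example are illustrative only.

```python
# Sketch of the prediction step, using the document's symbols. The scale
# coefficient s and the noise terms are the adjustable parameters mentioned
# in the text; the example values below are assumptions, not patent data.

def predict_first_relative_distance(S_prev, V_prev2, Wv_prev, dt, s):
    """Predict the first relative distance Sk at the current moment.

    S_prev  : relevant distance parameter at the previous moment (Sk-1), metres
    V_prev2 : relevant speed parameter at the moment before that (Vk-2), m/s
    Wv_prev : speed noise at the previous moment (Wvk-1), m/s
    dt      : time difference between previous and current moment (Δt), seconds
    s       : constant scale coefficient for the distance noise
    """
    V_prev = V_prev2 + Wv_prev    # Vk-1 = Vk-2 + Wvk-1
    move = V_prev * dt            # relative movement distance: Vk-1 * Δt
    Ws = V_prev * dt * s          # distance noise: Wsk1 = Vk-1 * Δt * s
    return S_prev + move + Ws     # Sk = Sk-1 + Vk-1 * Δt + Wsk1

# Example: 20 m behind a target closing at 1 m/s, Δt = 5 ms, no noise
print(predict_first_relative_distance(20.0, -1.0, 0.0, 0.005, 0.0))  # 19.995
```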
On the basis of the embodiment shown in fig. 1, as an optional implementation manner of this embodiment, the step 104 specifically includes the following operations: the second width is obtained based on the projection relationship and the first relative distance.
The projection relation comprises a mapping relation between a preset reference surface and the image, and further comprises a reduction scale, which is determined from the focal length of the image acquisition device and the first relative distance. The second width is the calculated width of the target object in the image.
For convenience of explanation, the following description will be given with reference to symbols.
Let p denote the second width of the target object in the image, Sk the first relative distance between the target object and the image acquisition device, and f the focal length of the image acquisition device, which is set in advance. The reduction scale is then computed from the distance and the focal length as Sk/f. In addition, the actual width D of the target object is obtained as follows: the image is input into a preset model, which recognizes the morphology of the target object (shape, state, appearance, etc.) in the image and determines its type; the actual width of the target object then follows from its type.
The target object in the image presents its own morphology, such as shape, state, and appearance, and different target objects may have different appearances, particular shapes, or their own brand logos. Therefore, after the image is input into the preset model, the model can determine the type of the target object from its morphology.
To determine the type of the target object, a base model (such as a convolutional neural network (CNN) or an RNN) is trained in advance on the sample morphologies and sample types of a large number of related objects, yielding the preset model. The image is then input into the preset model, which processes the morphology of the target object and outputs its type; the type in turn determines the actual width of the target object. Taking an unmanned vehicle as an example, the type refers to the vehicle model: if the vehicle in fig. 2 is model A of a certain brand, its actual width is fixed.
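The type-to-actual-width lookup described above can be sketched as follows; the classifier itself is assumed to exist separately, and the vehicle types and widths below are illustrative placeholders, not real specifications.

```python
# Sketch: once a (separately trained) classifier outputs the target object's
# type, the actual width D follows from a fixed table. The entries here are
# hypothetical values for illustration only.

VEHICLE_WIDTHS_M = {
    "brand_x_model_a": 1.80,  # hypothetical model with a fixed factory width
    "brand_x_model_b": 2.05,
}

def actual_width(vehicle_type):
    """Return the real-world width D (metres) for a classified vehicle type."""
    return VEHICLE_WIDTHS_M[vehicle_type]

print(actual_width("brand_x_model_a"))  # 1.8
```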
Then, the actual width D of the target object is reduced according to the reduction scale Sk/f, giving p = D / (Sk/f) = D · f / Sk, which can be rearranged as Sk = D · f / p.
referring to fig. 4, the actual width of the target object may be reduced into the image according to a reduction scale. Specifically, the preset reference surface may be set as the ground.
Referring to fig. 5, on the basis of the embodiment shown in fig. 1, as an alternative implementation manner of this embodiment, the step 106 specifically includes the following operations:
step 501, judging whether the relative difference is smaller than a preset threshold value.
Specifically, the specific value of the preset threshold needs to be adjusted according to experience and actual conditions, and this embodiment is not limited herein.
After the determination is performed, the obtained determination result may be one of the following two results:
First, the relative difference is smaller than the preset threshold, indicating that the first width and the second width are relatively close; in this case, step 502 is performed.
Second, the relative difference is greater than or equal to the preset threshold, indicating that there may be a deviation in the first relative distance; in this case, step 503 is performed.
Step 502, if yes, the first relative distance is determined as the second relative distance.
Wherein the second relative distance is used to characterize the distance between the target object and the image acquisition device at the current time. The second relative distance is basic data for performing the subsequent driving operation, so that the requirement on the accuracy of the second relative distance is high. The higher the accuracy of the second relative distance is, the more the driving safety and the driving efficiency can be ensured. If the relative difference is smaller than the preset threshold, it indicates that the accuracy of the obtained first relative distance is higher, so that the first relative distance can be directly determined as the second relative distance.
Therefore, the first relative distance between the two mobile devices is constrained by the relative difference value of the first width and the second width and the preset threshold value so as to obtain the second relative distance, the obtained second relative distance can be used for representing the relative position relation between the target object and the image acquisition device more accurately, the distance measurement accuracy can be realized, and the driving safety is further guaranteed.
Step 503, if not, adjusting the distance noise at the current moment according to the relative difference, and obtaining a second relative distance according to the adjusted distance noise at the current moment and the relevant state parameter of the target object.
The difference between the third width, obtained from the second relative distance, and the first width is smaller than the preset threshold.
Specifically, the relative difference and the distance noise have a mapping relationship. The corresponding distance noise is obtained from the mapping relationship according to the relative difference, and the distance noise at the current time is then adjusted accordingly. For example, the distance noise Wsk1 at the current time is adjusted to the adjusted distance noise Wsk2.
After the adjustment, the second relative distance may be obtained in the same manner as the first relative distance. Specifically, the second relative distance is obtained by summing the relative distance parameter at the previous time, the relative movement distance at the current time, and the adjusted distance noise. Following the above equation, the second relative distance is Sk' = Sk-1 + Vk·Δt + Wsk2, where Sk' represents the second relative distance and Wsk2 represents the adjusted distance noise at the current time.
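The prediction step above can be written directly (a sketch; Sk-1, Vk, Δt, and the adjusted noise Wsk2 are the quantities named in the text):

```python
def second_relative_distance(s_prev: float, v_k: float, dt: float, w_sk2: float) -> float:
    # Sk' = Sk-1 + Vk * Δt + Wsk2
    return s_prev + v_k * dt + w_sk2
```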
Further, the focal length of the image acquisition device is obtained, and the focal length, the actual width of the target object, and the second relative distance are processed to obtain a third width, i.e., the width of the target object in the image as implied by the second relative distance. This embodiment requires that the difference between the third width and the first width be smaller than a preset threshold. That is, this embodiment uses the condition that the difference between the third width and the first width is smaller than the preset threshold as a constraint on the second relative distance, so that the obtained second relative distance represents the relative position relationship between the target object and the image acquisition device more accurately; thus, the accuracy of distance measurement can be achieved and driving safety further ensured.
It should be noted that, in order to obtain the second relative distance more accurately, the condition that the difference between the third width and the first width is smaller than the preset threshold may be used as a constraint, and the distance noise may be adjusted and the above steps repeated multiple times until the constraint is satisfied.
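The repeated adjust-and-check procedure can be sketched as follows. The proportional noise-update rule and the step gain are assumptions for illustration; the text only specifies that the noise is adjusted from the relative difference until the width constraint is met.

```python
def constrain_distance(measured_width, actual_width_d, focal_f,
                       s_prev, v_k, dt, noise, threshold,
                       gain=500.0, max_iter=50):
    """Adjust the distance noise until the reprojected (third) width
    matches the measured (first) width within the threshold."""
    s = s_prev + v_k * dt + noise                    # first relative distance
    for _ in range(max_iter):
        reprojected = actual_width_d * focal_f / s   # third width
        diff = reprojected - measured_width
        if abs(diff) < threshold:
            break
        noise += gain * diff                         # assumed proportional update
        s = s_prev + v_k * dt + noise                # second relative distance
    return s
```

With a well-chosen gain, the loop converges because a positive width difference (reprojected width too large) means the distance estimate is too small, so the noise, and hence the distance, is increased.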
Exemplary devices
Fig. 6 illustrates a block diagram of an apparatus 600 for measuring a target object distance according to an embodiment of the present application.
As shown in fig. 6, an apparatus 600 for measuring a target object distance according to an embodiment of the present application includes: a first obtaining module 610, configured to obtain a first width of the target object in an image according to the image obtained by the image acquisition apparatus; a second obtaining module 620, configured to obtain a relevant state parameter of the target object at a previous time; a first processing module 630, configured to obtain a first relative distance according to the relevant state parameter of the target object at the previous time, where the first relative distance is the distance between the image capturing apparatus and the target object at the current time; a second processing module 640, configured to obtain a second width of the target object in the image according to the first relative distance; a comparing module 650, configured to obtain a relative difference according to the first width and the second width; and an adjusting module 660, configured to update the first relative distance to a second relative distance according to the relative difference.
In one example, the obtaining of the relevant state parameter of the target object at the previous time includes: and acquiring a relevant distance parameter of the target object at the previous moment and a relevant speed parameter of the target object at the previous moment.
In an example, the first processing module 630 is specifically configured to: and obtaining the first relative distance according to the relevant distance parameter at the previous moment and the relevant speed parameter at the previous moment.
Fig. 7 illustrates an example block diagram of a first processing module 630 according to an embodiment of this application. As shown in fig. 7, in an example, the first processing module 630 specifically includes: a third obtaining module 710, configured to obtain a time difference between the previous time and the current time; a fourth obtaining module 720, configured to obtain the relative movement distance of the current time according to the time difference and the related speed parameter of the previous time; a fifth obtaining module 730, configured to obtain the first relative distance according to the relative distance parameter at the previous time, the relative movement distance at the current time, and the distance noise at the current time.
In one example, the distance noise at the current time is obtained by: and obtaining the distance noise at the current moment according to the related speed parameter, the time difference and the scale coefficient at the previous moment.
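One plausible reading of this step can be sketched as follows. The multiplicative form is an assumption: the text names the three inputs (previous-time speed parameter, time difference, scale coefficient) without giving their exact combination.

```python
def distance_noise(speed_prev: float, dt: float, scale_coeff: float) -> float:
    # Assumed form: Wsk = scale_coeff * V(k-1) * Δt
    # (the inputs are named in the text; the product form is illustrative)
    return scale_coeff * speed_prev * dt
```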
In an example, the second processing module 640 is specifically configured to obtain the second width based on a projection relationship and the first relative distance, where the projection relationship includes a mapping relationship between a preset reference surface and an image.
FIG. 8 illustrates an example block diagram of an adjustment module 660 in accordance with an embodiment of this application. As shown in fig. 8, in one example, the adjustment module 660 includes: a determining module 810, configured to determine whether the relative difference is smaller than a preset threshold; a first adjusting submodule 820, configured to determine, if yes, the first relative distance as a second relative distance; a second adjusting submodule 830, configured to, if not, adjust the distance noise at the current time according to the relative difference, and obtain the second relative distance according to the adjusted distance noise at the current time and the relevant state parameter of the target object; and the difference value between the first width and the third width obtained by the second relative distance meets the preset difference value range.
Exemplary electronic device
The electronic device of the embodiment of the present application may be either or both of the first device 120 and the second device 121, or a stand-alone device independent of them; such a stand-alone device may communicate with the first device and the second device to receive acquired input signals from them.
FIG. 9 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
As shown in fig. 9, the electronic device 10 includes one or more processors 11 and memory 12.
The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 10 to perform desired functions.
In one example, the electronic device 10 may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
For example, when the electronic device is the first device 100 or the second device 200, the input device 13 may be a microphone or a microphone array as described above for capturing an input signal of a sound source. When the electronic device is a stand-alone device, the input means 13 may be a communication network connector for receiving the acquired input signals from the first device 100 and the second device 200.
The input device 13 may also include, for example, a keyboard, a mouse, and the like.
The output device 14 may output various information including determined distance information, direction information, and the like to the outside. The output devices 14 may include, for example, a display, speakers, a printer, and a communication network and its connected remote output devices, among others.
Of course, for simplicity, only some of the components of the electronic device 10 relevant to the present application are shown in fig. 9, and components such as buses, input/output interfaces, and the like are omitted. In addition, the electronic device 10 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer-readable storage medium
In addition to the above-described methods and apparatus, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in a method of measuring a target object distance according to various embodiments of the present application described in the "exemplary methods" section of this specification above.
The computer program product may be written with program code for performing the operations of embodiments of the present application in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps in a method of measuring a target object distance according to various embodiments of the present application described in the "exemplary methods" section above in this specification.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present application in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present application are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the foregoing disclosure is not intended to be exhaustive or to limit the disclosure to the precise details disclosed.
The block diagrams of devices, apparatuses, and systems referred to in this application are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. These devices, apparatuses, and systems may be connected, arranged, and configured in any manner, as will be appreciated by those skilled in the art. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to."
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.
Claims (10)
1. A method of measuring a target object distance, the method comprising:
according to an image acquired by an image acquisition device, acquiring a first width of the target object in the image;
acquiring relevant state parameters of the target object at the previous moment;
obtaining a first relative distance according to the relevant state parameter of the target object at the previous moment, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment;
obtaining a second width of the target object in the image according to the first relative distance;
obtaining a relative difference value according to the first width and the second width;
and updating the first relative distance to be a second relative distance according to the relative difference.
2. The method of claim 1, wherein the obtaining of the relevant state parameter of the target object at the previous time comprises: and acquiring a relevant distance parameter of the target object at the previous moment and a relevant speed parameter of the target object at the previous moment.
3. The method of claim 2, wherein said obtaining a first relative distance from the relevant state parameter of the target object comprises:
and obtaining the first relative distance according to the relevant distance parameter at the previous moment and the relevant speed parameter at the previous moment.
4. The method of claim 3, wherein the obtaining the first relative distance from the distance-related parameter at the previous time and the speed-related parameter at the previous time comprises:
obtaining a time difference between the previous time and the current time;
obtaining the relative movement distance of the current moment according to the time difference and the related speed parameter of the previous moment;
and obtaining the first relative distance according to the relative distance parameter of the previous moment, the relative movement distance of the current moment and the distance noise of the current moment.
5. The method of claim 4, wherein the distance noise at the current time is obtained by:
and obtaining the distance noise of the current moment according to the related speed parameter, the time difference and the scale coefficient of the previous moment.
6. The method of claim 1, wherein
the obtaining a second width of the target object in the image according to the first relative distance includes:
and obtaining the second width based on a projection relation and the first relative distance, wherein the projection relation comprises a mapping relation between a preset reference surface and an image.
7. The method of claim 1, wherein said updating the first relative distance to the second relative distance according to the relative difference comprises:
judging whether the relative difference value is smaller than a preset threshold value or not;
if so, determining the first relative distance as the second relative distance;
if not, adjusting the distance noise at the current moment according to the relative difference value, and obtaining the second relative distance according to the adjusted distance noise at the current moment and the relevant state parameter of the target object; and the difference value between the first width and the third width obtained by the second relative distance meets the preset difference value range.
8. An apparatus for measuring a target object distance, comprising:
the first obtaining module is used for obtaining a first width of the target object in the image according to the image obtained by the image acquisition device;
the second obtaining module is used for obtaining the relevant state parameters of the target object at the previous moment;
the first processing module is used for obtaining a first relative distance according to the relevant state parameter of the target object at the previous moment, wherein the first relative distance is the distance between the image acquisition device and the target object at the current moment;
the second processing module is used for obtaining a second width of the target object in the image according to the first relative distance;
the comparison module is used for obtaining a relative difference value according to the first width and the second width;
and the adjusting module is used for updating the first relative distance into a second relative distance according to the relative difference.
9. An electronic device, comprising:
a processor; and
a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the method of any of claims 1-7.
10. A computer-readable storage medium, the storage medium storing a computer program for performing the method of any of the preceding claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911015686.0A CN110672074A (en) | 2019-10-24 | 2019-10-24 | Method and device for measuring distance of target object |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110672074A true CN110672074A (en) | 2020-01-10 |
Family
ID=69083790
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911015686.0A Pending CN110672074A (en) | 2019-10-24 | 2019-10-24 | Method and device for measuring distance of target object |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110672074A (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101234601A (en) * | 2007-01-30 | 2008-08-06 | 南京理工大学 | Automobile cruise control method based on monocular vision and implement system thereof |
CN102508246A (en) * | 2011-10-13 | 2012-06-20 | 吉林大学 | Method for detecting and tracking obstacles in front of vehicle |
US20120200707A1 (en) * | 2006-01-04 | 2012-08-09 | Mobileye Technologies Ltd. | Estimating distance to an object using a sequence of images recorded by a monocular camera |
CN104197901A (en) * | 2014-09-19 | 2014-12-10 | 成都翼比特科技有限责任公司 | Image distance measurement method based on marker |
CN106610294A (en) * | 2015-10-27 | 2017-05-03 | 高德信息技术有限公司 | Positioning method and device |
WO2017122641A1 (en) * | 2016-01-15 | 2017-07-20 | 富士フイルム株式会社 | Measurement assistance device and measurement assistance method |
CN107966700A (en) * | 2017-11-20 | 2018-04-27 | 天津大学 | A kind of front obstacle detecting system and method for pilotless automobile |
CN108872975A (en) * | 2017-05-15 | 2018-11-23 | 蔚来汽车有限公司 | Vehicle-mounted millimeter wave radar filtering estimation method, device and storage medium for target following |
CN109655823A (en) * | 2018-12-30 | 2019-04-19 | 北京经纬恒润科技有限公司 | The tracking and device of target |
CN110361003A (en) * | 2018-04-09 | 2019-10-22 | 中南大学 | Information fusion method, device, computer equipment and computer readable storage medium |
Non-Patent Citations (1)
Title |
---|
Pang Cheng, "Front vehicle detection system based on fusion of ranging radar and machine vision data," China Master's Theses Full-text Database, Information Science and Technology series (monthly). |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113188509A (en) * | 2021-04-28 | 2021-07-30 | 上海商汤临港智能科技有限公司 | Distance measuring method and device, electronic equipment and storage medium |
WO2022227708A1 (en) * | 2021-04-28 | 2022-11-03 | 上海商汤智能科技有限公司 | Ranging method and apparatus, electronic device, and storage medium |
CN113188509B (en) * | 2021-04-28 | 2023-10-24 | 上海商汤临港智能科技有限公司 | Distance measurement method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10442435B2 (en) | Speed control parameter estimation method for autonomous driving vehicles | |
CN113022580B (en) | Trajectory prediction method, trajectory prediction device, storage medium and electronic equipment | |
US20200116867A1 (en) | Automatic lidar calibration based on pre-collected static reflection map for autonomous driving | |
EP3315388A1 (en) | Spring system-based change lane approach for autonomous vehicles | |
US10407076B2 (en) | Method and system for determining road frictions of autonomous driving vehicles using learning-based model predictive control | |
US10272778B2 (en) | Method and system for determining unit gain of speed control for autonomous driving vehicles | |
KR102610001B1 (en) | System for sensor synchronization data analysis in autonomous vehicles | |
US20200174486A1 (en) | Learning-based dynamic modeling methods for autonomous driving vehicles | |
EP2533226A1 (en) | Vehicle surroundings monitoring device | |
EP3608687A1 (en) | Vehicle advanced assisted driving calibration device | |
JP6479272B1 (en) | Gaze direction calibration apparatus, gaze direction calibration method, and gaze direction calibration program | |
KR20210037790A (en) | Autonomous driving apparatus and method | |
US20220289184A1 (en) | Method and Device for Scheduling a Trajectory of a Vehicle | |
CN110673123B (en) | Target object ranging method and device | |
CN110672074A (en) | Method and device for measuring distance of target object | |
CN111337010B (en) | Positioning method and positioning device of movable equipment and electronic equipment | |
US11214251B2 (en) | Speed control command auto-calibration system for autonomous vehicles | |
CN112304293B (en) | Road height detection method and device, readable storage medium and electronic equipment | |
CN112115739A (en) | Vehicle state quantity information acquisition method and device | |
CN108961337B (en) | Vehicle-mounted camera course angle calibration method and device, electronic equipment and vehicle | |
CN112085786A (en) | Pose information determination method and device | |
CN116182905A (en) | Laser radar and combined inertial navigation space-time external parameter calibration method, device and system | |
EP3669247A1 (en) | Method for evaluating localization system of autonomous driving vehicles | |
CN111212239B (en) | Exposure time length adjusting method and device, electronic equipment and storage medium | |
US20230326091A1 (en) | Systems and methods for testing vehicle systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||