CN110673609A - Vehicle running control method, device and system - Google Patents
Vehicle running control method, device and system
- Publication number
- CN110673609A (application number CN201910960468.8A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- image information
- state
- target
- running state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0242—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0225—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0253—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a vehicle running control method, device and system. The method comprises the following steps: acquiring image information of a first vehicle, wherein the first vehicle has a target positional relationship with a second vehicle to be controlled; identifying the image information to obtain a first running state of the first vehicle; and controlling the second vehicle to enter a second running state based on the first running state. The invention solves the technical problem that adjusting the driving state only according to the vehicle's own condition obviously cannot meet safety requirements, and that the vehicle's driving state cannot be adjusted according to the surrounding vehicle environment and the driving states of other vehicles.
Description
Technical Field
The invention relates to the field of vehicle control, in particular to a method, a device and a system for controlling vehicle running.
Background
With the continuous development of vehicle automation control, automatic vehicle identification and unmanned driving have become a new application trend and are increasingly used in daily driving. At present, during unmanned driving, the control system can only adjust the driving state of its own vehicle. When the vehicle runs on a road with a complex surrounding environment, such as congested sections during morning and evening rush hours, there are many vehicles nearby, the following distances are short, the vehicle in front brakes frequently, and other vehicles may even merge or turn without warning. In such cases, adjusting the driving state based only on the vehicle's own condition obviously cannot meet safety requirements, and the vehicle cannot adjust its own driving state according to the surrounding vehicle environment and the driving states of other vehicles.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The invention mainly aims to provide a vehicle running control method, device and system, so as to solve the technical problem that adjusting the driving state only according to the vehicle's own condition obviously cannot meet safety requirements, and that the vehicle's driving state cannot currently be adjusted according to the surrounding vehicle environment and the driving states of other vehicles.
In order to achieve the above object, according to one aspect of the present invention, there is provided a vehicle running control method, including: acquiring image information of a first vehicle, wherein the first vehicle has a target positional relationship with a second vehicle to be controlled; identifying the image information to obtain a first running state of the first vehicle; and controlling the second vehicle to enter a second running state based on the first running state.
Optionally, controlling the second vehicle to enter the second running state based on the first running state comprises: determining a running control operation corresponding to the first running state; and controlling the second vehicle to enter the second running state in response to the running control operation.
Optionally, after the second vehicle is controlled to enter the second running state, the first vehicle and the second vehicle are in a safe running state.
Optionally, identifying the image information to obtain the first running state of the first vehicle includes: matching the image information with target image information; and determining the running state indicated by the target image information as the first running state when the image information is successfully matched with the target image information.
Optionally, the method further comprises: determining that the image information and the target image information are successfully matched in a case where the similarity between the image information and the target image information is greater than a target threshold.
Optionally, the acquiring the image information of the first vehicle comprises: image information of a target component of the first vehicle is acquired, wherein the operating state of the target component is used for indicating the type of the running state of the first vehicle.
Optionally, identifying the image information to obtain the first running state of the first vehicle includes: identifying the image information of the target component to determine the working state of the target component; and determining a first running state of the type corresponding to the working state of the target component.
Optionally, the target position relationship includes: the first vehicle is located at a target position of the second vehicle; and/or the first vehicle is within a target distance range of the second vehicle.
According to another aspect of the present invention, there is provided a control method of vehicle travel, including: displaying image information of a first vehicle, wherein the first vehicle and a second vehicle to be controlled have a target position relationship; obtaining a first driving state of the first vehicle by identifying the image information, and obtaining a control instruction based on the first driving state; and sending a control command to the second vehicle so that the second vehicle enters a second running state.
Optionally, after the second vehicle enters the second driving state, the method further comprises: and outputting prompt information, wherein the prompt information is used for indicating that the first vehicle and the second vehicle are in a safe driving state.
According to another aspect of the present invention, there is provided a vehicle including: a display for displaying image information of a first vehicle, wherein the first vehicle has a target position relationship with a second vehicle to be controlled; the controller is used for obtaining a first running state of the first vehicle by identifying the image information and obtaining a control instruction based on the first running state; and the sending device is used for sending a control command to the second vehicle so that the second vehicle enters a second running state.
Optionally, the vehicle further comprises: and the output device is used for outputting prompt information after the second vehicle enters a second running state, wherein the prompt information is used for indicating that the first vehicle and the second vehicle are in a safe running state.
According to another aspect of the present invention, there is provided a control system for vehicle travel, including: the first controller is installed on the first vehicle and used for capturing and recognizing image information of the first vehicle, obtaining a first running state of the first vehicle and obtaining a control instruction based on the first running state; the second controller is installed on the second vehicle and used for receiving the control instruction and controlling the second vehicle to enter a second running state based on the control instruction; wherein the first vehicle and the second vehicle have a target positional relationship.
Optionally, the system comprises: the first output device is arranged on the first vehicle and used for outputting first prompt information, wherein the first prompt information is used for indicating that the first vehicle is in a safe driving state; and the second output device is arranged on the second vehicle and used for outputting second prompt information, wherein the second prompt information is used for indicating that the second vehicle is in a safe driving state.
According to another aspect of the present invention, there is provided a control apparatus for running of a vehicle, including: a first acquisition unit configured to acquire image information of a first vehicle, wherein the first vehicle has a target positional relationship with a second vehicle to be controlled; the identification unit is used for identifying the image information to obtain a first running state of the first vehicle; and a control unit for controlling the second vehicle to enter the second running state based on the first running state.
According to another aspect of the present invention, there is provided a control apparatus for running of a vehicle, including: a display unit for displaying image information of a first vehicle, wherein the first vehicle has a target position relationship with a second vehicle to be controlled; the second acquisition unit is used for acquiring a first running state of the first vehicle by identifying the image information and acquiring a control instruction based on the first running state; and the sending unit is used for sending a control command to the second vehicle so that the second vehicle enters a second running state.
According to another aspect of the present invention, there is provided a storage medium characterized in that the storage medium includes a stored program, wherein the apparatus on which the storage medium is located is controlled to execute the method when the program runs.
According to another aspect of the present invention, a processor is provided, wherein the processor is configured to execute a program, and wherein the program executes the method.
According to the invention, image information of another vehicle is captured and identified, matched with preset image information, and the driving state of the own vehicle is changed according to the matching result, thereby solving the technical problem that adjusting the driving state only according to the vehicle's own condition obviously cannot meet safety requirements, and that the vehicle's driving state cannot be adjusted according to the surrounding vehicle environment and the driving states of other vehicles.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a control method of vehicle running according to an embodiment of the invention;
FIG. 2 is an alternative method of controlling vehicle travel according to an embodiment of the present invention;
FIG. 3 is a block diagram of a vehicle according to an embodiment of the present invention;
fig. 4 is a block diagram of a control system for vehicle running according to an embodiment of the present invention;
fig. 5 is a block diagram of a control apparatus for vehicle running according to an embodiment of the present invention; and
fig. 6 is a block diagram of an alternative vehicle travel control apparatus according to an embodiment of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application described herein can be implemented in an order other than that illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a control method of vehicle running according to an embodiment of the present invention, and as shown in fig. 1, the method may include the steps of:
step S102, image information of a first vehicle is obtained, wherein the first vehicle and a second vehicle to be controlled have a target position relation.
Specifically, the image information of the first vehicle may be an image of the driving state of a surrounding vehicle, or may be an image of a surrounding vehicle's lights, and the image of the lights can reveal the driving state of that vehicle. For example, if the yellow lamps on both sides of the tail light of the preceding vehicle are detected to be on, it may be determined that the preceding vehicle is braking or decelerating. The first vehicle may be any vehicle around the second vehicle that has a positional relationship with it. For example, if the second vehicle is vehicle A, the vehicle to be controlled in the embodiment of the present invention, and the first vehicle is vehicle B located directly in front of vehicle A, then vehicle A and vehicle B have a front-and-rear positional relationship.
The target positional relationship may be a positional relationship of the vehicle in front and rear directions, or may be a positional relationship of an arbitrary position around the vehicle with control.
It should be noted that the image information of the first vehicle is acquired by a high-definition camera, which focuses according to the position of the first vehicle and captures and stores the driving state of the first vehicle in the form of an image. In addition, when the vehicle travels at night the light is dim, so the high-definition camera needs to integrate an infrared imaging function so that the image information of the first vehicle can still be accurately acquired under low-light conditions.
Optionally, the acquiring the image information of the first vehicle comprises: image information of a target component of the first vehicle is acquired, wherein the operating state of the target component is used for indicating the type of the running state of the first vehicle.
Specifically, the image of the first vehicle is acquired by the image acquisition device of the vehicle to be controlled, and the image may be acquired according to the working state of a certain component of the first vehicle, where this component may be the vehicle's tail lamp; that is, the driving state of the vehicle, such as whether it is braking or steering, is determined from the working state of the tail lamp.
It should be noted that, when the vehicle component to be examined is the tail lamp, a comparator function needs to be integrated on the processor of the controlled vehicle, and a standard picture needs to be set for comparison with the acquired tail-lamp picture, where the standard picture records the brightness of the preceding vehicle's tail lamp when that vehicle is neither braking nor steering. Therefore, when the acquired image shows that the tail-lamp brightness is greater than that of the standard picture, the controlled vehicle can determine that the driving state of the preceding vehicle has changed to braking or steering.
For example, when the image capturing device of vehicle A detects that the brightness of the tail lamps of vehicle B has changed, which indicates that vehicle B is braking or steering, vehicle A may further analyze and process the tail-lamp condition of vehicle B captured by its image capturing device.
Optionally, identifying the image information to obtain the first driving state of the first vehicle includes: identifying the image information of the target component to determine the working state of the target component; and determining a first driving state of the type corresponding to the working state of the target component.
Specifically, the driving state of the first vehicle may be braking, steering or normal driving, and the target component may be a tail lamp of the first vehicle. When the image capture device of the second vehicle captures that the brightness of the first vehicle's tail lamp is consistent with the brightness of the standard picture in the above embodiment (or consistent within an allowable error range), it may be determined that the tail lamp is off or only the position lamp is on, in which case the driving state of the first vehicle is normal driving. The tail lamp of the first vehicle lights up to different degrees when the first vehicle needs to brake and decelerate or decelerates to turn; for example, when the first vehicle turns left, the yellow indicator lamp on the left side of the tail lamp flashes at a fixed frequency, and when the first vehicle takes a braking measure, the yellow indicator lamps on both sides of the tail lamp light up simultaneously. When it is determined that a yellow tail-lamp indicator of the first vehicle is on, i.e., that the tail-lamp brightness differs from that of the standard picture, it is further analyzed whether the specific driving state of the preceding vehicle is braking or steering.
It should be noted that, because the tail-lamp brightness of different vehicles often differs, in this embodiment a real-time comparison and a standard-picture comparison are usually performed at the same time in order to accurately determine the driving state of the first vehicle. The real-time comparison means that the brightness of the preceding vehicle's tail lights is acquired in real time and stored locally; when the brightness of the preceding vehicle's tail lights becomes lower, the locally stored minimum tail-light brightness value is updated; and when the tail-light brightness acquired by the image acquisition device exceeds the locally stored minimum brightness value by more than 1.5 times, the driving state of the preceding vehicle is determined to be steering or braking.
It should be noted that the target component may also be the rear bumper of the preceding vehicle. The distance to the preceding vehicle's bumper is acquired by a radar acquisition device of the second vehicle (the vehicle to be controlled); when this distance shortens, the driving state of the preceding vehicle is determined to be braking or steering, and the working state of the bumper may be its distance parameter relative to the second vehicle.
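As a non-limiting illustration of the tail-lamp comparison described above, the following sketch combines the real-time comparison with a stored baseline; the class and parameter names (TailLightMonitor, BRAKE_FACTOR, standard_brightness) are assumptions introduced here for clarity and do not appear in the patent.

```python
# Minimal sketch of the tail-light brightness comparison described above.
# All names are illustrative assumptions, not identifiers from the patent.

class TailLightMonitor:
    """Tracks the preceding vehicle's tail-light brightness and flags braking/steering."""

    BRAKE_FACTOR = 1.5  # brightness ratio threshold mentioned in the description

    def __init__(self, standard_brightness: float):
        # brightness of the tail lamp when the preceding vehicle is neither
        # braking nor steering (the "standard picture")
        self.standard_brightness = standard_brightness
        self.min_brightness = standard_brightness  # locally stored minimum value

    def update(self, measured_brightness: float) -> str:
        """Return an assumed state label for the preceding vehicle."""
        # keep the locally stored minimum up to date as lighting conditions change
        if measured_brightness < self.min_brightness:
            self.min_brightness = measured_brightness
        # brightness well above the stored minimum suggests brake/turn lamps are lit
        if measured_brightness > self.BRAKE_FACTOR * self.min_brightness:
            return "braking_or_steering"
        return "normal_driving"


# usage sketch
monitor = TailLightMonitor(standard_brightness=40.0)
print(monitor.update(38.0))   # normal_driving (also lowers the stored minimum)
print(monitor.update(95.0))   # braking_or_steering
```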
Optionally, the target position relationship includes: the first vehicle is located at a target position of the second vehicle; and/or the first vehicle is within a target distance range of the second vehicle.
Specifically, the first vehicle being located at a target position of the second vehicle means that the first vehicle is located in the traveling direction of the second vehicle. For example, if the second vehicle travels north, the first vehicle is on the north side of the second vehicle, i.e., within a certain distance ahead of the second vehicle.
In addition, the first vehicle being within the target distance range of the second vehicle means that the first vehicle may be at any position around the second vehicle, front, rear, left or right, as long as it stays within a certain distance of the second vehicle. For example, a first vehicle 50 meters to the left of the second vehicle is within the target distance range of the second vehicle, where the target distance may be set according to actual conditions and is not specifically limited herein.
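For illustration only, a minimal sketch of the target-distance check described above might look as follows; the coordinate convention and the 50-meter default are assumptions based on the example given, not requirements of the patent.

```python
# Hypothetical check of the target positional relationship between two vehicles.
import math

def has_target_position_relationship(first_xy: tuple, second_xy: tuple,
                                     target_distance: float = 50.0) -> bool:
    """True when the first vehicle is within the target distance range of the second vehicle."""
    dx = first_xy[0] - second_xy[0]
    dy = first_xy[1] - second_xy[1]
    return math.hypot(dx, dy) <= target_distance

# e.g. a vehicle 50 m to the left of the controlled vehicle is within range
print(has_target_position_relationship((-50.0, 0.0), (0.0, 0.0)))  # True
```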
And step S104, identifying the image information to obtain a first running state of the first vehicle.
Optionally, identifying the image information to obtain the first driving state of the first vehicle includes: matching the image information with target image information; and determining the running state indicated by the target image information as the first running state when the image information is successfully matched with the target image information.
Specifically, the target image may be the standard picture mentioned in the embodiment of the present invention, that is, image information of the first vehicle when the first vehicle is traveling normally. In this embodiment, the image information acquired in real time is matched against the target image; when the image information is consistent with or similar to the target image information, it can be determined that the driving state of the first vehicle is consistent with the driving state indicated by the target image information, i.e., the first vehicle is driving normally.
When the image information is matched with the target image, the parameters may be completely matched or matched within a certain error. For example, when the brightness of the front vehicle tail light is a, the local vehicle compares a with the locally stored target image information b, and when the absolute value of (a-b) is smaller than a certain threshold, it may be determined that a is similar to or consistent with b, and thus, it may be determined that the driving state of the front vehicle is normal driving.
Optionally, the method further comprises: determining that the image information and the target image information are successfully matched in a case where the similarity between the image information and the target image information is greater than a target threshold.
Specifically, the target image may be the standard picture mentioned in the embodiment of the present invention, that is, image information of the first vehicle when the first vehicle is traveling normally. In this embodiment, the image information acquired in real time is matched against the target image; when the difference between the image information and the target image information exceeds the target threshold, it can be determined that the running state of the first vehicle is inconsistent with the running state indicated by the target image information, i.e., the first vehicle is braking or steering. Here the target threshold may be a preset degree-of-difference value; when the image information parameter of the first vehicle exceeds this value, it can be determined that the image information of the first vehicle does not match the target image information.
For example, when the brightness of the front vehicle tail light is a, the local vehicle compares a with the locally stored target image information b, and when the absolute value of (a-b) is greater than a certain threshold, it may be determined that a is inconsistent with b, and thus it is determined that the driving state of the front vehicle is braking or steering.
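A minimal sketch of this matching step is given below, under the assumption that the compared image information is a single brightness value as in the example above; the function name and the default threshold are illustrative only.

```python
# Illustrative sketch of matching live image information against the stored
# target image information; parameter names and the threshold are assumptions.

def match_with_target(brightness: float, target_brightness: float,
                      threshold: float = 10.0) -> str:
    """Compare live image information against the stored target image information."""
    if abs(brightness - target_brightness) < threshold:
        # successfully matched: adopt the state indicated by the target image
        return "normal_driving"
    # difference exceeds the target threshold: state differs from the target image
    return "braking_or_steering"

print(match_with_target(42.0, 40.0))   # normal_driving
print(match_with_target(95.0, 40.0))   # braking_or_steering
```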
And S106, controlling the second vehicle to enter a second running state based on the first running state.
Optionally, controlling the second vehicle to enter the second driving state based on the first driving state comprises: determining a travel control operation corresponding to the first travel state; the second vehicle is controlled to enter the second running state in response to the running control operation.
Specifically, the first driving state is the driving state of the first vehicle, which may be normal driving, braking or steering. The second vehicle is controlled according to the driving state of the first vehicle obtained in step S104, so that it takes corresponding measures with respect to the driving state of the first vehicle and thereby avoids danger. The running control operation corresponding to the first running state refers to the change that should be made to the running state of the second vehicle according to the first running state, and this control operation is then executed.
For example, if the first driving state of the first vehicle is braking, that is, the first vehicle is in the process of decelerating, the second vehicle performs the own-vehicle control operation according to the information that the first vehicle is braking, that is, the second vehicle is controlled to perform braking as well, so as to avoid an accident.
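Purely as an illustration of how a recognized first running state could be mapped to a running control operation for the second vehicle, consider the following sketch; the specific state-to-operation table is an assumption and is not prescribed by the patent.

```python
# Hypothetical mapping from the first vehicle's recognized state to a control
# operation for the second vehicle; the table below is illustrative only.

FIRST_STATE_TO_OPERATION = {
    "normal_driving": "maintain_speed",
    "braking": "brake",            # brake as well to keep a safe distance
    "steering": "decelerate",      # slow down while the preceding vehicle turns/merges
}

def control_second_vehicle(first_running_state: str) -> str:
    """Determine the running control operation corresponding to the first running state."""
    operation = FIRST_STATE_TO_OPERATION.get(first_running_state, "decelerate")
    # the second vehicle enters its second running state in response to this operation
    return operation

print(control_second_vehicle("braking"))  # brake
```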
Optionally, after the second vehicle is controlled to enter the second running state, the first vehicle and the second vehicle are in a safe running state.
Specifically, the safe driving state means that the first vehicle and the second vehicle do not have a traffic accident and keep a safe distance; this is the result of the second vehicle entering the second driving state, i.e., driving safety is ensured.
Fig. 2 is an alternative control method for vehicle driving according to an embodiment of the present invention, as shown in fig. 2, the method may include the steps of:
step S202, displaying image information of a first vehicle, wherein the first vehicle and a second vehicle to be controlled have a target position relation.
Specifically, the image information of the first vehicle may be an image of the driving state of a surrounding vehicle, or may be an image of a surrounding vehicle's lights, and the image of the lights can reveal the driving state of that vehicle. For example, if the yellow lamps on both sides of the tail light of the preceding vehicle are detected to be on, it may be determined that the preceding vehicle is braking or decelerating. The first vehicle may be any vehicle around the second vehicle that has a positional relationship with it. For example, if the second vehicle is vehicle A, the vehicle to be controlled in the embodiment of the present invention, and the first vehicle is vehicle B located directly in front of vehicle A, then vehicle A and vehicle B have a front-and-rear positional relationship.
The target positional relationship may be a positional relationship of the vehicle in front and rear directions, or may be a positional relationship of an arbitrary position around the vehicle with control.
It should be noted that the image information of the first vehicle is acquired by a high-definition camera, which focuses according to the position of the first vehicle and captures and stores the driving state of the first vehicle in the form of an image. In addition, when the vehicle travels at night the light is dim, so the high-definition camera needs to integrate an infrared imaging function so that the image information of the first vehicle can still be accurately acquired under low-light conditions.
Optionally, the acquiring the image information of the first vehicle comprises: image information of a target component of the first vehicle is acquired, wherein the operating state of the target component is used for indicating the type of the running state of the first vehicle.
Specifically, the image of the first vehicle is acquired by the image acquisition device of the vehicle to be controlled, and the image may be acquired according to the working state of a certain component of the first vehicle, where this component may be the vehicle's tail lamp; that is, the driving state of the vehicle, such as whether it is braking or steering, is determined from the working state of the tail lamp.
It should be noted that, when the vehicle component to be examined is the tail lamp, a comparator function needs to be integrated on the processor of the controlled vehicle, and a standard picture needs to be set for comparison with the acquired tail-lamp picture, where the standard picture records the brightness of the preceding vehicle's tail lamp when that vehicle is neither braking nor steering. Therefore, when the acquired image shows that the tail-lamp brightness is greater than that of the standard picture, the controlled vehicle can determine that the driving state of the preceding vehicle has changed to braking or steering.
For example, when the image capturing device of vehicle A detects that the brightness of the tail lamps of vehicle B has changed, which indicates that vehicle B is braking or steering, vehicle A may further analyze and process the tail-lamp condition of vehicle B captured by its image capturing device.
Optionally, identifying the image information to obtain the first driving state of the first vehicle includes: identifying the image information of the target component to determine the working state of the target component; and determining a first driving state of the type corresponding to the working state of the target component.
Specifically, the driving state of the first vehicle may be braking, steering or normal driving, and the target component may be a tail lamp of the first vehicle. When the image capture device of the second vehicle captures that the brightness of the first vehicle's tail lamp is consistent with the brightness of the standard picture in the above embodiment (or consistent within an allowable error range), it may be determined that the tail lamp is off or only the position lamp is on, in which case the driving state of the first vehicle is normal driving. The tail lamp of the first vehicle lights up to different degrees when the first vehicle needs to brake and decelerate or decelerates to turn; for example, when the first vehicle turns left, the yellow indicator lamp on the left side of the tail lamp flashes at a fixed frequency, and when the first vehicle takes a braking measure, the yellow indicator lamps on both sides of the tail lamp light up simultaneously. When it is determined that a yellow tail-lamp indicator of the first vehicle is on, i.e., that the tail-lamp brightness differs from that of the standard picture, it is further analyzed whether the specific driving state of the preceding vehicle is braking or steering.
It should be noted that, because the tail-lamp brightness of different vehicles often differs, in this embodiment a real-time comparison and a standard-picture comparison are usually performed at the same time in order to accurately determine the driving state of the first vehicle. The real-time comparison means that the brightness of the preceding vehicle's tail lights is acquired in real time and stored locally; when the brightness of the preceding vehicle's tail lights becomes lower, the locally stored minimum tail-light brightness value is updated; and when the tail-light brightness acquired by the image acquisition device exceeds the locally stored minimum brightness value by more than 1.5 times, the driving state of the preceding vehicle is determined to be steering or braking.
It should be noted that the target component may also be the rear bumper of the preceding vehicle. The distance to the preceding vehicle's bumper is acquired by a radar acquisition device of the second vehicle (the vehicle to be controlled); when this distance shortens, the driving state of the preceding vehicle is determined to be braking or steering, and the working state of the bumper may be its distance parameter relative to the second vehicle.
Optionally, the target position relationship includes: the first vehicle is located at a target position of the second vehicle; and/or the first vehicle is within a target distance range of the second vehicle.
Specifically, the first vehicle being located at a target position of the second vehicle means that the first vehicle is located in the traveling direction of the second vehicle. For example, if the second vehicle travels north, the first vehicle is on the north side of the second vehicle, i.e., within a certain distance ahead of the second vehicle.
In addition, the first vehicle being within the target distance range of the second vehicle means that the first vehicle may be at any position around the second vehicle, front, rear, left or right, as long as it stays within a certain distance of the second vehicle. For example, a first vehicle 50 meters to the left of the second vehicle is within the target distance range of the second vehicle, where the target distance may be set according to actual conditions and is not specifically limited herein.
Step S204, a first running state of the first vehicle is obtained through the identification image information, and a control command is obtained based on the first running state.
Optionally, identifying the image information to obtain the first driving state of the first vehicle includes: matching the image information with target image information; and determining the running state indicated by the target image information as the first running state when the image information is successfully matched with the target image information.
Specifically, the target image may be the standard picture mentioned in the embodiment of the present invention, that is, image information of the first vehicle when the first vehicle is traveling normally. In this embodiment, the image information acquired in real time is matched against the target image; when the image information is consistent with or similar to the target image information, it can be determined that the driving state of the first vehicle is consistent with the driving state indicated by the target image information, i.e., the first vehicle is driving normally.
When the image information is matched with the target image, the parameters may be completely matched or matched within a certain error. For example, when the brightness of the front vehicle tail light is a, the local vehicle compares a with the locally stored target image information b, and when the absolute value of (a-b) is smaller than a certain threshold, it may be determined that a is similar to or consistent with b, and thus, it may be determined that the driving state of the front vehicle is normal driving.
Optionally, the method further comprises: determining that the image information and the target image information are successfully matched in a case where the similarity between the image information and the target image information is greater than a target threshold.
Specifically, the target image may be the standard picture mentioned in the embodiment of the present invention, that is, image information of the first vehicle when the first vehicle is traveling normally. In this embodiment, the image information acquired in real time is matched against the target image; when the difference between the image information and the target image information exceeds the target threshold, it can be determined that the running state of the first vehicle is inconsistent with the running state indicated by the target image information, i.e., the first vehicle is braking or steering. Here the target threshold may be a preset degree-of-difference value; when the image information parameter of the first vehicle exceeds this value, it can be determined that the image information of the first vehicle does not match the target image information.
For example, when the brightness of the front vehicle tail light is a, the local vehicle compares a with the locally stored target image information b, and when the absolute value of (a-b) is greater than a certain threshold, it may be determined that a is inconsistent with b, and thus it is determined that the driving state of the front vehicle is braking or steering.
In step S206, a control instruction is sent to the second vehicle, so that the second vehicle enters the second running state.
Optionally, after the second vehicle enters the second driving state, the method further comprises: and outputting prompt information, wherein the prompt information is used for indicating that the first vehicle and the second vehicle are in a safe driving state.
Specifically, the first driving state is the driving state of the first vehicle, which may be normal driving, braking or steering. The second vehicle is controlled according to the driving state of the first vehicle obtained in step S204, so that it takes corresponding measures with respect to the driving state of the first vehicle and thereby avoids danger. The running control operation corresponding to the first running state refers to the change that should be made to the running state of the second vehicle according to the first running state, and this control operation is then executed.
For example, if the first driving state of the first vehicle is braking, that is, the first vehicle is in the process of decelerating, the second vehicle performs the own-vehicle control operation according to the information that the first vehicle is braking, that is, the second vehicle is controlled to perform braking as well, so as to avoid an accident.
Specifically, the safe driving state means that the first vehicle and the second vehicle do not have a traffic accident and keep a safe distance; this is the result of the second vehicle entering the second driving state, i.e., driving safety is ensured.
Fig. 3 is a block diagram of a vehicle according to an embodiment of the present invention, and as shown in fig. 3, the vehicle may include: a display 30, a controller 32, and a transmitting device 34.
A display 30 for displaying image information of a first vehicle, wherein the first vehicle has a target positional relationship with a second vehicle to be controlled.
Specifically, the image information of the first vehicle may be an image of the driving state of a surrounding vehicle, or may be an image of a surrounding vehicle's lights, and the image of the lights can reveal the driving state of that vehicle. For example, if the yellow lamps on both sides of the tail light of the preceding vehicle are detected to be on, it may be determined that the preceding vehicle is braking or decelerating. The first vehicle may be any vehicle around the second vehicle that has a positional relationship with it. For example, if the second vehicle is vehicle A, the vehicle to be controlled in the embodiment of the present invention, and the first vehicle is vehicle B located directly in front of vehicle A, then vehicle A and vehicle B have a front-and-rear positional relationship.
The target positional relationship may be a positional relationship of the vehicle in front and rear directions, or may be a positional relationship of an arbitrary position around the vehicle with control.
It should be noted that the image information of the first vehicle is acquired by a high-definition camera, which focuses according to the position of the first vehicle and captures and stores the driving state of the first vehicle in the form of an image. In addition, when the vehicle travels at night the light is dim, so the high-definition camera needs to integrate an infrared imaging function so that the image information of the first vehicle can still be accurately acquired under low-light conditions.
Optionally, the acquiring the image information of the first vehicle comprises: image information of a target component of the first vehicle is acquired, wherein the operating state of the target component is used for indicating the type of the running state of the first vehicle.
Specifically, the image of the first vehicle is acquired by the image acquisition device of the vehicle to be controlled, and the image may be acquired according to the working state of a certain component of the first vehicle, where this component may be the vehicle's tail lamp; that is, the driving state of the vehicle, such as whether it is braking or steering, is determined from the working state of the tail lamp.
It should be noted that, when the vehicle component to be examined is the tail lamp, a comparator function needs to be integrated on the processor of the controlled vehicle, and a standard picture needs to be set for comparison with the acquired tail-lamp picture, where the standard picture records the brightness of the preceding vehicle's tail lamp when that vehicle is neither braking nor steering. Therefore, when the acquired image shows that the tail-lamp brightness is greater than that of the standard picture, the controlled vehicle can determine that the driving state of the preceding vehicle has changed to braking or steering.
For example, when the image capturing device of vehicle A detects that the brightness of the tail lamps of vehicle B has changed, which indicates that vehicle B is braking or steering, vehicle A may further analyze and process the tail-lamp condition of vehicle B captured by its image capturing device.
Optionally, identifying the image information to obtain the first driving state of the first vehicle includes: identifying the image information of the target component to determine the working state of the target component; and determining a first driving state of the type corresponding to the working state of the target component.
Specifically, the driving state of the first vehicle may be braking, steering or normal driving, and the target component may be a tail lamp of the first vehicle. When the image capture device of the second vehicle captures that the brightness of the first vehicle's tail lamp is consistent with the brightness of the standard picture in the above embodiment (or consistent within an allowable error range), it may be determined that the tail lamp is off or only the position lamp is on, in which case the driving state of the first vehicle is normal driving. The tail lamp of the first vehicle lights up to different degrees when the first vehicle needs to brake and decelerate or decelerates to turn; for example, when the first vehicle turns left, the yellow indicator lamp on the left side of the tail lamp flashes at a fixed frequency, and when the first vehicle takes a braking measure, the yellow indicator lamps on both sides of the tail lamp light up simultaneously. When it is determined that a yellow tail-lamp indicator of the first vehicle is on, i.e., that the tail-lamp brightness differs from that of the standard picture, it is further analyzed whether the specific driving state of the preceding vehicle is braking or steering.
It should be noted that, because the tail-lamp brightness of different vehicles often differs, in this embodiment a real-time comparison and a standard-picture comparison are usually performed at the same time in order to accurately determine the driving state of the first vehicle. The real-time comparison means that the brightness of the preceding vehicle's tail lights is acquired in real time and stored locally; when the brightness of the preceding vehicle's tail lights becomes lower, the locally stored minimum tail-light brightness value is updated; and when the tail-light brightness acquired by the image acquisition device exceeds the locally stored minimum brightness value by more than 1.5 times, the driving state of the preceding vehicle is determined to be steering or braking.
It should be noted that the target component may also be the rear bumper of the preceding vehicle. The distance to the preceding vehicle's bumper is acquired by a radar acquisition device of the second vehicle (the vehicle to be controlled); when this distance shortens, the driving state of the preceding vehicle is determined to be braking or steering, and the working state of the bumper may be its distance parameter relative to the second vehicle.
Optionally, the target position relationship includes: the first vehicle is located at a target position of the second vehicle; and/or the first vehicle is within a target distance range of the second vehicle.
Specifically, the first vehicle being located at a target position of the second vehicle means that the first vehicle is located in the traveling direction of the second vehicle. For example, if the second vehicle travels north, the first vehicle is on the north side of the second vehicle, i.e., within a certain distance ahead of the second vehicle.
In addition, the first vehicle being within the target distance range of the second vehicle means that the first vehicle may be at any position around the second vehicle, front, rear, left or right, as long as it stays within a certain distance of the second vehicle. For example, a first vehicle 50 meters to the left of the second vehicle is within the target distance range of the second vehicle, where the target distance may be set according to actual conditions and is not specifically limited herein.
And a controller 32 for obtaining a first traveling state of the first vehicle by recognizing the image information, and obtaining a control instruction based on the first traveling state.
Optionally, identifying the image information to obtain the first driving state of the first vehicle includes: matching the image information with target image information; and determining the running state indicated by the target image information as the first running state when the image information is successfully matched with the target image information.
Specifically, the target image may be the standard picture mentioned in the embodiment of the present invention, that is, image information of the first vehicle when the first vehicle is traveling normally. In this embodiment, the image information acquired in real time is matched against the target image; when the image information is consistent with or similar to the target image information, it can be determined that the driving state of the first vehicle is consistent with the driving state indicated by the target image information, i.e., the first vehicle is driving normally.
When the image information is matched with the target image, the parameters may be completely matched or matched within a certain error. For example, when the brightness of the front vehicle tail light is a, the local vehicle compares a with the locally stored target image information b, and when the absolute value of (a-b) is smaller than a certain threshold, it may be determined that a is similar to or consistent with b, and thus, it may be determined that the driving state of the front vehicle is normal driving.
Optionally, the method further comprises: determining that the image information and the target image information are successfully matched in a case where the similarity between the image information and the target image information is greater than a target threshold.
Specifically, the target image may be the standard picture mentioned in the embodiment of the present invention, that is, image information of the first vehicle when the first vehicle is traveling normally. In this embodiment, the image information acquired in real time is matched against the target image; when the difference between the image information and the target image information exceeds the target threshold, it can be determined that the running state of the first vehicle is inconsistent with the running state indicated by the target image information, i.e., the first vehicle is braking or steering. Here the target threshold may be a preset degree-of-difference value; when the image information parameter of the first vehicle exceeds this value, it can be determined that the image information of the first vehicle does not match the target image information.
For example, when the brightness of the front vehicle tail light is a, the local vehicle compares a with the locally stored target image information b, and when the absolute value of (a-b) is greater than a certain threshold, it may be determined that a is inconsistent with b, and thus it is determined that the driving state of the front vehicle is braking or steering.
And a sending device 34 for sending a control instruction to the second vehicle so that the second vehicle enters the second running state.
Optionally, the vehicle further comprises an output device for outputting prompt information after the second vehicle enters the second running state, wherein the prompt information is used for indicating that the first vehicle and the second vehicle are in a safe running state.
Specifically, the first driving state is the driving state of the first vehicle, which may be normal driving, braking, or steering. The second vehicle is controlled according to the driving state of the first vehicle obtained in step S204, so that the second vehicle takes corresponding measures with respect to the driving state of the first vehicle, thereby avoiding danger. The running control operation corresponding to the first running state refers to the change that should be made to the running state of the second vehicle according to the first running state; the second vehicle is then controlled to perform this operation.
For example, if the first driving state of the first vehicle is braking, that is, the first vehicle is in the process of decelerating, the second vehicle performs its own control operation according to the information that the first vehicle is braking; that is, the second vehicle is controlled to brake as well, so as to avoid an accident.
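A minimal sketch of selecting the travel control operation corresponding to the first running state might look as follows; the state names and command strings are hypothetical placeholders for the actual vehicle control interface.

```python
# Hypothetical mapping from the first running state to the control operation
# that the second vehicle should perform.
CONTROL_OPERATIONS = {
    "normal": "keep_current_speed",
    "braking": "brake",
    "steering": "adjust_heading",
}

def control_second_vehicle(first_running_state: str) -> str:
    # Determine the running control operation corresponding to the first
    # running state, then control the second vehicle accordingly.
    operation = CONTROL_OPERATIONS.get(first_running_state, "keep_current_speed")
    print(f"second vehicle enters second running state via: {operation}")
    return operation

control_second_vehicle("braking")  # the second vehicle brakes as well
```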
Specifically, the safe driving state means that no traffic accident occurs between the first vehicle and the second vehicle and that a safe distance is maintained between them. This is also the result of the second vehicle entering the second driving state, namely that driving safety is ensured.
Fig. 4 is a block diagram of a control system for vehicle running according to an embodiment of the present invention, which may include, as shown in fig. 4: a first controller 40 and a second controller 42.
And a first controller 40 installed on the first vehicle, for capturing and recognizing image information of the first vehicle, obtaining a first traveling state of the first vehicle, and obtaining a control instruction based on the first traveling state.
Obtaining the image information of the first vehicle and obtaining the control instruction have been described in the above embodiments and are not described herein again.
And a second controller 42 mounted on the second vehicle for receiving the control instruction and controlling the second vehicle to enter a second running state based on the control instruction.
According to the above embodiment, the first vehicle and the second vehicle have the target positional relationship.
Optionally, the control system further comprises: the first output device is arranged on the first vehicle and used for outputting first prompt information, wherein the first prompt information is used for indicating that the first vehicle is in a safe driving state; and the second output device is arranged on the second vehicle and used for outputting second prompt information, wherein the second prompt information is used for indicating that the second vehicle is in a safe driving state.
Receiving the control instruction and controlling the second vehicle to enter the second running state based on the control instruction have been described in the above embodiments and are not described herein again.
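For illustration, the following is a minimal sketch of the Fig. 4 arrangement, in which a first controller produces a control instruction from the recognized running state and a second controller receives it and enters the second running state; the in-process function call merely stands in for the vehicle-to-vehicle transmission, whose form this embodiment does not prescribe.

```python
class FirstController:
    """Installed on the first vehicle: captures and recognizes image
    information, then forms a control instruction."""

    def capture_and_recognize(self) -> str:
        # Placeholder for capturing and recognizing the image information.
        return "braking"

    def make_control_instruction(self, first_running_state: str) -> dict:
        return {"command": "brake"} if first_running_state == "braking" else {"command": "keep"}

class SecondController:
    """Installed on the second vehicle: receives the control instruction and
    enters the second running state."""

    def receive(self, instruction: dict) -> str:
        second_running_state = instruction["command"]
        print(f"second vehicle enters second running state: {second_running_state}")
        return second_running_state

first = FirstController()
second = SecondController()
second.receive(first.make_control_instruction(first.capture_and_recognize()))
```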
Fig. 5 is a block diagram of a control apparatus for vehicle running according to an embodiment of the present invention, which may include, as shown in fig. 5: a first acquisition unit 50, a recognition unit 52 and a control unit 54.
A first acquisition unit 50 for acquiring image information of a first vehicle having a target positional relationship with a second vehicle to be controlled.
The identifying unit 52 is configured to identify the image information to obtain a first driving state of the first vehicle.
A control unit 54 for controlling the second vehicle to enter the second running state based on the first running state.
The relevant steps have been specifically described in the above embodiments of the present invention and are not described herein again.
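A minimal sketch of how the units of Fig. 5 might be composed is given below; the callables are placeholders for the actual image acquisition, recognition, and vehicle control logic.

```python
class DrivingControlApparatus:
    """Chains the first acquisition unit, the recognition unit and the
    control unit of the vehicle driving control apparatus."""

    def __init__(self, acquire, recognize, control):
        self.acquire = acquire        # first acquisition unit 50
        self.recognize = recognize    # recognition unit 52
        self.control = control        # control unit 54

    def run_once(self):
        image_info = self.acquire()
        first_running_state = self.recognize(image_info)
        return self.control(first_running_state)

apparatus = DrivingControlApparatus(
    acquire=lambda: "image_of_first_vehicle",
    recognize=lambda img: "braking",
    control=lambda state: f"second vehicle brakes (in response to {state})",
)
print(apparatus.run_once())
```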
Fig. 6 is a block diagram illustrating a structure of an alternative vehicle driving control apparatus according to an embodiment of the present invention, which may include, as shown in fig. 6: a display unit 60, a second acquisition unit 62 and a transmission unit 64.
The display unit 60 is configured to display image information of a first vehicle having a target positional relationship with a second vehicle to be controlled.
And a second obtaining unit 62 for obtaining a first traveling state of the first vehicle by recognizing the image information, and obtaining a control instruction based on the first traveling state.
A sending unit 64 for sending a control instruction to the second vehicle so that the second vehicle enters the second running state.
The relevant steps have been specifically described in the above embodiments of the present invention and are not described herein again.
According to another aspect of the present invention, there is provided a storage medium, wherein the storage medium includes a stored program, and when the program is executed, the apparatus in which the storage medium is located is controlled to perform a method. For example, the following steps may be performed: acquiring image information of a first vehicle, wherein the first vehicle and a second vehicle to be controlled have a target position relationship; identifying the image information to obtain a first running state of the first vehicle; and controlling the second vehicle to enter a second running state based on the first running state.
According to another aspect of the present invention, there is provided a processor, wherein the processor is configured to run a program, and the program, when running, performs the above method. For example, the following steps may be performed: acquiring image information of a first vehicle, wherein the first vehicle and a second vehicle to be controlled have a target position relationship; identifying the image information to obtain a first running state of the first vehicle; and controlling the second vehicle to enter a second running state based on the first running state.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and they may alternatively be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, or fabricated separately as individual integrated circuit modules, or fabricated as a single integrated circuit module from multiple modules or steps. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (18)
1. A control method of vehicle travel, characterized by comprising:
acquiring image information of a first vehicle, wherein the first vehicle and a second vehicle to be controlled have a target position relationship;
identifying the image information to obtain a first running state of the first vehicle;
controlling the second vehicle to enter a second travel state based on the first travel state.
2. The method of claim 1, wherein controlling the second vehicle to enter a second travel state based on the first travel state comprises:
determining a travel control operation corresponding to the first travel state;
controlling the second vehicle to enter the second running state in response to the running control operation.
3. The method according to claim 2, characterized in that the first vehicle and the second vehicle are in a safe driving state after controlling the second vehicle to enter the second driving state.
4. The method of claim 1, wherein identifying the image information resulting in the first travel state of the first vehicle comprises:
matching the image information with target image information;
and determining the driving state indicated by the target image information as the first driving state when the image information is successfully matched with the target image information.
5. The method of claim 4, further comprising:
and determining that the image information and the target image information are successfully matched under the condition that the similarity between the image information and the target image information is greater than a target threshold value.
6. The method of any one of claims 1 to 5, wherein obtaining image information of the first vehicle comprises:
acquiring image information of a target component of the first vehicle, wherein the working state of the target component is used for indicating the type of the running state of the first vehicle.
7. The method of claim 6, wherein identifying the image information resulting in the first travel state of the first vehicle comprises:
identifying the image information of the target component and determining the working state of the target component;
determining the first driving state of a type corresponding to an operating state of the target component.
8. The method according to any one of claims 1 to 5, wherein the target position relationship comprises:
the first vehicle is located at a target position of the second vehicle; and/or
The first vehicle is within a target distance range of the second vehicle.
9. A control method of vehicle travel, characterized by comprising:
displaying image information of a first vehicle, wherein the first vehicle and a second vehicle to be controlled have a target position relationship;
obtaining a first running state of the first vehicle by identifying the image information, and obtaining a control instruction based on the first running state;
sending the control instruction to the second vehicle so that the second vehicle enters a second driving state.
10. The method of claim 9, wherein after the second vehicle enters a second driving state, the method further comprises:
and outputting prompt information, wherein the prompt information is used for indicating that the first vehicle and the second vehicle are in a safe driving state.
11. A vehicle, characterized by comprising:
a display for displaying image information of a first vehicle, wherein the first vehicle has a target positional relationship with a second vehicle to be controlled;
the controller is used for obtaining a first running state of the first vehicle by identifying the image information and obtaining a control instruction based on the first running state;
and the sending device is used for sending the control instruction to the second vehicle so that the second vehicle enters a second running state.
12. The vehicle of claim 11, further comprising:
and the output device is used for outputting prompt information after the second vehicle enters a second running state, wherein the prompt information is used for indicating that the first vehicle and the second vehicle are in a safe running state.
13. A control system for vehicle travel, characterized by comprising:
the first controller is installed on a first vehicle and used for capturing and recognizing image information of the first vehicle, obtaining a first running state of the first vehicle and obtaining a control instruction based on the first running state;
the second controller is installed on a second vehicle and used for receiving the control instruction and controlling the second vehicle to enter a second running state based on the control instruction;
wherein the first vehicle and the second vehicle have a target positional relationship.
14. The control system of claim 13, comprising:
the first output device is mounted on the first vehicle and used for outputting first prompt information, wherein the first prompt information is used for indicating that the first vehicle is in a safe driving state;
and the second output device is arranged on the second vehicle and used for outputting second prompt information, wherein the second prompt information is used for indicating that the second vehicle is in a safe driving state.
15. A control device for vehicle travel, characterized by comprising:
a first acquisition unit configured to acquire image information of a first vehicle, wherein the first vehicle has a target positional relationship with a second vehicle to be controlled;
the identification unit is used for identifying the image information to obtain a first running state of the first vehicle;
a control unit configured to control the second vehicle to enter a second travel state based on the first travel state.
16. A control device for vehicle travel, characterized by comprising:
a display unit configured to display image information of a first vehicle, wherein the first vehicle has a target position relationship with a second vehicle to be controlled;
the second acquisition unit is used for acquiring a first running state of the first vehicle by identifying the image information and acquiring a control instruction based on the first running state;
a sending unit, configured to send the control instruction to the second vehicle, so that the second vehicle enters a second driving state.
17. A storage medium, characterized in that the storage medium comprises a stored program, wherein the program, when executed, controls an apparatus in which the storage medium is located to perform the method of any one of claims 1 to 10.
18. A processor, characterized in that the processor is configured to run a program, wherein the program when running performs the method of any of claims 1 to 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910960468.8A CN110673609A (en) | 2019-10-10 | 2019-10-10 | Vehicle running control method, device and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910960468.8A CN110673609A (en) | 2019-10-10 | 2019-10-10 | Vehicle running control method, device and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110673609A true CN110673609A (en) | 2020-01-10 |
Family
ID=69081339
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910960468.8A Pending CN110673609A (en) | 2019-10-10 | 2019-10-10 | Vehicle running control method, device and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110673609A (en) |
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09322052A (en) * | 1996-05-24 | 1997-12-12 | Nippon Hoso Kyokai <Nhk> | Automatic photographing camera system |
JP2004221871A (en) * | 2003-01-14 | 2004-08-05 | Auto Network Gijutsu Kenkyusho:Kk | Device for monitoring periphery of vehicle |
CN101122799A (en) * | 2006-08-10 | 2008-02-13 | 比亚迪股份有限公司 | Automobile tail-catching prealarming device and method |
CN102910168A (en) * | 2011-08-01 | 2013-02-06 | 株式会社日立制作所 | Image processing device |
JP2014044691A (en) * | 2012-08-29 | 2014-03-13 | Toyoda Gosei Co Ltd | Drive recorder system |
US20150073705A1 (en) * | 2013-09-09 | 2015-03-12 | Fuji Jukogyo Kabushiki Kaisha | Vehicle environment recognition apparatus |
CN103909930A (en) * | 2014-04-02 | 2014-07-09 | 全蕊 | Method for auxiliary control of traveling along with vehicle ahead |
WO2016060384A1 (en) * | 2014-10-17 | 2016-04-21 | 전자부품연구원 | Method and device for providing panoramic vehicle situation information using multiple cameras and radar sensor information |
KR20160117971A (en) * | 2015-04-01 | 2016-10-11 | 주식회사 크리에이티브넷 | System, server for tracking a car by determining an emergency situation and method for providing the system |
CN106183981A (en) * | 2016-07-19 | 2016-12-07 | 乐视控股(北京)有限公司 | Obstacle detection method based on automobile, device and automobile |
CN106184202A (en) * | 2016-07-26 | 2016-12-07 | 浙江吉利控股集团有限公司 | A kind of automatic emergency steering for vehicle and control method thereof |
CN108021856A (en) * | 2016-10-31 | 2018-05-11 | 比亚迪股份有限公司 | Light for vehicle recognition methods, device and vehicle |
CN109664889A (en) * | 2017-10-12 | 2019-04-23 | 北京旷视科技有限公司 | A kind of control method for vehicle, device and system and storage medium |
CN108492602A (en) * | 2018-03-28 | 2018-09-04 | 浙江鼎奕科技发展有限公司 | Vehicular automatic driving method and device |
KR102012318B1 (en) * | 2018-04-30 | 2019-08-20 | 재단법인 경북아이티융합 산업기술원 | Apparatus for welding quality total inspection using image sensor and method thereof |
CN109733285A (en) * | 2019-02-27 | 2019-05-10 | 百度在线网络技术(北京)有限公司 | Vehicle running state display methods, equipment and system |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111221341A (en) * | 2020-02-16 | 2020-06-02 | 翟桂芳 | Safe driving control method for automatic driving vehicle and vehicle-mounted controller |
CN111338344A (en) * | 2020-02-28 | 2020-06-26 | 北京小马慧行科技有限公司 | Vehicle control method and device and vehicle |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9970615B1 (en) | Light-based vehicle-device communications | |
US11180164B2 (en) | Vehicle control apparatus, vehicle, and control method | |
US10336254B2 (en) | Camera assisted vehicle lamp diagnosis via vehicle-to-vehicle communication | |
CN110351491B (en) | Light supplementing method, device and system in low-light environment | |
JP2002083297A (en) | Object recognition method and object recognition device | |
CN106816036A (en) | The method for early warning and system of vehicle collision risk | |
KR101738995B1 (en) | Imaging system and method with ego motion detection | |
CN107199943B (en) | Intelligent lamp assembly and intelligent lighting system for motor vehicle | |
KR101848451B1 (en) | Vehicle imaging system and method for distinguishing between vehicle tail lights and flashing red stop lights | |
CN114523973B (en) | Driving assistance device | |
CN110673609A (en) | Vehicle running control method, device and system | |
CN111731318B (en) | Vehicle control device, vehicle control method, vehicle, and storage medium | |
US20240251060A1 (en) | Display system | |
CN113492846A (en) | Control device, control method, and computer-readable storage medium storing program | |
CN116872957A (en) | Early warning method and device for intelligent driving vehicle, electronic equipment and storage medium | |
US11052822B2 (en) | Vehicle control apparatus, control method, and storage medium for storing program | |
CN111277956A (en) | Method and device for collecting vehicle blind area information | |
CN112740220A (en) | System and method for traffic light identification | |
KR20200071405A (en) | Brake light inoperative detection system | |
CN109991978B (en) | Intelligent automatic driving method and device based on network | |
CN114715025A (en) | Control method and control device of high beam and electronic equipment | |
CN114511834A (en) | Method and device for determining prompt information, electronic equipment and storage medium | |
CN111216631B (en) | Travel control device, control method, and storage medium storing program | |
CN118144825B (en) | Automatic driving level switching method and system, readable storage medium and vehicle | |
CN115641569B (en) | Driving scene processing method, device, equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20200110 |