
CN111489564A - Driving method, device and system of unmanned vehicle - Google Patents


Info

Publication number
CN111489564A
CN111489564A (application CN202010325551.0A)
Authority
CN
China
Prior art keywords
information
vehicle
blind area
perception
road condition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010325551.0A
Other languages
Chinese (zh)
Inventor
王永聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neolix Technologies Co Ltd
Original Assignee
Neolix Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neolix Technologies Co Ltd filed Critical Neolix Technologies Co Ltd
Priority to CN202010325551.0A priority Critical patent/CN111489564A/en
Publication of CN111489564A publication Critical patent/CN111489564A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/01 - Detecting movement of traffic to be counted or controlled
    • G08G 1/048 - Detecting movement of traffic to be counted or controlled with provision for compensation of environmental or other condition, e.g. snow, vehicle stopped at detector
    • G08G 1/09 - Arrangements for giving variable traffic instructions
    • G08G 1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G 1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G 1/096708 - Systems involving transmission of highway information where the received information might be used to generate an automatic action on the vehicle control
    • G08G 1/096725 - Systems involving transmission of highway information where the received information generates an automatic action on the vehicle control

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a driving method, device and system for an unmanned vehicle, relating to the technical fields of unmanned vehicles and automatic driving. The driving method comprises the following steps: monitoring the surrounding environment of the unmanned vehicle to obtain monitoring information; judging, according to the monitoring information, whether a perception blind area exists in the surrounding environment; if a perception blind area exists, sending request information to the server side to request the server side to provide road condition information in the perception blind area; and controlling the unmanned vehicle to run according to the unmanned vehicle's own automatic driving perception result and the road condition information provided by the server side. The invention solves the technical problem that an unmanned vehicle drives dangerously when its automatic driving perception system has a perception blind area.

Description

Driving method, device and system of unmanned vehicle
Technical Field
The invention relates to the technical field of intelligent transportation, in particular to a driving method, a driving device and a driving system of an unmanned vehicle.
Background
An unmanned vehicle, also known as a driverless or autonomous vehicle, is a vehicle that requires no human driver and instead operates with the assistance of an automatic driving perception system. The perception system is equivalent to human eyes: it enables the unmanned vehicle to automatically avoid surrounding pedestrians, vehicles and obstacles of all kinds.
At present, automatic driving perception systems generally adopt a multi-sensor fusion method: the data collected by sensors such as lidar, cameras and millimeter-wave radar are fused so that the system can perceive nearby pedestrians, vehicles and obstacles in all directions. In some extreme cases, however, the system cannot obtain a complete perception result for the surroundings of the unmanned vehicle; any area around the unmanned vehicle that the perception system cannot sense is a perception blind area of the vehicle.
When such a blind area occurs in the automatic driving perception system, pedestrians, vehicles and obstacles inside the blind area cannot be avoided as the vehicle advances, making driving very dangerous.
Disclosure of Invention
In view of this, embodiments of the present invention provide an automatic driving method, apparatus, and system to solve the technical problem of driving danger of an unmanned vehicle when an automatic driving sensing system has a sensing blind area.
According to a first aspect of the present invention, there is provided a driving method of an unmanned vehicle, which is applied to a vehicle side, including:
monitoring the surrounding environment of the unmanned vehicle to obtain monitoring information;
judging whether a perception blind area exists in the surrounding environment according to the monitoring information;
under the condition that the sensing blind area exists in the surrounding environment, request information is sent to a server side to request the server side to provide road condition information in the sensing blind area;
and controlling the unmanned vehicle to run according to the automatic driving perception result of the unmanned vehicle and the road condition information provided by the server side.
Optionally, the monitoring information includes at least one of: road surface undulation information around the unmanned vehicle; spatial position information of objects carried on the road surface around the unmanned vehicle; and the interval duration between the transmission time and the reflected reception time of a signal that monitors the surrounding environment;
judging whether the surrounding environment has a perception blind area according to the monitoring information comprises: determining the perception range of the unmanned vehicle according to the road surface undulation information and/or the spatial position information, and judging whether a perception blind area exists in the surrounding environment according to the determination result; or comparing the interval duration with a preset duration, and judging whether a perception blind area exists in the surrounding environment according to the comparison result.
Optionally, sending request information to the server, including:
determining a perception blind area according to the monitoring information, and sending the perception blind area to the server end so that the server end can acquire road condition information in the perception blind area;
or sending the monitoring information to the server side so that the server side can determine a perception blind area according to the monitoring information and acquire road condition information in the perception blind area.
Optionally, sending request information to the server, including:
sequentially sending request information for acquiring target data to a server end to enable the server end to return unreturned target data until the target data returned by the server end provides all road condition information in a perception blind area; or,
sending request trigger information to a server so that the server sequentially returns unreturned target data; and sending request termination information to the server side to enable the server side to terminate the sending of the target data under the condition that the target data returned by the server side provides all road condition information in the perception dead zone; or,
sending request information for sensing road condition information in the blind area to a server end so that the server end determines all road condition information in the blind area from a plurality of target data and returns the determined all road condition information;
the target data being the automatic driving perception result of a target vehicle end, where a target vehicle end is a vehicle end whose carrier vehicle is located in the perception blind area.
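The three request modes above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the mode names, the `FakeServer` stand-in, and the message shapes are all assumptions made for the example.

```python
from enum import Enum

class RequestMode(Enum):
    POLL_EACH = 1       # one request per target datum until coverage is complete
    TRIGGER_STREAM = 2  # single trigger; server streams until termination is sent
    AGGREGATE = 3       # server aggregates all road condition info itself

class FakeServer:
    """Stand-in for the server end; holds the perception results ("target
    data") of vehicle ends whose vehicles are inside the blind area."""
    def __init__(self, target_data):
        self._pending = list(target_data)
        self.terminated = False

    def next_target_data(self):
        # return one still-unreturned target datum, or None when exhausted
        return self._pending.pop(0) if self._pending else None

    def all_road_conditions(self):
        data, self._pending = self._pending, []
        return data

def request_blind_area_info(server, mode):
    if mode is RequestMode.POLL_EACH:
        # send one request per datum until all road condition info is in hand
        info = []
        while (datum := server.next_target_data()) is not None:
            info.append(datum)
        return info
    if mode is RequestMode.TRIGGER_STREAM:
        # one trigger message, then consume the stream and terminate it
        info = []
        while (datum := server.next_target_data()) is not None:
            info.append(datum)
        server.terminated = True  # stands in for the request-termination message
        return info
    return server.all_road_conditions()  # AGGREGATE: server merges everything
```

In all three modes the vehicle end ends up with the same information; they differ only in how many round trips occur and which side decides when coverage of the blind area is complete.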
According to a second aspect of the present invention, there is provided a driving method of an unmanned vehicle, the driving method being applied to a server side, comprising:
receiving request information sent by a vehicle end, wherein the request information is used to request road condition information in a perception blind area;
determining a perception blind area according to the request information;
and acquiring road condition information in the perception blind area and sending the acquired road condition information to the vehicle end.
Optionally, the obtaining the road condition information in the blind sensing area and sending the obtained road condition information in the blind sensing area to the vehicle end includes:
determining a target vehicle end from a plurality of vehicle ends connected with the server end, wherein the target vehicle end is a vehicle end whose carrier vehicle is located in the perception blind area;
sending access information to the target vehicle end so that the target vehicle end can return target data, wherein the target data are the automatic driving perception result of the target vehicle end;
and receiving the target data and sending the received target data to the vehicle end.
Optionally, sending the received target data to the vehicle end includes:
under the condition that the request information is to acquire target data, sending unreturned target data to the vehicle end;
under the condition that the request information is request trigger information, sequentially returning unreturned target data to the vehicle end until request termination information sent by the vehicle end is received;
and under the condition that the request information is the request for sensing the road condition information in the blind area, determining all road condition information in the blind area from a plurality of target data and returning the determined all road condition information.
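The server-side dispatch on the three request types can be sketched as below; the request kinds and field names are hypothetical, chosen only to mirror the three optional branches above.

```python
def handle_vehicle_request(request, target_data):
    """Dispatch on the request type. `target_data` is the list of automatic
    driving perception results returned by target vehicle ends located
    inside the blind area; the dict shapes are illustrative assumptions."""
    kind = request["kind"]
    if kind == "get_target_data":
        # branch 1: return the next unreturned target datum
        return target_data[request["next_index"]]
    if kind == "trigger":
        # branch 2: stream every still-unreturned datum until termination
        return target_data[request["next_index"]:]
    if kind == "all_road_conditions":
        # branch 3: determine all road condition info in the blind area
        merged = []
        for datum in target_data:
            merged.extend(datum["road_conditions"])
        return merged
    raise ValueError(f"unknown request kind: {kind}")
```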
According to a third aspect of the present invention, there is provided a driving apparatus of an unmanned vehicle, the apparatus applying the driving method of the first aspect, the apparatus comprising:
the monitoring module is used for monitoring the surrounding environment of the unmanned vehicle to obtain monitoring information;
the judging module is used for judging whether the surrounding environment has a perception blind area or not according to the monitoring information;
the request module is used for sending request information to the server side to request the server side to provide road condition information in the perception blind area under the condition that the surrounding environment is judged to have the perception blind area;
and the control module is used for controlling the unmanned vehicle to run according to the automatic driving perception result of the unmanned vehicle and the road condition information provided by the server side.
According to a fourth aspect of the present invention, there is provided a driving apparatus of an unmanned vehicle, the apparatus applying the driving method of the second aspect, the apparatus comprising:
the receiving module is used for receiving request information sent by a vehicle end, wherein the request information is used to request road condition information in a perception blind area;
the determining module is used for determining a perception blind area according to the request information;
and the sending module is used for acquiring the road condition information in the perception blind area and sending the acquired road condition information to the vehicle end.
According to a fifth aspect of the present invention, there is provided a driving system of an unmanned vehicle, comprising: a vehicle-side and a server-side communicatively coupled, wherein,
the vehicle end is used for executing the driving method of the first aspect;
the server is used for executing the driving method of the second aspect.
The embodiment of the invention has the following advantages or beneficial effects:
In the embodiment of the invention, when it is judged according to the monitoring information that a perception blind area exists in the surrounding environment of the unmanned vehicle, the server side provides the vehicle side with the road condition information in the blind area, thereby eliminating the perception blind area and giving a more complete picture of the road conditions around the unmanned vehicle. The vehicle side can therefore control the unmanned vehicle to run safely, which solves the technical problem that the unmanned vehicle drives dangerously when its automatic driving perception system has a perception blind area.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing embodiments of the present invention with reference to the following drawings, in which:
FIG. 1 is a schematic diagram of a driving system according to the present invention;
FIG. 2 is a flow chart illustrating a driving method according to a first embodiment of the present invention;
FIG. 3 (a) illustrates a schematic view of the presence of a perception blind area around the unmanned vehicle when cornering;
FIG. 3 (b) illustrates another schematic view of the presence of a perception blind area around the unmanned vehicle when cornering;
FIG. 3 (c) illustrates yet another schematic view of the presence of a perception blind area around the unmanned vehicle when cornering;
FIG. 4 is a schematic diagram illustrating the presence of a perception blind area around an unmanned vehicle when traveling straight;
FIG. 5 illustrates another schematic view of the presence of a perception blind area around an unmanned vehicle when traveling straight;
fig. 6 shows a flowchart of a driving method in the fourth embodiment of the invention;
FIG. 7 is a block diagram showing another construction of a driving system provided by the present invention;
FIG. 8 is a diagram illustrating an application scenario of a driving method according to a fourth embodiment of the present invention;
fig. 9 is a block diagram showing a structure of a driving apparatus according to the present invention;
fig. 10 is a block diagram showing another configuration of a steering apparatus according to the present invention.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present invention are shown in the drawings. The invention may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Fig. 1 is a block diagram illustrating a driving system of an unmanned vehicle according to an embodiment of the present application. Referring to fig. 1, the unmanned vehicle driving system 100 includes a vehicle end 110 and a server end 120 that are communicatively connected. The vehicle end 110 is installed on the unmanned vehicle and monitors whether a perception blind area exists in the vehicle's surroundings; if one exists, it requests the server end 120 to provide the road condition information in the blind area. The server end 120 provides the vehicle end 110 with the road condition information in the perceived blind area according to the received request information. After receiving the road condition information provided by the server end 120, the vehicle end 110 splices the unmanned vehicle's own automatic driving perception result with that information, so that the unmanned vehicle can be controlled to run safely without a perception blind area.
The driving method of the unmanned vehicle performed by the driving system 100 is described in detail below based on the embodiment.
The first embodiment is as follows:
fig. 2 illustrates a method of driving an unmanned vehicle performed by the driving system 100.
Referring to fig. 2, the driving method includes:
step S201, monitoring the surrounding environment of the unmanned vehicle by a vehicle end to obtain monitoring information;
step S202, the vehicle side judges whether a perception blind area exists in the surrounding environment according to the monitoring information;
step S203, the vehicle end sends request information to the server end under the condition that the surrounding environment is judged to have the perception blind area so as to request the server end to provide road condition information in the perception blind area;
step S204, the server side determines a perception blind area according to the request information sent by the vehicle side;
step S205, the server side acquires the road condition information in the perception blind area and sends the acquired road condition information to the vehicle side;
and step S206, the vehicle end controls the unmanned vehicle to run according to the automatic driving perception result of the unmanned vehicle and the road condition information sent by the server end.
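The S201–S206 message flow can be sketched in a few lines of Python. All class and field names here are illustrative assumptions, and the dictionary lookup stands in for however the server actually gathers road condition information.

```python
class ServerEnd:
    """Server side: S204 determine the blind area from the request,
    S205 return the road condition information for it."""
    def __init__(self, traffic_map):
        self.traffic_map = traffic_map  # blind-area id -> road condition info

    def handle_request(self, request):
        blind_area = request["blind_area"]           # S204
        return self.traffic_map.get(blind_area, [])  # S205


class VehicleEnd:
    """Vehicle side: S201 monitor, S202 judge, S203 request, S206 control."""
    def __init__(self, server):
        self.server = server

    def judge_blind_area(self, monitoring_info):     # S202
        # e.g. from road undulation, spatial positions, or echo intervals
        return monitoring_info.get("blind_area")

    def drive(self, monitoring_info, local_perception):
        blind_area = self.judge_blind_area(monitoring_info)
        remote_info = []
        if blind_area is not None:                   # S203: request the server
            remote_info = self.server.handle_request({"blind_area": blind_area})
        # S206: splice the local perception result with the server's info
        return local_perception + remote_info


server = ServerEnd({"corner-A": ["pedestrian crossing behind corner"]})
vehicle = VehicleEnd(server)
plan = vehicle.drive({"blind_area": "corner-A"}, ["clear lane ahead"])
```

When no blind area is detected, the vehicle end simply drives on its own perception result and no request is sent.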
It should be noted that the automatic driving perception system is equivalent to the "eyes" of the unmanned vehicle and is used to perceive its surroundings; a perception blind area is a region around the unmanned vehicle that cannot be sensed by the vehicle's own automatic driving perception system.
It should be appreciated that perception blind areas mostly arise because a relatively large observation obstacle is present around the unmanned vehicle: the obstacle blocks part of the vehicle's "perception line of sight", leaving a blind area in the surrounding environment. The existence of a blind area often causes information about the road conditions ahead of the unmanned vehicle to be lost, affecting its safe driving.
For example, the unmanned vehicle 1 shown in fig. 3 travels on the route a and is about to turn, if there is a building 2 inside a corner of the route a shown in fig. 3 (a), the area behind the corner blocked by the building 2 is a perception blind area, and the road condition on the route a behind the corner blocked by the building 2 belongs to the missing road condition information for the unmanned vehicle 1; if the path a is a winding road as shown in fig. 3 (b), and there are mountains 3 inside the corners of the path a, the area behind the corners blocked by the mountains 3 is a dead zone for sensing, and the road conditions on the path a behind the corners blocked by the mountains 3 belong to missing road condition information for the unmanned vehicle 1; if there is a vehicle 4 traveling on the adjacent route B inside the corner of the route a as shown in fig. 3 (c), the area behind the corner blocked by the vehicle 4 is a blind perception area, and the road condition on the route a behind the corner blocked by the vehicle 4 is the missing road condition information for the unmanned vehicle 1.
For another example, if the unmanned vehicle 1 shown in fig. 4 travels on the route a in a straight-ahead state and the vehicle 4 is present on the route B adjacent to the route a, the regions in front of the vehicle 4 (i.e., the front in the traveling direction of the vehicle a) and outside (i.e., the side where the pedestrian 5 is located) blocked by the vehicle 4 are blind perception regions, and the road conditions in the regions in front of and outside the vehicle 4 blocked by the vehicle 4 are missing road condition information for the unmanned vehicle 1. If a pedestrian 5, as shown in fig. 4, runs from the outside of the vehicle 4 to the front of the unmanned vehicle 1 through the front of the vehicle 4, the unmanned vehicle 1 cannot sense the road condition that the pedestrian 5 crosses the road due to the obstruction of the vehicle 4, which affects the safe driving of the unmanned vehicle 1.
For another example, the unmanned vehicle 1 shown in fig. 5 runs on the route a, and a broken road exists in front of the unmanned vehicle 1, the area behind the slope and blocked by the broken road is the blind sensing area, and the road condition on the route a behind the slope and blocked by the broken road belongs to the missing road condition information for the unmanned vehicle 1.
It should be noted that fig. 3 to 5 are plan views, so the height of the observation obstacle is not shown in any of them. In practice, the height of an observation obstacle must reach a threshold value associated with the unmanned vehicle before it can create a perception blind area. Taking the broken road shown in fig. 5 as an example, the height h of the top of the slope needs to be greater than a threshold so that the "observation line of sight" of the unmanned vehicle 1 is blocked, producing a perception blind area that the unmanned vehicle 1 cannot sense.
In the embodiment of the invention, the server end provides the road condition information in the perception blind area for the vehicle end, so that the perception blind area of the unmanned vehicle is eliminated, namely, the automatic driving perception result of the unmanned vehicle is combined with the road condition information sent by the server end to ensure that the unmanned vehicle obtains the complete surrounding road condition information, so that the vehicle end can control the unmanned vehicle to safely run, and the technical problem that the unmanned vehicle runs dangerously when the automatic driving perception system has the perception blind area is solved.
Example two:
the driving method of the unmanned vehicle provided in this embodiment basically adopts the same steps as those in the first embodiment, and therefore, the description thereof is omitted.
The difference lies in the following: the monitoring information includes at least one of road surface undulation information around the unmanned vehicle, spatial position information of objects carried on the road surface around the unmanned vehicle, and the interval duration between the transmission time and the reflected reception time of a signal that monitors the surrounding environment. In step S202, the vehicle end judges whether the surrounding environment has a perception blind area according to the monitoring information by: determining the perception range of the unmanned vehicle according to the road surface undulation information and/or the spatial position information, and judging whether a perception blind area exists according to the determination result; or comparing the interval duration with a preset duration, and judging whether a perception blind area exists according to the comparison result.
Specifically, the road surface undulation information around the unmanned vehicle refers to information on undulation of a road surface structure, such as mountains 3 and slopes shown in fig. 5 and shown in fig. 3 (b), and the road surface undulation information may include: the position information of the undulating region on the road surface, and the position information and the height information of the high extreme point in the undulating region on the road surface; the spatial position information of the object carried on the road around the unmanned vehicle refers to spatial position information of an object other than a surface structure on the ground, for example, the building 2 shown in (a) of fig. 3, the vehicle 4 shown in (c) of fig. 4, the pedestrian 5 shown in fig. 4, and the like, and the spatial position information may include: three-dimensional position information of the space range occupied by the bearing object.
Further, the road surface relief information around the unmanned vehicle and the spatial position information of the object carried on the road surface may be obtained by the existing sensing device (such as a binocular camera and a corresponding computing device) in the automatic driving perception system, and the vehicle end is connected with the sensing device through communication to obtain the road surface relief information and the spatial position information.
In addition, the interval duration between the transmission time and the reflected reception time of the signal of the monitored surrounding environment refers to the interval duration between the time when the signal of the monitored surrounding environment is transmitted from the transmitting device and the time when the signal of the monitored surrounding environment is received by the transmitting device after being reflected, wherein the signal of the monitored surrounding environment may be a signal transmitted by a millimeter wave radar device or an ultrasonic radar of an automatic driving sensing system, and an object (also called a signal reflecting object) which reflects the signal of the monitored surrounding environment is an object in the surrounding environment of the unmanned vehicle, so that the interval duration also represents the interval distance between the signal reflecting object and the unmanned vehicle, and a timer can be used for acquiring the interval duration.
In this embodiment, it should be noted that:
(1) if the monitoring information includes only the road surface fluctuation information around the unmanned vehicle, step S202, the vehicle end determines whether the surrounding environment has a perception blind area according to the monitoring information, which may include: and determining the perception range of the unmanned vehicle according to the road surface fluctuation information, and judging whether the surrounding environment has a perception blind area according to the determination result.
Taking the scenario shown in fig. 5 as an example, the monitoring information includes the following road surface undulation information: the position of the slope on path a, the position of the top of the slope on path a, and the height h of the top of the slope. In step S202, the vehicle end judges whether a perception blind area exists in the surrounding environment according to the monitoring information, which may be: first determining from the road surface undulation information that the perception range of the unmanned vehicle 1 does not include the area behind the top of the slope, and then judging, based on that determination, that a perception blind area exists in the surrounding environment.
Specifically, to determine from the road surface undulation information that the perception range of the unmanned vehicle 1 does not include the area behind the top of the slope, the "observation line of sight" of the sensing device may be simulated by drawing a straight line starting from the sensing device on the unmanned vehicle, from which it follows that the area behind the top of the slope falls outside the perception range. Moreover, since the unmanned vehicle moves toward the slope, i.e. the distance between its sensing device and the top of the slope constantly changes, the perception blind area can be determined by simulating the line of sight when the vehicle is at a target distance from the crest, the target distance being pre-selected according to the vehicle's driving speed with driving safety in mind.
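Under strong simplifying assumptions (a 2D side view, flat ground at height 0 beyond the crest), the line-of-sight measurement just described reduces to one similar-triangles formula. This is a sketch of the idea, not the patent's measurement procedure, and all names are illustrative.

```python
import math

def blind_interval_behind_crest(sensor_height, crest_distance, crest_height):
    """Return (start, end) of the ground interval behind the slope crest
    that the sensing device cannot see. Assumes the sensor sits at
    (0, sensor_height), the crest at (crest_distance, crest_height), and
    flat ground at height 0 beyond the crest -- a simplification."""
    if crest_height >= sensor_height:
        # the line of sight through the crest never comes back down to the ground
        return (crest_distance, math.inf)
    # the ray from the sensor through the crest meets the ground at
    # x = sensor_height * crest_distance / (sensor_height - crest_height)
    end = sensor_height * crest_distance / (sensor_height - crest_height)
    return (crest_distance, end)
```

Re-evaluating this at the pre-selected target distance, as the text describes, amounts to updating `crest_distance` as the vehicle approaches the slope.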
(2) If the monitoring information includes only the spatial position information of objects carried on the road surface around the unmanned vehicle, then in step S202 the vehicle end judges whether the surrounding environment has a perception blind area according to the monitoring information, which may be: determining the perception range of the unmanned vehicle according to the spatial position information, and judging whether a perception blind area exists according to the determination result.
Taking the diagram (a) in fig. 3 as an example, if the monitoring information includes three-dimensional position information of the space range occupied by the building 2, in step S202, the vehicle end determines whether the surrounding environment has a blind sensing area according to the monitoring information, which may be: firstly, the sensing range of the unmanned vehicle 1 is determined not to include the area shielded by the building 2 according to the three-dimensional position information, and then the sensing blind area of the surrounding environment is determined based on the determination result.
Specifically, to determine from the three-dimensional position information that the perception range of the unmanned vehicle 1 does not include the area blocked by the building 2, straight lines QP1, QP2 and QP3 may be drawn starting from the sensing device Q on the unmanned vehicle (point P1 being the point of tangency of line QP1 with the building 2, point P2 another point of tangency of line QP2 with the building 2, and point P3 yet another point of tangency of line QP3 with the building 2) to simulate the "lines of sight" of the sensing device; the region enclosed by lines QP1, QP2 and QP3 behind the building is then the area obscured by the building 2, and the perception range does not include that area.
It should be understood that in diagram (a) of fig. 3 the three observation lines of sight QP1, QP2 and QP3 are only exemplary; more lines of sight can be simulated in a concrete measurement for comprehensive auxiliary calculation. Moreover, the sensing device Q is drawn as a point in diagram (a) of fig. 3, whereas in practice it has a specific shape and occupies a certain space, so the "observation lines of sight" can be drawn from the same-side edges of the sensing device Q for more accurate auxiliary measurement.
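A crude 2D stand-in for this tangent-line construction uses bearings to the obstacle's corner points instead of true tangent points, and ignores angle wrap-around across ±π. All names are illustrative assumptions, not identifiers from the patent.

```python
import math

def blocked_sector(sensor, corners):
    """Bearing interval, as seen from the sensor, spanned by the obstacle's
    corner points -- a discrete stand-in for tangent lines QP1..QP3."""
    angles = [math.atan2(y - sensor[1], x - sensor[0]) for x, y in corners]
    return min(angles), max(angles)

def is_occluded(sensor, corners, target):
    """True if the target lies inside the blocked sector and farther away
    than the obstacle's nearest corner (so the obstacle sits in between)."""
    lo, hi = blocked_sector(sensor, corners)
    bearing = math.atan2(target[1] - sensor[1], target[0] - sensor[0])
    nearest = min(math.dist(sensor, c) for c in corners)
    return lo <= bearing <= hi and math.dist(sensor, target) > nearest
```

Every ground point for which `is_occluded` returns true belongs to the perception blind area behind the obstacle; a production system would instead intersect real sensor rays with a 3D occupancy model.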
(3) If the monitoring information only includes the interval duration from the transmitting time to the reflected receiving time of the signal monitoring the surrounding environment, step S202, the vehicle end determines whether the surrounding environment has a blind sensing area according to the monitoring information, which may include: and comparing the interval duration with a preset duration, and judging whether the surrounding environment has a perception blind area according to a comparison result.
Specifically, the preset time period may be determined according to the size of the space required for the unmanned vehicle to safely travel and the travel speed, wherein the larger the size of the space required for the unmanned vehicle to safely travel, the larger the preset time period is, and the larger the travel speed is, the smaller the preset time period is.
Taking the scenario shown in fig. 4 as an example, the sensing device Q on the unmanned vehicle transmits signals for monitoring the surrounding environment in multiple directions QS1, QS2, QS3 and QS4, with corresponding interval durations t1, t2, t3 and t4. If the preset duration is t0, then in step S202 the vehicle end judging whether the surrounding environment has a perception blind area according to the monitoring information may be: comparing t1, t2, t3 and t4 with t0 in turn. Clearly t2 is the smallest of the four; with the preset duration t0 set appropriately, only t2 is less than t0, and it is therefore determined that the direction QS2 has a perception blind area.
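The comparison in the example above can be sketched as follows; the direction labels follow fig. 4, while the function name, the data layout and the numeric values are illustrative assumptions.

```python
def blind_directions(intervals, preset_duration):
    """Flag every transmission direction whose echo returns sooner than
    the preset duration t0: an early echo means an obstacle close enough
    to create a perception blind area behind it.  `intervals` maps a
    direction label to its round-trip interval duration in seconds."""
    return [direction for direction, t in intervals.items()
            if t < preset_duration]
```

For instance, with measured intervals `{"QS1": 4e-7, "QS2": 1e-7, "QS3": 5e-7, "QS4": 6e-7}` and `t0 = 2e-7`, only direction QS2 is flagged.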
It should be noted that the sensing device Q on the unmanned vehicle may set the transmission directions of the signals for monitoring the surrounding environment as required: more transmission directions help to monitor observation obstacles in the surrounding environment accurately, while fewer transmission directions help to increase the calculation rate, reduce the load on the computing device, and lower the computing cost.
In the embodiment of the invention, the perception range of the unmanned vehicle is determined according to the road surface fluctuation information and/or the spatial position information, so that a relatively complete measurement of the spatial layout can be carried out based on the position, and the determined comprehensive perception range allows an accurate judgment of whether a perception blind area exists; alternatively, the interval duration is compared with the preset duration and the existence of a perception blind area in the surrounding environment is judged from the comparison result, so that the judgment can be made with less calculation.
Example three:
the driving method of the unmanned vehicle provided in this embodiment basically adopts the same steps as those in the first to second embodiments, and therefore, the description thereof is omitted.
The difference lies in that: step S203, in which the vehicle end sends request information to the server end, includes:
the vehicle end determines a sensing blind area according to the monitoring information and sends the sensing blind area to the server end so that the server end can acquire road condition information in the sensing blind area;
or the vehicle end sends the monitoring information to the server end, so that the server end determines the perception blind area according to the monitoring information and obtains the road condition information in the perception blind area.
Specifically, under the condition that the vehicle end determines a perception blind area according to the monitoring information and sends the perception blind area to the server end, the request information sent to the server end by the vehicle end comprises geographical position information of the perception blind area; and in the case that the vehicle side sends the monitoring information to the server side, the request information sent by the vehicle side to the server side includes the monitoring information.
In the embodiment of the invention, the vehicle end determines the perception blind area according to the monitoring information and sends the perception blind area to the server end so that the server end can acquire the road condition information in the perception blind area, which is beneficial to reducing the calculation task of the server end, so that the server end can quickly respond even if receiving a large amount of same request information, and the vehicle end can quickly acquire the road condition information in the perception blind area; and the vehicle end sends the monitoring information to the server end, so that the server end determines the perception blind area according to the monitoring information and acquires the road condition information in the perception blind area, thereby being beneficial to reducing the calculation task of the vehicle end, and further reducing the configuration requirement of the vehicle end and even the configuration cost of the vehicle end.
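The two request variants of this embodiment can be sketched as follows; the field names and the dictionary encoding are illustrative assumptions, not taken from the patent.

```python
def build_request(blind_area_geo=None, monitoring_info=None):
    """The two request shapes of embodiment three: if the vehicle end
    has already located the perception blind area, it sends the blind
    area's geographical position information; otherwise it forwards the
    raw monitoring information and leaves the blind-area determination
    to the server end."""
    if blind_area_geo is not None:
        return {"type": "blind_area", "geo": blind_area_geo}
    if monitoring_info is not None:
        return {"type": "monitoring", "data": monitoring_info}
    raise ValueError("request needs either a blind area or monitoring info")
```

The first shape shifts computation onto the vehicle end (faster server response under load); the second shifts it onto the server end (cheaper vehicle-end hardware), matching the trade-off described above.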
Example four:
the driving method of the unmanned vehicle provided in this embodiment basically adopts the same steps as those in the first to second embodiments, and therefore, the description thereof is omitted.
The difference lies in that: referring to fig. 6, step S205, in which the server end acquires the road condition information in the perception blind area and sends the acquired road condition information to the vehicle end (also called the requesting vehicle end), includes:
step S2051, the server end determines a target vehicle end (also called a requested vehicle end) from a plurality of vehicle ends connected with the server end, wherein the target vehicle end is a vehicle end whose load vehicle is located in the perception blind area;
step S2052, the server end sends access information to the target vehicle end to request target data, wherein the target data is the automatic driving perception result of the target vehicle end;
step S2053, the target vehicle end returns the target data to the server end after receiving the access information;
step S2054, the server end receives the target data and sends the received target data to the vehicle end.
It should be noted that the server end is communicatively connected with the plurality of vehicle ends, constructing an internet of vehicles in which the automatic driving perception results can be exchanged among the plurality of vehicle ends. Each vehicle end is loaded on one unmanned vehicle, and each unmanned vehicle is further provided with an automatic driving perception system, so that a vehicle end can obtain the perception results of the automatic driving perception system on the same unmanned vehicle. On this basis, the load vehicle of a target vehicle end is the unmanned vehicle loaded with that target vehicle end, and the target data is the automatic driving perception result generated by the automatic driving perception system associated with the target vehicle end.
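Steps S2051 to S2054 can be sketched as follows; the class, its fields, and the blind-area predicate are illustrative assumptions, not structures defined by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class VehicleEnd:
    vid: str
    position: tuple        # position of the load vehicle
    perception: dict       # automatic driving perception result
    inbox: list = field(default_factory=list)

def relay_blind_area_data(vehicles, in_blind_area, requester):
    """Server-side relay: S2051 determines the target vehicle ends
    (load vehicle inside the blind area), S2052/S2053 request and
    receive each target's perception result, and S2054 forwards each
    result to the requesting vehicle end."""
    targets = [v for v in vehicles
               if v is not requester and in_blind_area(v.position)]  # S2051
    for target in targets:
        data = target.perception       # S2052/S2053: access info + return
        requester.inbox.append(data)   # S2054: forward to requester
    return targets
```

The key point the sketch illustrates is that the server end never senses the blind area itself; it only brokers perception results between vehicle ends.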
The internet of vehicles constructed by the server end and the plurality of vehicle ends is specifically shown in fig. 7. Referring to fig. 7, the vehicle ends 110 are divided into requesting vehicle ends 11A and requested vehicle ends 11B according to the task executed at each moment. The communication between a requesting vehicle end 11A and the server end 120 includes: the requesting vehicle end 11A sends request information to the server end 120 and receives the road condition information sent by the server end 120. The communication between a requested vehicle end 11B and the server end 120 includes: the requested vehicle end 11B receives the access information sent by the server end 120 and sends its automatic driving perception result to the server end 120.
Fig. 8 shows an example of an embodiment of the present invention. Referring to fig. 8, the unmanned vehicle 1 travels southward, and a vehicle 4 is present on the adjacent lane on the east side of the unmanned vehicle 1, so that the partial areas on the south side and the west side of the vehicle 4 (the west side of the vehicle 4 and the area framed by the two broken lines) are perception blind areas of the unmanned vehicle. With reference to fig. 7 and 8, in this case the requesting vehicle end 11A on the unmanned vehicle 1 sends request information to the server end 120, the server end 120 sends access information to the requested vehicle end on the target vehicle 6, and the requesting vehicle end 11A finally obtains the automatic driving perception result of the automatic driving perception system on the target vehicle 6. Thus, even when a pedestrian 5 in the perception blind area crosses the road toward the south side of the vehicle 4, the requesting vehicle end 11A on the unmanned vehicle 1 knows the path of the pedestrian 5 in the perception blind area, thereby avoiding a collision with the pedestrian 5.
In the embodiment of the invention, the server end is communicatively connected with the plurality of vehicle ends to construct an internet of vehicles in which the automatic driving perception results can be exchanged among the plurality of vehicle ends, and the server end retrieves the target data from the target vehicle end to acquire the road condition information in the perception blind area. The server end therefore does not need its own sensing equipment to acquire the road condition information in each perception blind area, which reduces the server end's workload in acquiring road condition information and also helps the server end provide road condition information in perception blind areas to multiple requesting vehicle ends.
Example five:
the driving method of the unmanned vehicle provided by this embodiment basically adopts the same steps as the fourth embodiment, and therefore, the description thereof is omitted. The difference lies in that:
in step S203, the vehicle end sending request information to the server end includes: sequentially sending request information for acquiring target data to the server end until the target data returned by the server end provide all road condition information in the perception blind area; and
in step S2054, the server end sending the received target data to the vehicle end includes: sending one piece of unreturned target data to the vehicle end in response to each piece of request information.
It should be noted that, since unmanned vehicles are mostly in a mobile state, the same vehicle end maintains the same target data only while its load vehicle does not move; once the load vehicle moves, the target data before and after the movement may differ. Conversely, different vehicle ends at the same position may be regarded as perceiving the same target data. Therefore, "unreturned target data" is not necessarily provided by a target vehicle end that has never provided target data; it may also be provided, from a new position, by a target vehicle end that has already provided target data at a previous position.
Based on the above explanation of "unreturned target data", steps S2051 and S2052 may be refined as follows:
step S2051, in which the server end determines a target vehicle end from the plurality of vehicle ends connected with it, includes: the server end determines a current target vehicle end at a new position from the plurality of vehicle ends connected with it;
step S2052, in which the server end sends access information to the target vehicle end to request target data, includes: the server end sends access information to the current target vehicle end to request unreturned target data.
It should be noted that the target data is the automatic driving perception result of a target vehicle end located in the perception blind area, but a target vehicle end in the perception blind area does not necessarily perceive all of the road condition information in the blind area. That is, the target data and the target vehicle ends are in one-to-one correspondence, but a single target datum does not necessarily correspond to the complete set of road condition information in the perception blind area; in all likelihood, the complete road condition information must be drawn from a plurality of target data.
Moreover, it should be understood that, since unmanned vehicles are mostly in a mobile state, the set of target vehicle ends differs from moment to moment, as does their number. Therefore, if all the request information sent before a certain moment cannot obtain all the road condition information in the perception blind area, request information can be sent again later to further complete the road condition information in the perception blind area.
In the embodiment of the invention, the vehicle end sequentially sends the request information for acquiring the target data to the server end until the target data returned by the server end provides all road condition information in the perception blind area, so that the vehicle end is ensured to acquire more complete road condition information for the perception blind area, and the unmanned vehicle can run more safely.
Example six:
the driving method of the unmanned vehicle provided by this embodiment basically adopts the same steps as the fourth embodiment, and therefore, the description thereof is omitted.
The difference lies in that:
in step S203, the vehicle end sending request information to the server end includes: the vehicle end sends request triggering information to the server end, and sends request termination information to the server end once the target data returned by the server end provide all road condition information in the perception blind area; and
in step S2054, the sending, by the server side, the received target data to the vehicle side includes: and sequentially returning target data which is not returned to the vehicle end in response to the request triggering information until request termination information sent by the vehicle end is received.
It should be noted that, based on step S203 and step S2054, step S2051 and step S2052 may be refined as follows:
step S2051, in which the server end determines a target vehicle end from the plurality of vehicle ends connected with it, includes: the server end determines at least one target vehicle end from the plurality of vehicle ends connected with it;
step S2052, in which the server end sends access information to request target data, includes: the server end sequentially sends access information to the at least one target vehicle end to request each target vehicle end to provide unreturned target data.
It should be understood that the number of the at least one target vehicle end is the number of unmanned vehicles in the perceived blind area, that is, the number of the at least one target vehicle end is one if there is one unmanned vehicle in the perceived blind area, and the number of the at least one target vehicle end is plural if there are a plurality of unmanned vehicles in the perceived blind area.
Further, the server end may send access information to the at least one target vehicle end in an order determined by their positions. Specifically, the target vehicle end in the central area of the perception blind area serves as the first vehicle end to provide unreturned target data; a second vehicle end within a first preset distance range, centered on the first vehicle end, then provides unreturned target data; a third vehicle end within a second preset distance range, centered on the second vehicle end, then provides unreturned target data; and so on. This improves the effectiveness of each piece of target data acquired by the server end and helps the requesting vehicle end quickly acquire complete road condition information in the perception blind area.
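The outward, ring-by-ring polling order described above can be approximated as follows; reducing the successive preset distance ranges to a plain sort on distance to the blind-area centre is an illustrative simplification.

```python
import math

def access_order(target_positions, blind_area_center):
    """Poll the target vehicle end nearest the centre of the perception
    blind area first, then work outward, so early responses cover the
    most useful part of the blind area."""
    cx, cy = blind_area_center
    return sorted(target_positions,
                  key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
```

A faithful implementation would group targets into the concentric preset distance ranges and re-centre each ring on the previously polled vehicle end; the sort captures the same centre-outward intent.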
Specifically, at the server end, steps S2052 to S2054 may be performed sequentially; that is, each time the server end acquires one piece of unreturned target data from a target vehicle end, it promptly sends that piece to the requesting vehicle end before acquiring the next piece from the next target vehicle end, so that the server end does not need to configure storage space for the target data.
At the requesting vehicle end, in step S203, the requesting vehicle end may determine, promptly after each receipt of target data returned by the server end, whether the returned target data collectively provide all the road condition information in the perception blind area, and send request termination information to the server end only when they do, thereby preventing the server end from continuing to collect and send redundant target data.
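The trigger/terminate exchange above can be sketched as follows; modelling the vehicle end's completeness check as a callback is an illustrative assumption.

```python
def serve_until_terminated(unreturned_data, coverage_complete):
    """After receiving request triggering information, the server end
    returns unreturned target data one piece at a time; the requesting
    vehicle end checks coverage after each piece and sends request
    termination information (modelled by `coverage_complete` returning
    True) as soon as the blind area is fully covered."""
    delivered = []
    for datum in unreturned_data:
        delivered.append(datum)           # S2054: one unreturned datum sent
        if coverage_complete(delivered):  # vehicle end: all info acquired
            break                         # request termination received
    return delivered
```

Note how termination is decided by the requester, not the server, matching the division of labour in this embodiment.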
In the embodiment of the invention, the requesting vehicle end requests the server end for sensing all road condition information in the blind area by taking the request triggering information and the request terminating information as the request information, so that the server end can rapidly acquire a plurality of unreturned target data at one time, and the requesting vehicle end can also rapidly acquire all road condition information in the blind area.
Example seven:
the driving method of the unmanned vehicle provided by this embodiment basically adopts the same steps as the fourth embodiment, and therefore, the description thereof is omitted.
The difference lies in that:
in step S203, the vehicle end sending request information to the server end includes: the vehicle end sends, to the server end, request information for the road condition information in the perception blind area; and
in step S2054, the sending, by the server side, the received target data to the vehicle side includes: and the server determines all road condition information in the perception blind area from the plurality of target data and returns the determined all road condition information.
Specifically, the server may extract sub information belonging to road condition information in the perception blind area from the plurality of target data, and splice the extracted sub information to obtain all road condition information in the perception blind area. Further, if the server cannot determine all road condition information in the perception blind area from the plurality of target data within a preset time after receiving the request information, the server may send the plurality of acquired target data to the requesting vehicle end and send the reminding information that the road condition information is incomplete at the same time, so that the vehicle end can implement route adjustment or other measures in time. The preset time duration can be set according to the running speed of the unmanned vehicle, wherein the larger the running speed of the unmanned vehicle is, the smaller the preset time duration is, and thus, the safe running of the unmanned vehicle is ensured.
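The extract-and-splice step above can be sketched as follows; modelling road condition information as {cell_id: observation} dictionaries over a gridded blind area is an illustrative assumption, not an encoding given by the patent.

```python
def splice_road_conditions(target_data_list, blind_area_cells):
    """Server-side merge for embodiment seven: extract from each target
    datum the sub-information that falls inside the perception blind
    area, splice the pieces together, and report whether the blind area
    is fully covered (incomplete coverage would trigger the reminding
    information described above)."""
    merged = {}
    for data in target_data_list:
        for cell, obs in data.items():
            if cell in blind_area_cells:
                merged.setdefault(cell, obs)  # keep the first observation
    return merged, blind_area_cells <= set(merged)
```

Observations outside the blind area are discarded, so the requesting vehicle end receives only the spliced blind-area information rather than every raw target datum.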
Based on step S203, step S2054, and step S2051, the determining, by the server side, a target vehicle side from the plurality of vehicle sides connected to the server side may include: the server side determines a plurality of target vehicle sides from a plurality of vehicle sides connected with the server side so as to obtain a plurality of target data from the plurality of target vehicle sides.
In the embodiment of the invention, the server determines all road condition information in the perception blind area from the plurality of target data and returns the determined all road condition information, so that the requesting vehicle end directly obtains all road condition information in the perception blind area, namely, the communication data between the requesting vehicle end and the server end is reduced, and the load of communication resources between the requesting vehicle end and the server end is reduced; moreover, the server side does not need to interact with the requesting vehicle side for multiple times to determine all road condition information in the sensing blind area, so that the speed of determining all road condition information in the sensing blind area by the server side is increased, and the requesting vehicle side is further facilitated to quickly acquire all road condition information in the sensing blind area.
Based on the same concept, the invention also discloses a driving device of an unmanned vehicle, which is arranged in the vehicle end to execute the steps that the vehicle end is required to execute in each of the above embodiments of the driving method. Specifically, referring to fig. 9, the apparatus includes:
the monitoring module 111 is used for monitoring the surrounding environment of the unmanned vehicle to obtain monitoring information;
a determining module 112, configured to determine whether a perception blind area exists in the surrounding environment according to the monitoring information;
the request module 113 is configured to send request information to the server side to request the server side to provide road condition information in the perception blind area when it is determined that the surrounding environment has the perception blind area;
and the control module 114 is used for controlling the unmanned vehicle to run according to the automatic driving perception result of the unmanned vehicle and the road condition information provided by the server side.
According to the driving device of the unmanned vehicle, the road condition information in the sensing blind area is obtained from the server side, so that the sensing blind area of the unmanned vehicle is eliminated, and the relatively complete road condition around the unmanned vehicle is sensed by the vehicle side, so that the vehicle side can control the unmanned vehicle to safely run, and the technical problem that the unmanned vehicle runs dangerously when the sensing blind area occurs in the automatic driving sensing system is solved.
Based on the same concept, the invention also discloses another driving device of an unmanned vehicle, which is arranged in the server end to execute the steps that the server end is required to execute in each of the above embodiments of the driving method. Specifically, referring to fig. 10, the apparatus includes:
the receiving module 121 is configured to receive request information sent by a vehicle end, wherein the request information is information requesting road condition information in a perception blind area;
a determining module 122, configured to determine a blind sensing area according to the request information;
and the sending module 123 is configured to acquire the road condition information in the perception blind area and send the acquired road condition information to the vehicle end.
According to the driving device of the unmanned vehicle, the information of the road condition in the sensing blind area is provided for the vehicle end, so that the sensing blind area of the unmanned vehicle is eliminated, and the relatively complete road condition around the unmanned vehicle is sensed by the vehicle end, so that the vehicle end can control the unmanned vehicle to safely run, and the technical problem of running danger of the unmanned vehicle under the condition that the sensing blind area occurs in the automatic driving sensing system is solved.
It should be noted that, in this document, the contained terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Finally, it should be noted that the above embodiments are only intended to illustrate the present invention clearly and are not a limitation on its implementations. Other variations or modifications will be apparent to persons skilled in the art in light of the above description; all embodiments need not, and cannot, be exhaustively listed here, and obvious variations or modifications may still be made without departing from the scope of the invention.

Claims (10)

1. A driving method of an unmanned vehicle, applied to a vehicle side, the driving method comprising:
monitoring the surrounding environment of the unmanned vehicle to obtain monitoring information;
judging whether a perception blind area exists in the surrounding environment according to the monitoring information;
under the condition that the sensing blind area exists in the surrounding environment, request information is sent to a server side to request the server side to provide road condition information in the sensing blind area;
and controlling the unmanned vehicle to run according to the automatic driving perception result of the unmanned vehicle and the road condition information provided by the server side.
2. The driving method according to claim 1,
the monitoring information comprises at least one of the following: road surface fluctuation information, spatial position information, and the interval duration from the transmitting time of a signal monitoring the surrounding environment to the receiving time of its reflection;
judging whether the surrounding environment has a perception blind area according to the monitoring information, comprising: determining the perception range of the unmanned vehicle according to the road surface fluctuation information and/or the spatial position information, and judging whether a perception blind area exists in the surrounding environment according to a determination result; or comparing the interval duration with a preset duration, and judging whether the surrounding environment has a perception blind area according to a comparison result.
3. The driving method according to claim 1 or 2, wherein sending request information to the server side includes:
determining a perception blind area according to the monitoring information, and sending the perception blind area to the server end so that the server end can acquire road condition information in the perception blind area;
or sending the monitoring information to the server side so that the server side can determine a perception blind area according to the monitoring information and acquire road condition information in the perception blind area.
4. The driving method according to claim 1 or 2, wherein sending request information to the server side includes:
sequentially sending request information for acquiring target data to a server end to enable the server end to return unreturned target data until the target data returned by the server end provides all road condition information in a perception blind area; or,
sending request trigger information to a server so that the server sequentially returns unreturned target data; and sending request termination information to the server side to enable the server side to terminate the sending of the target data under the condition that the target data returned by the server side provides all road condition information in the perception dead zone; or,
sending request information for sensing road condition information in the blind area to a server end so that the server end determines all road condition information in the blind area from a plurality of target data and returns the determined all road condition information;
the target data is an automatic driving perception result of a target vehicle end, and the target vehicle end is a vehicle end of a load vehicle located in the perception blind area.
5. A driving method of an unmanned vehicle is applied to a server side, and comprises the following steps:
receiving request information sent by a vehicle end, wherein the request information is information requesting road condition information in a perception blind area;
determining a perception blind area according to the request information;
and acquiring road condition information in the perception blind area and sending the acquired road condition information to the vehicle end.
6. The driving method according to claim 5, wherein acquiring the road condition information in the perception blind area and sending the acquired road condition information to the vehicle end comprises:
determining a target vehicle end from a plurality of vehicle ends connected with the server end, wherein the target vehicle end is a vehicle end of a load vehicle located in the perception blind area;
sending access information to the target vehicle end so that the target vehicle end can return target data, wherein the target data are the automatic driving perception result of the target vehicle end;
and receiving the target data and sending the received target data to the vehicle end.
7. The driving method according to claim 6, wherein transmitting the received target data to the vehicle side includes:
under the condition that the request information is to acquire target data, sending unreturned target data to the vehicle end;
under the condition that the request information is request trigger information, sequentially returning unreturned target data to the vehicle end until request termination information sent by the vehicle end is received;
and under the condition that the request information is the request for sensing the road condition information in the blind area, determining all road condition information in the blind area from a plurality of target data and returning the determined all road condition information.
8. A driving apparatus of an unmanned vehicle, characterized in that the driving apparatus applies the driving method of claims 1-4, the apparatus comprising:
the monitoring module is used for monitoring the surrounding environment of the unmanned vehicle to obtain monitoring information;
the judging module is used for judging whether the surrounding environment has a perception blind area or not according to the monitoring information;
the request module is used for sending request information to the server side to request the server side to provide road condition information in the perception blind area under the condition that the surrounding environment is judged to have the perception blind area;
and the control module is used for controlling the unmanned vehicle to run according to the automatic driving perception result of the unmanned vehicle and the road condition information provided by the server side.
9. A driving apparatus of an unmanned vehicle, characterized in that the driving apparatus applies the driving method of claims 5-7, the apparatus comprising:
the receiving module is used for receiving request information sent by a vehicle end, wherein the request information is information requesting road condition information in a perception blind area;
the determining module is used for determining a perception blind area according to the request information;
and the sending module is used for acquiring the road condition information in the perception blind area and sending the acquired road condition information to the vehicle end.
10. A system for driving an unmanned vehicle, comprising: a vehicle-side and a server-side communicatively coupled, wherein,
the vehicle end is used for executing the driving method of claims 1-4;
the server is used for executing the driving method of claims 5-7.
CN202010325551.0A 2020-04-23 2020-04-23 Driving method, device and system of unmanned vehicle Pending CN111489564A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010325551.0A CN111489564A (en) 2020-04-23 2020-04-23 Driving method, device and system of unmanned vehicle


Publications (1)

Publication Number Publication Date
CN111489564A true CN111489564A (en) 2020-08-04

Family

ID=71813057


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180096600A1 (en) * 2016-10-04 2018-04-05 International Business Machines Corporation Allowing drivers or driverless vehicles to see what is on the other side of an obstruction that they are driving near, using direct vehicle-to-vehicle sharing of environment data
CN109491392A (en) * 2018-12-03 2019-03-19 上海木木聚枞机器人科技有限公司 A kind of method and system of shared avoidance
CN109979217A (en) * 2017-12-28 2019-07-05 北京百度网讯科技有限公司 Cooperative intersection passing control method, device and equipment
CN110033625A (en) * 2019-03-27 2019-07-19 刘瑞 Follow the pilotless automobile intelligent networking system of vehicle interaction priority rules
US20190304310A1 (en) * 2018-04-03 2019-10-03 Baidu Usa Llc Perception assistant for autonomous driving vehicles (advs)
CN110703753A (en) * 2019-10-16 2020-01-17 北京京东乾石科技有限公司 Path planning method and device, electronic equipment and storage medium

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112964266A (en) * 2021-02-04 2021-06-15 西北大学 Network contract service single-path-splicing planning method and storage medium
CN112964266B (en) * 2021-02-04 2022-08-19 西北大学 Network contract service single-path-splicing planning method and storage medium
CN113119962A (en) * 2021-05-17 2021-07-16 腾讯科技(深圳)有限公司 Driving assistance processing method and device, computer readable medium and electronic device
CN113359724A (en) * 2021-06-02 2021-09-07 东风汽车集团股份有限公司 Vehicle intelligent driving system and method based on unmanned aerial vehicle and storage medium
CN115497335A (en) * 2021-06-18 2022-12-20 本田技研工业株式会社 Warning control device, moving body, warning control method, and computer-readable storage medium
US12080171B2 (en) 2021-06-18 2024-09-03 Honda Motor Co., Ltd. Alert control device, mobile object, alert controlling method and computer-readable storage medium
CN113554874A (en) * 2021-07-30 2021-10-26 新石器慧通(北京)科技有限公司 Unmanned vehicle control method and device, electronic equipment and storage medium
EP4167606A1 (en) * 2021-10-12 2023-04-19 Denso Corporation Cooperative intelligent transport system and method with cpm area perception request
WO2023178661A1 (en) * 2022-03-25 2023-09-28 京东方科技集团股份有限公司 Data sharing method, vehicle-mounted device, cloud server, system, device and medium
WO2024178741A1 (en) * 2023-03-02 2024-09-06 Hong Kong Applied Science and Technology Research Institute Company Limited System and method for road monitoring

Similar Documents

Publication Publication Date Title
CN111489564A (en) Driving method, device and system of unmanned vehicle
KR102303716B1 (en) Method for autonomous cooperative driving based on vehicle-road infrastructure information fusion and apparatus for the same
EP3625633B1 (en) Autonomous vehicle collision mitigation systems and methods
US12055945B2 (en) Systems and methods for controlling an autonomous vehicle with occluded sensor zones
US10349011B2 (en) System and method for improved obstacle awareness in using a V2X communications system
RU2668361C1 (en) Car parking mapping system (options)
US11698635B2 (en) Control of an autonomous vehicle
US9280899B2 (en) Dynamic safety shields for situation assessment and decision making in collision avoidance tasks
CN104272364B (en) Aircraft preventing collision method and be provided with the unmanned plane of system for realizing described method
CN110816540B (en) Traffic jam determining method, device and system and vehicle
EP3835823B1 (en) Information processing device, information processing method, computer program, information processing system, and moving body device
DE102016206458A1 (en) VEHICLE DRIVING CONTROL DEVICE
CN110673599A (en) Sensor network-based environment sensing system for automatic driving vehicle
US20200250980A1 (en) Reuse of Surroundings Models of Automated Vehicles
CN110874938A (en) Traffic light control system and traffic light control method
WO2021239531A1 (en) Emergency lane assistant
CN117320935A (en) Brake arbitration
US11845429B2 (en) Localizing and updating a map using interpolated lane edge data
US20240109559A1 (en) Method and Control Device for the Situation-Dependent Determination of Observation Areas for Motor Vehicles Operated in an at Least Partially Autonomous Manner
CN114348018A (en) Automatic driving system and method for commercial vehicle
JP6583697B2 (en) Perimeter monitoring device, control device, perimeter monitoring method, and program
CN110427034B (en) Target tracking system and method based on vehicle-road cooperation
CN117894202A (en) Method, device and system for parking passengers
KR20240019928A (en) Method for predicting and determining abnormal state of autonomous driving vehicle and apparatus and system therefor
CN114394090A (en) Vehicle-road cooperative virtual front obstacle sensing system

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200804)