
WO2020164238A1 - Method, apparatus, device, medium, and system for driving control - Google Patents

Method, apparatus, device, medium, and system for driving control

Info

Publication number
WO2020164238A1
WO2020164238A1 (application PCT/CN2019/103876)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
message
driving
information
perception
Prior art date
Application number
PCT/CN2019/103876
Other languages
English (en)
French (fr)
Inventor
张珠华
鲍泽文
胡星
Original Assignee
北京百度网讯科技有限公司 (Beijing Baidu Netcom Science and Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京百度网讯科技有限公司
Priority to EP19914783.6A (published as EP3916696A4)
Publication of WO2020164238A1

Links

Images

Classifications

    • G08G1/096811 Transmission of navigation instructions to the vehicle, where the transmitted instructions are used to compute a route and the route is computed offboard
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W30/0956 Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W60/005 Drive control systems specially adapted for autonomous road vehicles: handover processes
    • G01C21/3815 Creation or updating of map data characterised by the type of data: road data
    • G08G1/0116 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • G08G1/096708 Transmission of highway information, e.g. weather, speed limits, where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725 Transmission of highway information, where the received information generates an automatic action on the vehicle control
    • G08G1/096741 Transmission of highway information, where the source of the transmitted information selects which information to transmit to each vehicle
    • G08G1/096783 Transmission of highway information, where the origin of the information is a roadside individual element
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096805 Transmission of navigation instructions to the vehicle, where the transmitted instructions are used to compute a route
    • G08G1/162 Anti-collision systems: decentralised systems, e.g. inter-vehicle communication, event-triggered
    • G08G1/163 Anti-collision systems: decentralised systems involving continuous checking
    • G08G1/164 Anti-collision systems: centralised systems, e.g. external to vehicles
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. networks in vehicles
    • B60W10/04 Conjoint control of vehicle sub-units including control of propulsion units
    • B60W10/10 Conjoint control of vehicle sub-units including control of change-speed gearings
    • B60W10/18 Conjoint control of vehicle sub-units including control of braking systems
    • B60W10/20 Conjoint control of vehicle sub-units including control of steering systems
    • B60W2552/35 Input parameters relating to infrastructure: road bumpiness, e.g. potholes
    • B60W2554/20 Input parameters relating to objects: static objects
    • B60W2554/402 Input parameters relating to dynamic objects: type
    • B60W2554/4041 Input parameters relating to dynamic objects, characteristics: position
    • B60W2554/4042 Input parameters relating to dynamic objects, characteristics: longitudinal speed
    • B60W2554/4043 Input parameters relating to dynamic objects, characteristics: lateral speed
    • B60W2554/4044 Input parameters relating to dynamic objects, characteristics: direction of movement, e.g. backwards
    • B60W2554/4046 Input parameters relating to dynamic objects, characteristics: behavior, e.g. aggressive or erratic
    • B60W2555/20 Input parameters relating to exterior conditions: ambient conditions, e.g. wind or rain
    • B60W2555/60 Input parameters relating to exterior conditions: traffic rules, e.g. speed limits or right of way
    • B60W2556/10 Input parameters relating to data: historical data
    • B60W2556/45 Input parameters relating to data: external transmission of data to or from the vehicle
    • G07C5/008 Registering or indicating the working of vehicles: communicating information to a remotely located station

Definitions

  • the embodiments of the present disclosure generally relate to the computer field, and more specifically, to automatic driving control technology.
  • Autonomous driving is also known as unmanned driving.
  • autonomous driving may include sensing the surrounding environment of the vehicle through sensing devices, executing decision-making planning on driving based on the perception results, and controlling specific driving operations of the vehicle according to the decision-making planning. Therefore, the accuracy and efficiency of environmental perception, decision planning, and control will affect the driving safety and comfort of vehicles and the operating efficiency of the overall transportation system.
  • a solution for driving control is provided.
  • In a first aspect of the present disclosure, a method of assisting driving control is provided. The method includes: obtaining, by a device external to the vehicle, perception information related to the environment in which the vehicle is located; determining, based on the perception information, a description related to a target object existing in the environment, the target object being located outside the vehicle; generating a driving-related message based at least on the description related to the target object, the driving-related message including at least one of a perception message, a decision-planning message, and a control message; and providing the driving-related message to the vehicle for controlling the driving of the vehicle relative to the target object.
  • In a second aspect of the present disclosure, a method of driving control is provided. The method includes: receiving a driving-related message from a device external to the vehicle, the driving-related message being generated based on perception information related to the environment in which the vehicle is located and indicating a description related to a target object outside the vehicle, the driving-related message including at least one of a perception message, a decision-planning message, and a control message; and controlling, based on the driving-related message, the driving of the vehicle in the environment relative to the target object.
  • In a third aspect of the present disclosure, an apparatus for assisting driving control is provided. The apparatus includes: a perception acquisition module configured to obtain, by a device external to the vehicle, perception information related to the environment in which the vehicle is located; a target object determination module configured to determine, based on the perception information, a description related to a target object existing in the environment, the target object being located outside the vehicle; a message generation module configured to generate a driving-related message based at least on the description related to the target object, the driving-related message including at least one of a perception message, a decision-planning message, and a control message; and a message supply module configured to provide the driving-related message to the vehicle for controlling the driving of the vehicle relative to the target object.
  • In a fourth aspect of the present disclosure, an apparatus for driving control is provided. The apparatus includes: a message receiving module configured to receive a driving-related message from a device external to the vehicle, the driving-related message being generated based on perception information related to the environment in which the vehicle is located and indicating a description related to a target object outside the vehicle, the driving-related message including at least one of a perception message, a decision-planning message, and a control message; and a driving control module configured to control, based on the driving-related message, the driving of the vehicle in the environment relative to the target object.
  • In a fifth aspect of the present disclosure, an electronic device is provided, including one or more processors and a storage device for storing one or more programs. The one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to the first aspect of the present disclosure.
  • In a sixth aspect of the present disclosure, an electronic device is provided, including one or more processors and a storage device for storing one or more programs. The one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to the second aspect of the present disclosure.
  • a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the method according to the first aspect of the present disclosure.
  • a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the method according to the second aspect of the present disclosure.
  • In a further aspect of the present disclosure, a system for coordinated driving control is provided, including: a roadside subsystem including the apparatus according to the third aspect of the present disclosure; and an on-board subsystem including the apparatus according to the fourth aspect of the present disclosure.
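The roadside-to-vehicle message flow summarized in these aspects can be sketched in Python. The class names, fields, and dictionary keys below are illustrative assumptions made for the sketch; they are not part of the disclosure:

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Tuple

class MessageType(Enum):
    PERCEPTION = auto()         # describes perceived target objects
    DECISION_PLANNING = auto()  # suggested maneuver or route
    CONTROL = auto()            # concrete control commands

@dataclass
class TargetObject:
    object_id: str
    position: Tuple[float, float]  # (x, y) in a shared map frame (assumed)
    speed: float                   # m/s
    heading: float                 # degrees

@dataclass
class DrivingRelatedMessage:
    msg_type: MessageType
    vehicle_id: str
    targets: List[TargetObject] = field(default_factory=list)

def generate_driving_related_message(vehicle_id, perception):
    """External-device step: turn raw perception records into a perception
    message describing target objects located outside the vehicle."""
    targets = [TargetObject(p["id"], p["pos"], p["speed"], p["heading"])
               for p in perception]
    return DrivingRelatedMessage(MessageType.PERCEPTION, vehicle_id, targets)

# The vehicle side would consume such a message to control its driving
# relative to each target object.
msg = generate_driving_related_message(
    "vehicle-130-1",
    [{"id": "pedestrian-7", "pos": (12.0, 3.5), "speed": 1.4, "heading": 90.0}],
)
print(msg.msg_type.name, len(msg.targets))  # PERCEPTION 1
```

In this sketch only the perception-message branch is shown; decision-planning and control messages would carry different payloads under the same envelope.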
  • FIG. 1 shows a schematic diagram of an example environment in which multiple embodiments of the present disclosure can be implemented
  • FIG. 2 shows a block diagram of a cooperative driving control system according to some embodiments of the present disclosure
  • FIGS. 3A to 3F show schematic diagrams of example scenarios of coordinated driving control according to some embodiments of the present disclosure
  • FIG. 4A is a flowchart of a method for assisting driving control on the roadside according to some embodiments of the present disclosure
  • FIG. 4B is a flowchart of a method for assisting driving control on the roadside according to some embodiments of the present disclosure
  • FIG. 5A is a flowchart of a method for driving control on the vehicle side according to some embodiments of the present disclosure
  • FIG. 5B is a flowchart of a method for driving control on the vehicle side according to some embodiments of the present disclosure
  • FIG. 6A is a schematic block diagram of an apparatus for driving control on the roadside according to some embodiments of the present disclosure
  • FIG. 6B is a schematic block diagram of an apparatus for driving control on the roadside according to some embodiments of the present disclosure
  • FIG. 7A is a schematic block diagram of an apparatus for driving control on the vehicle side according to some embodiments of the present disclosure
  • FIG. 7B is a schematic block diagram of an apparatus for driving control on the vehicle side according to some embodiments of the present disclosure.
  • FIG. 8 shows a block diagram of a device capable of implementing various embodiments of the present disclosure.
  • an improved driving control scheme is proposed. This solution is based on multi-side coordination to realize the driving control of vehicles. Specifically, an external device of the vehicle generates at least one of a perception message, a decision planning message, and a control message based on perception information related to the environment. The generated perception messages, decision planning messages, and/or control messages are provided to the vehicle. The vehicle uses the received message to achieve driving control. In some embodiments, descriptions related to target objects existing in the environment can be determined based on the perception information, especially descriptions related to target objects located outside the vehicle. Driving-related messages are generated based on descriptions related to the target object. Thus, the automatic driving of the vehicle can be controlled relative to the target object, and more flexible and efficient driving control can be realized.
  • an improved cooperative driving control scheme is also proposed. Specifically, the remote device determines whether the target object is in an abnormal motion state based on the perceptual information related to the environment. When it is determined that the target object is in an abnormal motion state, a driving-related message is generated to indicate the abnormal motion state of the target object. The generated driving-related messages are provided to the vehicle for controlling the vehicle to travel on a trajectory that does not collide with the target object. As a result, abnormal objects in the environment that may affect the driving of the vehicle can be detected more accurately and in time, so as to guide the safe driving of the vehicle.
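As an illustration only, an abnormal-motion check of this kind could compare an object's observed speed track against simple kinematic expectations. The function name and thresholds below are assumptions made for the sketch, not the disclosed algorithm:

```python
def is_abnormal_motion(speeds, speed_limit=16.7, jump_limit=4.0):
    """Flag a target object as being in an abnormal motion state when its
    observed speed track (m/s, sampled at a fixed rate) exceeds a speed
    limit or changes implausibly fast between consecutive samples.
    Both thresholds are illustrative placeholders."""
    if not speeds:
        return False
    # speed changes between consecutive observations
    jumps = [abs(b - a) for a, b in zip(speeds, speeds[1:])]
    return max(speeds) > speed_limit or any(j > jump_limit for j in jumps)

print(is_abnormal_motion([5.0, 5.2, 5.1]))   # steady track: False
print(is_abnormal_motion([5.0, 12.0, 3.0]))  # erratic jumps: True
```

When such a check fires, the remote device would emit a driving-related message indicating the abnormal state so the vehicle can plan a non-colliding trajectory.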
  • FIG. 1 shows a schematic diagram of an example traffic environment 100 in which multiple embodiments of the present disclosure can be implemented.
  • One or more vehicles 130-1, 130-2, 130-3 are included in this example environment 100.
  • the vehicles 130-1, 130-2, and 130-3 are collectively referred to as the vehicle 130.
  • As used herein, a vehicle refers to any type of movable machine that can carry people and/or goods. In FIG. 1, the vehicle 130 is illustrated as an automobile. An automobile may be a motor vehicle or a non-motor vehicle, examples of which include but are not limited to cars, trucks, buses, electric vehicles, motorcycles, bicycles, and so on. However, the automobile is only one example of a vehicle; the embodiments of the present disclosure are equally applicable to other means of transportation, such as ships, trains, airplanes, and so on.
  • One or more vehicles 130 in the environment 100 may be vehicles with certain autonomous driving capabilities, which are also referred to as unmanned vehicles.
  • the other one or some vehicles 130 in the environment 100 may be vehicles that do not have automatic driving capabilities.
  • Such vehicles can be controlled by the driver.
  • Integrated or removable devices in one or more vehicles 130 may have the ability to communicate with other vehicles or other devices based on one or more communication technologies, such as vehicle-to-vehicle (V2V) technology, vehicle-to-infrastructure (V2I) technology, vehicle-to-network (V2N) technology, vehicle-to-everything (V2X) technology, or any other communication technology.
  • the vehicle 130 may be equipped with a positioning device to determine its own position, and the positioning device may, for example, realize positioning based on any of the following technologies: positioning technology based on laser point cloud data, Global Positioning System (GPS) technology, GLONASS technology, BeiDou navigation system technology, Galileo positioning system technology, Quasi-Zenith Satellite System (QZSS) technology, base station positioning technology, Wi-Fi positioning technology, etc.
  • the traffic infrastructure includes objects used to guide traffic and indicate traffic rules, such as traffic lights 150-3, traffic signs (not shown), street lights, and so on.
  • Objects located outside the vehicle 130 are collectively referred to as out-of-vehicle objects 105.
  • During the driving control process of a certain vehicle 130, attention needs to be paid to objects outside the vehicle that may affect the driving of that vehicle; such objects are called target objects.
  • One or more external devices of the vehicle 130 are also deployed in the environment 100.
  • the external equipment is usually arranged independently of the vehicle 130 and can be located away from the vehicle 130.
  • As used herein, a device "independent" of the vehicle 130 refers at least to a device that does not move as the vehicle 130 moves.
  • the roadside device 110 and the remote device 120 may be any devices, nodes, units, facilities, etc. that have computing capabilities.
  • the remote device may be a general-purpose computer, a server, a large server, a network node such as an edge computing node, a cloud computing device such as a virtual machine (VM), and any other device that provides computing capabilities.
  • VM virtual machine
  • the vehicle 130 may have a wired and/or wireless communication connection with the roadside device 110.
  • one or more vehicles 130 may have a communication connection 101 with the roadside device 110 and a communication connection 103 with the remote device 120, and there may also be a communication connection 105 between the roadside device 110 and the remote device 120.
  • the vehicle 130 may establish a communication connection with only one of the roadside device 110 and the remote device 120, or may have communication connections with both.
  • the roadside device 110 and/or the remote device 120 can direct or assist the driving of one or more vehicles 130 through signaling transmission and reception.
  • a roadside subsystem 112 (also referred to as a “road subsystem”) is integrated/installed/fixed in the roadside equipment 110, and an on-board subsystem 132 is integrated/installed/fixed in the vehicle 130-1.
  • the roadside subsystem 112 and the vehicle-mounted subsystem 132 communicate with each other to realize driving control of the vehicle 130.
  • Different in-vehicle subsystems 132 can also communicate with each other to realize coordinated driving control.
  • other roadside equipment and vehicles may also be equipped with a roadside subsystem 112 and an onboard subsystem 132, respectively.
  • the roadside device 110 may be deployed near the geographic area where the vehicle 130 is traveling and/or parked, for example, it may be deployed on both sides of the road at certain intervals, or at a predetermined distance from the location where the vehicle 130 may appear.
  • the communication connection 101 between the vehicle 130 and the roadside device 110 may be based on short-range communication technology.
  • the communication connection 101 may also be based on communication technologies of other ranges, and the scope of the present disclosure is not limited in this respect.
  • the remote device 120 may be a networked computing infrastructure.
  • the remote device 120 may be deployed on a computing node in the cloud or in another network environment, such as a remote computing node, a server, or an edge computing device. In a cloud environment, the remote device 120 may sometimes be called a cloud device.
  • the remote device 120 may provide higher computing capability, storage capability, and/or communication capability.
  • the communication connection 103 between the remote device 120 and the roadside device 110 and/or the communication connection 105 with the vehicle 130 may be based on a long-distance communication technology. Although it may not be physically located near the driving area of the vehicle 130, the remote device 120 can still obtain real-time control and/or non-real-time control of the vehicle 130 through a high-speed communication connection.
  • One or more sensing devices 107 are also arranged in the environment 100.
  • the sensor device 107 is arranged independently of the vehicle 130 and is used to monitor the condition of the environment 100 to obtain perceptual information related to the environment 100.
  • the sensing device 107 can sense traffic participants in the environment 100, including but not limited to target objects such as vehicles 130, pedestrians, and cyclists, and detect the type, location, speed, direction and other information of the target object.
  • the sensing device 107 may also be referred to as a roadside sensing device.
  • the roadside device 110 especially the roadside device 110 deployed near the driving environment, may have wired and/or wireless communication connections with one or more sensor devices 107.
  • the farther remote device 120 may also communicate with one or more sensor devices 107, or relay such communication via the roadside device 110.
  • the perception information collected by the sensor device 107 arranged corresponding to the road may also be referred to as roadside perception information.
  • the sensing information of the sensing device 107 can be provided to the roadside device 110 having a communication connection. Although shown as mutually independent devices, in some implementations, the sensing device 107 may also be partially or fully integrated with the roadside devices 110, 120.
  • a plurality of sensing devices 107 may be arranged at certain intervals to monitor a specific geographic area of the environment 100.
  • a movable sensor device 107 such as a movable sensing station, can also be provided.
  • the sensing range of the sensing device 107 is limited.
  • FIG. 1 schematically shows the sensing range 102 of the sensing device 107. Objects or phenomena appearing in the sensing range 102 can be sensed by the sensing device 107.
  • the objects 105-1 to 105-3 outside the vehicle and the vehicles 130-1 and 130-2 are all located within the sensing range 102 of the sensing device 107.
  • the vehicle 130-3 is not in the sensing range 102, it may not be in the sensing range of any sensing device, or may be located in the sensing range of other sensing devices not shown. In some cases, the sensing ranges of a plurality of adjacently deployed sensing devices 107 may partially overlap.
  • the sensing device 107 may be arranged near the area where the vehicle 130 is traveling and/or parking. According to needs, the sensing device 107 can be arranged on the roadside, on the road surface, or deployed at a certain height, for example, fixed at a certain height by a support rod.
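The sensing-range relationships described above (an object inside range 102 is perceivable, and adjacent ranges may overlap) can be sketched as follows; the circular range model and the identifiers are illustrative assumptions, not part of the patent:

```python
import math

def in_sensing_range(device_pos, device_radius, obj_pos):
    """Return True if an object falls within a device's circular sensing range."""
    dx = obj_pos[0] - device_pos[0]
    dy = obj_pos[1] - device_pos[1]
    return math.hypot(dx, dy) <= device_radius

def covering_devices(devices, obj_pos):
    """List the IDs of all sensing devices whose ranges cover the object.

    Overlapping ranges of adjacent devices naturally yield multiple IDs;
    an empty list corresponds to a vehicle outside any sensing range.
    """
    return [dev_id for dev_id, (pos, radius) in devices.items()
            if in_sensing_range(pos, radius, obj_pos)]
```

For example, with two devices 100 m apart whose 100 m ranges overlap, an object between them is reported by both, mirroring the overlapping-range case in the text.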
  • the sensing device 107 may include one or more sensor units, and these sensor units may be of the same type or different types, and may be distributed in the same position or different positions of the sensing range 102.
  • Examples of the sensor unit in the sensing device 107 may include, but are not limited to: image sensors (such as cameras), lidar, millimeter wave radar, infrared sensors, positioning sensors, light sensors, pressure sensors, temperature sensors, humidity sensors, wind speed sensors , Wind direction sensor, air quality sensor, etc.
  • Image sensors can collect image information; lidar and millimeter wave radar can collect laser point cloud data; infrared sensors can use infrared to detect environmental conditions; positioning sensors can collect object position information; light sensors can collect measurement values indicating the light intensity in the environment; pressure, temperature, and humidity sensors can collect measurement values indicating pressure, temperature, and humidity respectively; wind speed and wind direction sensors can collect measurement values indicating wind speed and wind direction respectively; air quality sensors can collect measurement values of indicators related to air quality, such as the oxygen concentration, carbon dioxide concentration, dust concentration, and pollutant concentration in the air. It should be understood that only some examples of sensor units are listed above. According to actual needs, other types of sensors may also exist.
  • FIG. 1 the facilities and objects shown in FIG. 1 are only examples. The type, number, and relative arrangement of objects that appear in different environments may vary. The scope of the present disclosure is not limited in this respect.
  • the environment 100 may have more roadside devices 110 and sensing devices 107 deployed on the roadside to monitor additional geographic locations.
  • the embodiments of the present disclosure may involve multiple remote devices 120, or may not involve remote devices 120.
  • equipment outside the vehicle, including roadside equipment or remote equipment, is configured to work with the vehicle to provide partial or full driving control of the vehicle.
  • Part or all of the driving control is realized by providing perception messages, decision planning messages and/or control messages to the vehicle.
  • the roadside subsystem of the roadside device (or the remote device) and the vehicle-mounted subsystem on the vehicle realize cooperative driving control.
  • In this way, the driving control function of the vehicle can be realized, achieving effective and safe automatic driving.
  • FIG. 2 shows a schematic block diagram of a cooperative driving control system 200 according to some embodiments of the present disclosure.
  • the system 200 involves the vehicle 130, the roadside device 110, and the roadside sensor device 107 of FIG. 1.
  • the remote device 120 may also perform driving control together with or instead of the roadside device 110, especially when the communication rate between the remote device 120 and the vehicle 130 allows. Therefore, the functions described below for the roadside device 110, especially the roadside subsystem 112, can also be implemented at the remote device 120 accordingly.
  • cooperative driving control may also involve traffic infrastructure in the driving environment, such as traffic lights 150-3.
  • the roadside subsystem 112 of the roadside equipment 110 includes a roadside unit (RSU) 212 and a driving control module 214.
  • the onboard subsystem 132 in the vehicle 130 includes an onboard unit (OBU) 232 and a driving control module 234, and optionally includes one or more on-board sensing devices 236 arranged in association with the vehicle 130.
  • the RSU 212 is configured to communicate with devices and/or components outside the roadside subsystem 112, such as with other roadside subsystems 112, remote devices 120, and/or on-board subsystems 132, etc.
  • the OBU 232 is configured to communicate with devices and/or components external to the on-board subsystem 132, for example, with the roadside subsystem 112, other on-board subsystems 132, and/or remote devices 120.
  • the driving control module 214 in the roadside subsystem 112 is configured to process information related to driving control and generate driving-related messages to be transmitted by the RSU 212 for driving control of the vehicle 130.
  • the driving control module 234 in the in-vehicle subsystem 132 is configured to process information related to driving control, obtain driving-related messages, and control the driving of the vehicle 130 based on the information processing results and/or the obtained driving-related messages.
  • the driving-related message may have any format that conforms to the communication technology used between the roadside subsystem 112 and the on-board subsystem 132.
  • the driving-related messages may include at least one of the following: a perception message 202 indicating the result of environmental perception, a decision planning message 204 indicating a decision plan of the vehicle 130 while driving, and a control message 206 indicating a specific driving operation of the vehicle 130 while driving.
  • Generally, each later type of message depends on the information contained in the previous type. For example, decision planning is made at least on the basis of the results of environmental perception, and control messages are usually made on the basis of decision planning or directly on the basis of environmental perception results.
  • the vehicle 130 may be equipped with a drive-by-wire system, including a braking system, a steering system, a transmission system, and body control of the vehicle, which can control acceleration, deceleration, steering, lights, hazard flashers, etc. of the vehicle.
  • From perception messages to decision planning messages to control messages, the driving control of the vehicle 130 is specified increasingly concretely, and the requirements on the automatic driving control capability of the vehicle 130 itself decrease accordingly.
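The three layered message types described above can be sketched as simple data structures; the field names below are illustrative assumptions (the patent defines the actual contents in its Tables 1 to 3), chosen only to show how each layer refines the previous one:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PerceptionMessage:          # message 202: result of environmental perception
    timestamp: float
    target_objects: List[dict]    # e.g. {"type": "pedestrian", "position": (x, y), "speed": v}
    road_conditions: Optional[dict] = None
    traffic_lights: Optional[dict] = None

@dataclass
class DecisionPlanningMessage:    # message 204: decision plan derived from perception
    timestamp: float
    vehicle_id: str
    trajectory_points: List[tuple]             # planned path points
    expected_times: List[float] = field(default_factory=list)

@dataclass
class ControlMessage:             # message 206: concrete driving operation
    timestamp: float
    vehicle_id: str
    target_speed: float           # kinematic control
    steering_angle: float         # dynamic control of the steering system
```

Each type answers a narrower question than the last: what is around the vehicle, what should the vehicle do, and exactly how should the actuators move.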
  • the specific types of driving-related messages provided may depend on various triggers, such as at least one of time-based triggers, location-based triggers, and event-based triggers.
  • the time-based trigger may be, for example, a periodic broadcast or the sending of one or more predetermined types of driving-related messages within a specific time period, such as sending a perception message. All vehicles 130 that have established a communication connection with the roadside device 110 and are within the communication range can receive the broadcast driving-related messages, particularly the perception message 202.
  • the location-based trigger may, for example, trigger one or more predetermined types of driving-related messages based on the location of the vehicle 130. For example, it can be determined whether the vehicle 130 is in a predetermined area (for example, a fork in the road), in a specific road section, and/or within a predetermined threshold distance from a reference object (for example, the roadside device 110), and the predetermined types of driving-related messages are sent when the condition is determined to be satisfied. Under such triggering conditions, a perception message 202, a decision planning message 204, and/or a control message 206 may be sent.
  • the event-based trigger may include, for example, a request from the on-board subsystem 132.
  • Upon receiving such a request, the roadside subsystem 112 may send the specific types of driving-related messages indicated by the request.
  • the event-based trigger may also include detecting the occurrence of a predetermined event related to the vehicle 130. For example, if it is detected that the vehicle 130 is in a dangerous state, a fault state, or a power shortage state, a decision planning message 204 and a control message 206 may be sent to deal with the dangerous or fault state and reduce the calculation and power consumption of the vehicle 130. Other events that trigger the sending of one or more driving-related messages may also be defined.
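The three trigger classes above (time-based, location-based, event-based) could be combined in a dispatcher like the sketch below; the period, distance threshold, and the mapping from trigger to message type are illustrative assumptions:

```python
def select_messages(now, last_broadcast, period,
                    vehicle_pos, roadside_pos, distance_threshold,
                    pending_request=None, vehicle_fault=False):
    """Decide which driving-related messages to send, per trigger type.

    Returns a set drawn from {"perception", "decision_planning", "control"}.
    """
    to_send = set()
    # Time-based trigger: periodic broadcast of perception messages.
    if now - last_broadcast >= period:
        to_send.add("perception")
    # Location-based trigger: vehicle within a threshold distance of the roadside device.
    dist = ((vehicle_pos[0] - roadside_pos[0]) ** 2 +
            (vehicle_pos[1] - roadside_pos[1]) ** 2) ** 0.5
    if dist <= distance_threshold:
        to_send.update({"perception", "decision_planning"})
    # Event-based triggers: an explicit request, or a detected fault/danger state.
    if pending_request:
        to_send.update(pending_request)
    if vehicle_fault:
        to_send.update({"decision_planning", "control"})
    return to_send
```

A real roadside subsystem would evaluate such conditions continuously; the point of the sketch is only that the three trigger classes are independent and their selected message types accumulate.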
  • the RSU 212 in the roadside subsystem 112 is configured to obtain sensing information related to the detected environment 100 from one or more environmental sensing sources, and generate driving-related messages 202, 204, and 206 based on the sensing information.
  • the perception information may indicate one or more aspects of the environment 100 and the objects present in it, depending on the ability of the environment's perception source.
  • the perception information includes at least information related to objects existing in the environment 100.
  • the environmental perception source may include one or more roadside sensing devices 107 that provide roadside perception information 220 to the roadside subsystem 112.
  • roadside sensing devices 107 can provide a better viewing angle and a more comprehensive and accurate environmental perception. Due to the existence of the roadside sensing device 107, even when the vehicle 130 has no perception ability, has only limited perception ability, and/or has a perception blind zone due to the occlusion of other objects, it can still complete the automatic driving function by relying on roadside perception, thereby realizing low-cost safe automatic driving.
  • the environmental perception source may also include the vehicle-mounted sensor devices 236 on the vehicles 130, including the target vehicle 130 to be controlled and other vehicles 130, which provide vehicle perception information 224 to the roadside subsystem 112.
  • the type, number, and/or sensing capabilities of the vehicle-mounted sensor devices 236 may be the same as or different from the one or more roadside sensor devices 107.
  • the environment perception source may further include a remote device 120, which may obtain perception information related to the environment 100 from a database and/or from other devices. By synthesizing the perception information from different environmental perception sources, a collaborative perception of the environment 100 can be realized to perceive the environment 100 more comprehensively. Such collaborative perception can make up for the limitations of one or more aspects of perception accuracy, perception perspective, and perception types of different environmental perception sources.
  • One or more environmental perception sources can use corresponding types of sensing devices to monitor static and/or dynamic objects in the environment 100, such as pedestrians, cyclists, vehicles, and objects protruding from the road, and can also detect traffic-related infrastructure, such as traffic lights and traffic signs. Alternatively or in addition, the environmental perception source can also monitor the road surface conditions, road traffic conditions, weather conditions in the environment, geographic areas, monitoring and diagnosis of target vehicles, and so on.
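Collaborative perception, as described above, merges detections from roadside and on-board sources. A naive fusion sketch is shown below; the distance-based matching rule and the confidence fields are illustrative assumptions standing in for the data fusion technologies the patent mentions:

```python
def fuse_detections(roadside, vehicle, match_radius=2.0):
    """Naively fuse two detection lists into one cooperative perception result.

    Each detection is a dict with "position" (x, y) and "confidence".
    Detections closer than match_radius are treated as the same object,
    keeping whichever source reported higher confidence.
    """
    fused = list(roadside)
    for det in vehicle:
        matched = False
        for i, existing in enumerate(fused):
            dx = det["position"][0] - existing["position"][0]
            dy = det["position"][1] - existing["position"][1]
            if (dx * dx + dy * dy) ** 0.5 <= match_radius:
                matched = True
                if det["confidence"] > existing["confidence"]:
                    fused[i] = det       # prefer the more confident report
                break
        if not matched:
            fused.append(det)            # vehicle saw something the roadside missed
    return fused
```

This illustrates how fusion can compensate for blind zones: a detection present in only one source survives in the fused result, while duplicate reports of one object are collapsed.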
  • the sensing information obtained by the RSU 212 from the environmental sensing source may be the original sensing data collected by the sensing device, the partially processed sensing data, and/or the sensing result.
  • the perception information from the on-board subsystem 132 may be raw perception data, or perception data or perception results that are partially or fully processed by the driving control module 234.
  • the obtained perception information is provided to the driving control module 214.
  • the driving control module 214 is configured to perform analysis based on the perception information. Specifically, the driving control module 214 may be configured to recognize objects that may appear in the environment 100 based on the perception information, and determine information related to the recognized objects. The driving control module 214 may also be configured to determine a perception result of one or more other aspects of the environment 100 and/or the vehicle 130 in the environment 100 based on the perception information. The driving control module 214 may use various data analysis technologies such as data fusion technology to process the perception information. In some embodiments, by analyzing the perception information, the driving control module 214 may generate the perception message 202, which includes the analysis result of the perception information. The sensing message 202 may be provided to the OBU 232 in the on-board subsystem 132 via the RSU 212.
  • the driving control module 234 in the in-vehicle subsystem 132 may perform driving control of the vehicle 130 based on the perception message 202.
  • the driving control module 234 may generate a decision plan of the self-vehicle based on the perception message 202, and control the driving operation of the self-vehicle based on the generated decision plan.
  • the vehicle 130 has environmental awareness capabilities, for example, it is integrated with a vehicle-mounted sensor device 236, and/or can obtain sensing information from other devices (such as other vehicles 130) other than the roadside device 110, then The driving control module 234 may perform driving control of the vehicle 130 based on the perception message 202 received from the RSU 212 together with other perception information.
  • perception messages/information from different sources can be processed through technologies such as data fusion.
  • the driving control module 234 may directly use the received perception message 202 as the input of the decision control of the vehicle 130.
  • the perception message 202 may include instruction information related to the environment 100 and/or one or more other aspects of the vehicle 130 in the environment 100.
  • the perception message 202 may include one or more of the following information: descriptions related to a target object existing in the environment 100, indicating at least one of the classification, location information, speed information, direction information, physical appearance description information, specific type, motion state, retention time of the motion state, perceived confidence, historical motion trajectory information, predicted motion trajectory information, and state tracking information of the target object; information related to the physical condition of the road in the environment 100, indicating at least one of the physical condition of the road surface and structured information of the road; information related to the traffic facilities in the environment 100, indicating at least one of the state of signal lights and traffic signs on the road; road traffic condition information in the environment 100, indicating at least one of lane signs, traffic flow, and traffic incidents related to the road and/or lanes in the road; and information related to weather conditions in the environment 100.
  • the perception message 202 may also include auxiliary information related to the positioning of the vehicle 130.
  • the perception message 202 may further include map information.
  • the map information may, for example, indicate at least one of the identifier of the map, the update method of the map, the area to be updated, and the location information.
  • the map information can indicate a high-precision map that meets the requirements of the autonomous driving scene.
  • the identifier of the map may include, for example, the version number of the map, the update frequency of the map, and so on.
  • the update method of the map may, for example, indicate the update source of the map (from the remote device 120 or from other external sources), update link, update time, and so on.
  • the perception message 202 may also include information related to the parking area of the vehicle 130.
  • the information related to the parking area may include related information of parking points (for example, parking spaces) in the parking area, target object information in the parking area, and the like.
  • the information related to the parking area may include, for example, the predetermined number of the parking point in the parking area, the location information of the parking point, the description of the physical state of the parking point, the free status of the parking point, and/or the parking area Information about entrances and exits.
  • the information related to the parking area may be provided by the remote device 120 to the roadside device 110, for example.
  • the roadside device 110 includes such information in the perception message 202 and provides it to the vehicle 130 to assist the vehicle 130 in finding a suitable parking point as soon as possible.
  • the roadside device 110 may not include the information related to the parking area in the perception message 202, but generate the subsequent decision planning message 204 and/or the control message 206 based on such information to control the vehicle 130 Driving and/or parking in a parking area.
  • Motion state (with remarks):
    "still": speed is zero or less than a threshold;
    "movement": has a certain speed (greater than the threshold);
    other motion states may be further subdivided by moving speed.
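The motion-state classification of a perceived target object can be sketched as a simple threshold function; the specific threshold values and the "moving_fast" subdivision below are illustrative assumptions:

```python
def classify_motion_state(speed, still_threshold=0.5, fast_threshold=15.0):
    """Map a perceived speed (m/s) to a coarse motion-state label.

    "still" covers zero or near-zero speed; states above the still
    threshold may be further subdivided by speed, as the text notes.
    """
    if speed <= still_threshold:
        return "still"
    if speed < fast_threshold:
        return "moving"
    return "moving_fast"   # an example of a subdivided state
```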
  • the perception message 202 may also include other content, including less or more content.
  • the perception message 202 provided by the roadside subsystem 112 to the on-board subsystem 132 may be all or part of the content in the above Table 1.1 to Table 1.4, or may include other content.
  • the sensing message 202 or its constituent elements may be compressed, for example, to further reduce the amount of data transmitted between the roadside and the vehicle.
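The patent does not mandate a particular compression scheme; as one concrete sketch of compressing a perception message before roadside-vehicle transmission, JSON serialization plus zlib could be used (both choices are assumptions for illustration):

```python
import json
import zlib

def pack_perception_message(message: dict) -> bytes:
    """Serialize and compress a perception message for transmission."""
    raw = json.dumps(message, separators=(",", ":")).encode("utf-8")
    return zlib.compress(raw)

def unpack_perception_message(payload: bytes) -> dict:
    """Decompress and deserialize a received perception message."""
    return json.loads(zlib.decompress(payload).decode("utf-8"))
```

Perception messages with many similarly structured target-object entries compress well, which is exactly the case where reducing roadside-vehicle data volume matters.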
  • the in-vehicle subsystem 132 on one vehicle 130 may also provide perception data to the in-vehicle subsystems 132 on other vehicles 130 (for example, through a communication connection between OBUs), for example, providing the perception data in the format of the perception message 202.
  • the perception message 202 exchanged between the on-board subsystems 132 may mainly include descriptions related to the target object, for example, the descriptions related to the target object designed in Table 1.1 to Table 1.4 above.
  • the perception messages exchanged between the on-board subsystems 132 may be listed separately in Table 1.5 below, for example.
  • the decision planning message 204 may be generated by the driving control module 214 of the roadside subsystem 112 based on the perception information.
  • the driving control module 214 processes the perception information, determines the perception result, and generates a decision planning message 204 based on the perception result.
  • the driving control module 214 may perform planning at the road or lane level, making a unified plan for the vehicles 130 on the same road or lane.
  • the driving control module 214 may also perform planning according to the vehicle level. Due to the more comprehensive perception information of the environment 100, the roadside subsystem 112 can consider the conditions of all vehicles or traffic participants in a certain geographic area to determine a more reasonable decision plan.
  • When it is determined that the roadside end is required to intervene in the driving decision or planning of the vehicle 130, or when the roadside subsystem 112 takes over the driving control of the vehicle 130 (for example, by geographic area, upon request of the vehicle 130, or after obtaining the permission of the vehicle 130), the roadside subsystem 112 provides a decision planning message 204 to the vehicle-mounted subsystem 132.
  • the decision planning message 204 may be provided to the in-vehicle subsystem 132 in real time or at small time intervals (which may depend on the moving speed of the vehicle 130), for example.
  • the driving control module 234 in the in-vehicle subsystem 132 may perform driving control of the vehicle 130 based on the decision planning message 204. For example, the driving control module 234 may generate self-vehicle control information based on the decision planning message 204, which indicates specific driving operations of the vehicle 130. In some embodiments, the in-vehicle subsystem 132, such as the driving control module 234 therein, may determine whether to use the decision planning message 204 to control the vehicle 130. If the decision planning message 204 is not used, the message can be discarded.
  • the driving control module 234 can fully comply with the decision plan for the vehicle 130 indicated in the decision planning message 204, or alternatively, it can formulate a local decision plan in conjunction with local strategies and with reference to the decision plan indicated in the message 204.
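The on-board choice described above, adopting the roadside plan, discarding it, or falling back on a local plan, can be sketched as a small arbitration function; the freshness check and field names are illustrative assumptions about what "determine whether to use the message" might check:

```python
def arbitrate_plan(roadside_plan, local_plan, now, max_age=0.5):
    """Choose the plan the vehicle will execute.

    Adopt the roadside decision plan when it is fresh and addressed to
    this vehicle; otherwise discard it and use the locally generated plan.
    """
    if roadside_plan is None:
        return local_plan
    if now - roadside_plan["timestamp"] > max_age:
        return local_plan                      # stale message: discard it
    if roadside_plan.get("vehicle_id") != local_plan["vehicle_id"]:
        return local_plan                      # not addressed to this vehicle
    return roadside_plan
```

A fuller implementation might blend the two plans rather than pick one, matching the "local strategy with reference to the message" option in the text.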
  • the embodiments of the present disclosure are not limited in this respect.
  • the roadside subsystem 112 can obtain global environment perception through various environment perception sources (including but not limited to road traffic status information, road physical state related information, infrastructure information, map information, etc.), and can therefore make more reasonable and effective use of road resources when planning the driving of the vehicle 130, ensuring safer and more effective driving.
  • the decision planning message 204 may include one or more of the following information: an indication of the driving road to which the decision is to be applied; start time information and/or end time information of the decision plan; start position information and/or end position information of the decision plan; the identification of the vehicle targeted by the decision plan; decision information related to the driving behavior of the vehicle; decision information related to the driving action of the vehicle; information on the trajectory points of the path planning; the expected times of the trajectory points of the path planning; information related to other vehicles involved in the decision planning; map information; and time information.
  • the map information may, for example, indicate at least one of the identification of the map, the update method of the map, the area to be updated, and the location information.
  • the identifier of the map may include, for example, the version number of the map, the update frequency of the map, and so on.
  • the update method of the map may, for example, indicate the update source of the map (from the remote device 120 or from other external sources), update link, update time, and so on.
  • the decision planning message 204 may also include other content, including less or more content.
  • the decision planning message 204 provided by the roadside subsystem 112 to the onboard subsystem 132 may be all or part of the content in Table 2 above, or may include other content.
  • the decision planning message 204 or its constituent elements may be compressed, for example, to further reduce the amount of data transmitted between the roadside and the vehicle.
  • the control message 206 may be generated by the driving control module 214 of the roadside subsystem 112 based on the decision planning information.
  • the control message 206 may specify a specific driving operation of the vehicle 130.
  • When it is determined that the roadside end is required to intervene in the driving decision or planning of the vehicle 130, or when the roadside subsystem 112 takes over the driving control of the vehicle 130 (for example, by geographic area, upon the request of the vehicle 130, or after obtaining the permission of the vehicle 130), the roadside subsystem 112 provides a control message 206 to the onboard subsystem 132.
  • the control message 206 may be provided to the on-board subsystem 132 in real time or at small time intervals (which may depend on the moving speed of the vehicle 130), for example.
  • the driving control module 234 in the in-vehicle subsystem 132 may perform driving control of the vehicle 130 based on the control message 206. For example, the driving control module 234 may determine a specific driving operation of the vehicle 130 based on the control message 206. In some embodiments, the in-vehicle subsystem 132, such as the driving control module 234 therein, may determine whether to use the control message 206 to control the vehicle 130. In the case that the control message 206 is not used, the message can be discarded.
  • the driving control module 234 can fully comply with the control operation for the vehicle 130 indicated in the control message 206, or alternatively, it can determine a local control operation in conjunction with local policies and with reference to the control operation indicated by the message 206.
  • the embodiments of the present disclosure are not limited in this respect.
  • the roadside subsystem 112 can obtain global environment perception through various environment perception sources (including but not limited to road traffic status information, road physical state related information, infrastructure information, map information, etc.), and can therefore make more reasonable and effective use of road resources when determining control operations for the vehicle 130, ensuring the safe driving of autonomous vehicles under special needs.
  • the control message 206 may include information related to one or more of the following: kinematic control information related to the movement of the vehicle; dynamic control information related to at least one of the power system, transmission system, braking system, and steering system of the vehicle; control information related to the riding experience of the occupants in the vehicle; control information related to the traffic warning system of the vehicle; and time information.
  • For ease of understanding, some specific examples and descriptions of the different types of information in the control message 206 are given in Table 3 below.
  • control message 206 may also include other content, including less or more content.
  • the control message 206 provided by the roadside subsystem 112 to the on-board subsystem 132 may be all or part of the content in Table 3 above, or may include other content.
  • the control message 206 or its constituent elements may be compressed, for example, to further reduce the amount of data transmitted between the roadside and the vehicle.
  • the on-board subsystem 132 at the vehicle 130 may also provide the vehicle perception information 224 to the roadside subsystem 112 for the driving control module 214 to use to generate driving-related messages.
  • the vehicle perception information 224 may be sent directly by the OBU 232 to the RSU 212, or may be processed to generate a perception message in a certain format.
  • the perception message provided by the on-board subsystem 132 includes information related to the environment detected by the on-board sensing device 236, and the types of content therein may include one or more of the types of content in the perception message 202 sent from the roadside subsystem 112 to the on-board subsystem 132.
  • the perception message provided by the on-board subsystem 132 may include a description related to a target object existing in the environment 100, indicating at least one of the type, location information, speed information, direction information, physical appearance, historical trajectory, and predicted trajectory of the target object.
  • the perception message provided by the on-board subsystem 132 may alternatively or additionally include: information related to weather conditions in the environment; auxiliary information related to the positioning of the vehicle 130; information such as road traffic status in the environment; diagnosis information for failures of the vehicle 130; information related to over-the-air upgrades of the software system in the vehicle 130; and/or information related to a parking area of the vehicle 130.
  • the information related to the parking area of the vehicle 130 may be detected by the on-board sensing device 236 when the vehicle 130 is close to the parking area.
  • based on the perception information of the sensing device 236, the on-board subsystem 132 can determine the predetermined number of a parking spot in the parking area, the location information of the parking spot, and/or the idle status of the parking spot, etc., as the information related to the parking area.
  • the driving control module 214 in the roadside subsystem 112 can more accurately control the driving and/or parking of the vehicle 130 in the parking area.
  • For example types of specific content contained in the perception message from the vehicle 130 (for example, the OBU 232) to the roadside equipment 110 (the RSU 212), refer to the perception message 202 shown in Table 1.1 to Table 1.4.
  • the roadside subsystem 112 may also obtain other types of information that the vehicle 130 can provide (for example, received from the OBU 232 through the RSU 212).
  • the acquired information can be used as auxiliary information for determining the decision planning message 204 and/or the control message 206 by the driving control module 214 of the roadside subsystem 112.
  • Such auxiliary information can be generated and transmitted in any message format.
  • the roadside subsystem 112 can obtain real-time operating information of the vehicle 130, for example.
  • the real-time operating information is related to the current operating status of the vehicle 130, and may include, for example, at least one of location information, driving direction information, driving route information, driving speed information, operating status information, and component status information of the vehicle 130.
  • the auxiliary information obtained from the on-board subsystem 132 may also include auxiliary planning information of the vehicle 130, which indicates the planning status of the vehicle 130.
  • the auxiliary planning information may include, for example, at least one of an indication of the driving intention of the vehicle 130, planned driving route information, and speed limit information.
  • the auxiliary information obtained from the on-board subsystem 132 may also include body information of the vehicle 130 to describe the physical state and appearance state of the vehicle 130 itself.
  • the vehicle body information may include at least one of the identification, type, descriptive information, current driving route information, and fault-related information of the vehicle 130, for example.
  • the auxiliary information may also include at least one of a special vehicle type and a right of way requirement level, for example.
  • the auxiliary information may also include other content, including less or more content.
  • the auxiliary information provided by the roadside subsystem 112 to the on-board subsystem 132 may be all or part of the content in Table 4 above, or may include other content.
  • the auxiliary information from the vehicle or its constituent elements may be subjected to processing such as data compression to further reduce the amount of data transmitted between the roadside and the vehicle.
  • the above describes the cooperative driving control system and some examples of message interaction and message composition.
  • the message interaction between the roadside subsystem 112 and the on-board subsystem 132 for realizing cooperative driving control may be different, and other information interaction may also exist.
  • the following will refer to some example scenarios to describe the implementation of cooperative driving control on the roadside, the vehicle side, and possibly the cloud.
  • vehicles with autonomous driving capabilities can obtain the current status of the signal lights through sensing means, and pass through the intersection in accordance with the traffic rule of "stop at red light, go at green light".
  • each self-driving vehicle may rely only on its own decision-making control, which leads to problems of low traffic efficiency, for example due to continuous "games" between the various self-driving vehicles.
  • external equipment, such as the roadside equipment 110, may perceive the road traffic conditions of the surrounding roads based on roadside perception information and possibly other sources of perception information, and perform global driving control over the passage of vehicles according to the overall traffic conditions, which can improve the traffic strategy and achieve more effective, reasonable, and safe driving control.
  • the roadside subsystem 112 of the roadside device 110 may send decision planning messages and/or control messages to the vehicles 130 on the road to guide the passage of the vehicles 130.
  • decision planning messages may be specific to one or more lanes in the environment 100, or specific to the vehicle 130. If a control message is to be sent, the control message will be specific to the vehicle 130.
  • FIG. 3A shows cooperative driving control under an example scene 310 of an intersection.
  • multiple vehicles (i.e., vehicles 130-1 to 130-8) are driving toward an intersection, and the roadside subsystem 112 can implement lane-level decision planning.
  • the OBU 232 of the onboard subsystem 132 of one or more vehicles 130 may send the vehicle perception information 224 to the roadside subsystem 112 of the roadside device 110, and may also send information such as One or more auxiliary information such as real-time operation information, auxiliary planning information, vehicle body information, and special vehicle instruction information.
  • the RSU 212 of the roadside subsystem 112 receives the information sent by the OBU 232, and also receives information from the roadside sensing device 107 and possibly from the remote device 120, thereby obtaining global environmental information.
  • the in-vehicle subsystem 132 may also indicate the specific needs of the vehicle 130 to the roadside subsystem 112 through sensing messages.
  • Such perception messages may include auxiliary information of the vehicle 130, such as the auxiliary information listed in Table 4, for example.
  • the roadside subsystem 112 may determine driving-related messages based on the collected perception information. If the generated driving-related message includes a perception message 202, the environment perception result indicated in the perception message 202 may be more accurate and comprehensive than the environment perception result sensed by a single vehicle 130.
  • the driving control module 214 of the roadside subsystem 112 may generate the decision planning message 204 based on information from multiple sources to implement decision-making and planning at the road, lane, or vehicle level, so as to achieve effective and reasonable intersection traffic.
  • the driving control module 214 generates lane-specific decision planning messages.
  • the decision planning message may indicate that different lanes are assigned different rights of way, that is, the right of passage for the vehicles 130 in each lane, and may also indicate the corresponding start time and end time of the right of way.
  • the right-of-way related information of the lanes may be obtained, for example, from the remote device 120, such as a system of the traffic management department.
  • the roadside subsystem 112 determines that the lanes in which the vehicles 130-1 and 130-2 are located have the right of way to go straight through the intersection, and also specifies the start time and end time of the straight-going.
  • the decision planning message may be transmitted to the on-board subsystem 132 within the time, and at the starting position, for which it is determined that the vehicle 130 is under roadside control.
  • the lane-specific decision planning message may be broadcast. After receiving it, a vehicle 130 determines, according to the lane it is in, whether to use the received decision planning message.
  • the roadside subsystem 112 may first determine that a specific vehicle 130 is in a certain lane, and then send driving-related messages specific to the lane toward the vehicle 130.
  • the vehicle 130 can be controlled according to the decision planning strategy indicated in the decision planning message 204 to ensure that the vehicle 130 drives in accordance with the decision planning strategy of the roadside subsystem 112. For example, in FIG. 3A, after receiving the decision planning message, the vehicles 130-1 and 130-2 may begin to go straight through the intersection at the start time indicated by the decision planning message. When the end time indicated in the decision planning message is reached, the vehicles 130 may stop driving through the intersection.
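The lane-level gating described above can be sketched as follows. The message fields and the matching rule are assumptions for illustration: the broadcast reaches every vehicle, and each one decides locally whether the message applies to its own lane and to the indicated right-of-way window.

```python
from dataclasses import dataclass

@dataclass
class LaneDecisionMessage:
    # Hypothetical fields of a lane-specific decision planning broadcast.
    lane_id: str
    action: str          # e.g. "go_straight"
    start_time_ms: int   # start of the right of way
    end_time_ms: int     # end of the right of way

def should_apply(msg: LaneDecisionMessage, my_lane: str, now_ms: int) -> bool:
    """A vehicle uses the broadcast only if it is in the indicated lane and the
    current time falls inside the right-of-way window."""
    return msg.lane_id == my_lane and msg.start_time_ms <= now_ms < msg.end_time_ms

msg = LaneDecisionMessage("lane-2", "go_straight", 1000, 5000)
```

A vehicle in lane-2 at time 1500 would apply the message; the same vehicle at time 5000, or a vehicle in lane-3, would ignore it.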
  • driving related messages may be specific to the vehicle 130.
  • the perception message may include a description of the target object related to the vehicle 130, and the decision planning message may be customized for the individual vehicle 130.
  • the control message is completely used to control the driving of a specific vehicle 130.
  • the on-board subsystem 132 on the vehicle 130 can perform driving control accordingly.
  • the driving control module 214 may directly generate a control message 206 to indicate a specific control operation on the vehicle 130, and then the RSU 212 provides the control message 206 to the OBU 232.
  • FIG. 3B shows coordinated driving control under an example scene 312 of an intersection.
  • multiple vehicles are driving toward an intersection, and the roadside subsystem 112 can implement vehicle-level decision planning.
  • the driving control module 214 of the roadside subsystem 112 can plan the vehicle 130-1 to turn right, and the vehicle 130-3 to go straight through the intersection.
  • the driving control module 214 generates a decision planning message specific to the vehicle 130-1 to guide the vehicle 130-1 to perform a right turn operation.
  • the driving control module 214 also generates a decision planning message specific to the vehicle 130-4 to guide the vehicle 130-1 to go straight.
  • the roadside subsystem 112 may also control the traffic signal lights 150-3 on the roadside to jointly implement the passage strategy. In some embodiments, the decision planning of the roadside subsystem 112 can also be implemented to allow the safe and efficient passage of vehicles on roads without signal lights.
  • the application of the decision planning message 204 or the control message 206 means that part or all of the driving control of the vehicle 130 will be executed by an external device, such as the roadside device 110.
  • such partial driving control or all driving control is activated only when it is determined that the roadside device 110 can take over the vehicle 130.
  • the takeover of the vehicle 130 may be triggered for many reasons. For example, it may be because the vehicle 130, especially the vehicle 130 in the automatic driving mode, cannot cope with the current driving scenario, such as a software and/or hardware failure, or because normal driving can only be resumed through coordinated operation with other vehicles (for example, the vehicle has stopped driving because it is surrounded by surrounding vehicles), and so on.
  • the intervention of roadside equipment or other vehicles can help the vehicle 130 "get out of trouble” without manual intervention, which improves the automatic operation capability of the self-driving vehicle.
  • the vehicle 130 may request the roadside device 110 to take over for any other reasons, for example, in order to realize automatic parking, and so on.
  • FIG. 3C shows the cooperative driving control under an example scenario 320 of taking over the vehicle 130.
  • the vehicle 130-1 is blocked by surrounding vehicles or malfunctions, and, for example, stops automatic driving on the road.
  • the on-board subsystem 132 in the vehicle 130-1 may send a takeover request message to the roadside subsystem 112 to request to take over the driving control of the vehicle 130.
  • the driving control module 214 of the roadside subsystem 112 can thus provide the control message 206 to the vehicle 130-1.
  • the takeover request message may request only a partial takeover of the driving control of the vehicle.
  • the driving control module 214 can provide a decision planning message 204 so that the vehicle 130-1 can drive according to the decision plan.
  • the driving control module 214 may also provide a control message 206 to provide overall control over the driving of the vehicle 130-1.
  • FIG. 3D also shows the cooperative driving control under the example scenario 330 of taking over the vehicle 130.
  • the vehicle 130-1 is blocked by the other surrounding vehicles 130-2 and 130-3, and cannot leave the predicament through its own self-driving decisions.
  • the on-board subsystem 132 in the vehicle 130-1 may send a takeover request message to the roadside subsystem 112 to request to take over part or all of the driving control of the vehicle 130-1.
  • the roadside subsystem 112 sends a takeover response message to the on-board subsystem 132 to indicate whether the roadside at least partially takes over the driving control of the vehicle 130-1 .
  • the takeover request message indicates the identification information of the vehicle, the travel plan information of the vehicle, the reason for requesting takeover, and time information.
  • the identification information of the vehicle is used to enable the roadside device 110 to identify and/or locate the vehicle 130.
  • the identification information may include, for example, the identification of the vehicle, such as a VIN code, a license plate number, etc., and may also include the type, size, descriptive information, location information, description of surrounding objects, and so on.
  • the driving plan information indicates the driving plan of the vehicle 130, including but not limited to the current driving direction, destination, planned driving route, speed limit information (for example, the maximum allowable speed and/or acceleration).
  • the reason for requesting takeover may, for example, indicate the reason for the vehicle 130 requesting takeover, for example, a failure of its own vehicle, a specific demand such as automatic parking, failure of an automatic driving strategy, and so on.
  • Some examples of the information contained in the takeover request message provided by the on-board subsystem 132 to the roadside subsystem 112 are shown in Table 6 below.
  • the takeover response message provided by the roadside subsystem 112 to the onboard subsystem 132 indicates whether to take over the vehicle at least partially and the start time of the takeover, as listed in Table 7.1 below.
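The takeover handshake described above (request per Table 6, response per Table 7.1) can be sketched as follows. The field names and the acceptance policy are illustrative assumptions; the actual tables define the normative content.

```python
from dataclasses import dataclass

@dataclass
class TakeoverRequest:
    # Illustrative subset of the Table 6 fields.
    vehicle_id: str
    reason: str            # e.g. "fault", "auto_parking", "blocked"
    full_takeover: bool    # False => only a partial takeover is requested

@dataclass
class TakeoverResponse:
    # Illustrative subset of the Table 7.1 fields.
    vehicle_id: str
    accepted: bool
    partial: bool          # True => roadside only issues decision planning messages
    start_time_ms: int

def handle_request(req: TakeoverRequest, now_ms: int) -> TakeoverResponse:
    """Roadside-side sketch: accept the request, and choose decision-planning-only
    control when the vehicle asked for a partial takeover."""
    return TakeoverResponse(req.vehicle_id, accepted=True,
                            partial=not req.full_takeover, start_time_ms=now_ms)

resp = handle_request(TakeoverRequest("V130-1", "blocked", full_takeover=False), 2000)
```

Under a partial takeover the roadside would send decision planning messages 204; under a full takeover it would send control messages 206 as well.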
  • the driving control module 214 of the roadside subsystem 112 can thus provide the decision planning message 204 and/or the control message 206 to the vehicle 130-1 to Used for driving control.
  • the driving control module 214 determines that the smooth driving of the vehicle 130-1 requires the cooperation of one or more other vehicles 130 through the perception information received by each sensor information source. For example, the driving control module 214 determines that the vehicle 130-1 cannot bypass the vehicle 130-2 to continue forward, and the vehicle 130-2 needs to escape a certain space. Thus, the driving control module 214 also generates another decision planning message and/or another control message based on the existing information.
  • the RSU 212 then provides the generated decision planning message and/or control message to the vehicle 130-2.
  • the vehicle 130-2 may perform corresponding driving actions according to the received message, so as to cooperate with the vehicle 130-1 to leave the current blocked state.
  • the roadside subsystem 112 may send a perception message to the on-board subsystem 132 to indicate specific information about the parking area.
  • the perception message can indicate the description of the target object in the parking area, the parking space number, the location of the parking space, the description of the parking space, etc.
  • the decision planning message sent by the roadside subsystem 112 may also specifically indicate the specific driving decision or planning of the vehicle 130 in the parking area.
  • Some example contents of decision planning messages are given in Table 7.3 below, for example.
  • the roadside device 110 or other traffic management nodes may also actively determine to take over one or more vehicles 130 when a need to take over is detected or when facing an emergency situation.
  • if, in response to a takeover request message or on its own initiative, the roadside subsystem 112 determines to at least partially take over the driving control of the vehicle 130, the RSU 212 provides a takeover notification message to the vehicle 130.
  • the takeover notification message indicates takeover related information, such as but not limited to one or more of the takeover start time, takeover end time, takeover type, takeover reason, and takeover strategy.
  • the escape of the vehicle 130 may not rely on the roadside device 110 or the roadside subsystem 112; instead, assistance in escaping may be achieved through interaction between the vehicles 130.
  • the vehicle 130-1 may send a message requesting escape to surrounding vehicles, such as the vehicles 130-2 and 130-3.
  • the escape request message may include, but is not limited to: identification information of the vehicle 130-1, type of the vehicle 130-1, location information of the vehicle 130-1, driving direction information, driving route information, driving speed information, and/or Location information of surrounding vehicles that affect the travel of the vehicle 130-1, and so on.
  • the vehicles 130-2 and 130-3 receive the message requesting escape, and can determine whether their own vehicle will affect the vehicle 130-1. If the vehicles 130-2 and 130-3 determine that their own vehicle may affect the vehicle 130-1, their driving planning strategy can be adjusted to give way to the vehicle 130-1 to help the vehicle 130-1 get out of trouble.
  • the escape response message of the vehicle 130-2 or 130-3 includes at least one of the following: identification information of the vehicle 130-2 or 130-3, an indication of whether the vehicle 130-2 or 130-3 avoids the vehicle, The planned route of the vehicle 130-2 or 130-3, the planned speed of the vehicle 130-2 or 130-3, and so on.
  • Table 8 and Table 9 below list some examples of the information contained in the escape request message and the escape response message, respectively.
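The vehicle-to-vehicle escape exchange above (request per Table 8, response per Table 9) can be sketched as follows. The distance-based blocking check and all field names are illustrative assumptions; a real implementation would reason over trajectories rather than a fixed radius.

```python
import math

def blocks_requester(my_pos, requester_pos, radius_m=5.0):
    """Sketch check on the surrounding vehicle's side: is it close enough to the
    requesting vehicle to plausibly block it? The radius is an assumed threshold."""
    return math.dist(my_pos, requester_pos) <= radius_m

def build_escape_response(my_id, my_pos, requester_pos):
    """Build an escape response with an avoidance indication, mirroring the
    escape-response fields listed above."""
    will_avoid = blocks_requester(my_pos, requester_pos)
    return {
        "vehicle_id": my_id,
        "will_avoid": will_avoid,
        # The planned route/speed would be filled in when actually yielding.
        "planned_route": ["reverse", "pull_aside"] if will_avoid else [],
    }

resp = build_escape_response("V130-2", (2.0, 1.0), (0.0, 0.0))
```

A vehicle far from the requester would answer with `will_avoid` false and keep its current plan.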
  • the perception message 202 and/or the decision planning message 204 may include map information, because the on-board subsystem 132 needs to rely on a map of the environment, especially a high-precision map, to guide the driving of the vehicle when performing specific driving control.
  • the map information may indicate at least one of the identification of the map, the update method of the map, the area to be updated, and the location information.
  • the roadside subsystem 112 may broadcast the map version message, for example, periodically.
  • the map version message includes map version information, the geographic location targeted by the map version, a description of the map, and so on.
  • the need for a map update at the vehicle 130 may be detected by the driving control module 234 on the vehicle side or by the driving control module 214 on the roadside, and/or triggered in response to a map update request message from the vehicle 130 (for example, sent from the OBU 232 to the RSU 212). In response, map information may be included in the generated perception message 202 and/or decision planning message 204.
  • the on-board subsystem 132 may determine to request to update the map.
  • the map update request message may indicate the time information, the identification of the vehicle 130, the type of the vehicle 130, the descriptive information of the vehicle 130, the location information of the vehicle 130, the map version information, the area where the map is requested to be updated, and the map update information. the way.
  • after receiving the map update request message, the roadside subsystem 112 sends the map data to the vehicle according to the corresponding demand of the vehicle 130.
  • the map data may include the map version, the geographic location targeted by the map version, the corresponding area of the map data, and the corresponding update method of the map data.
  • Table 10.1, Table 10.2, and Table 10.3 below respectively list some examples of the content contained in the map version message, the map update message, and the map data.
  • map update request message may also include other content, including less or more content.
  • the map update request message provided by the in-vehicle subsystem 132 to the roadside subsystem 112 may include all or part of the content in Table 10.2 above, or may include other content.
  • the map update request message or its constituent elements may be subjected to processing such as data compression to further reduce the amount of data transmitted between the roadside and the vehicle.
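The map-update exchange above can be sketched as follows. The version-string comparison and the incremental-versus-full choice are illustrative assumptions; the normative request and data fields are those of Table 10.2 and Table 10.3.

```python
def plan_map_update(vehicle_version: str, current_version: str,
                    requested_area: str, supports_incremental: bool) -> dict:
    """Roadside-side sketch of answering a map update request: if the vehicle's
    map is stale, reply with the new version for the requested area, choosing an
    incremental (patch) update when the vehicle supports it, else a full update."""
    if vehicle_version == current_version:
        return {"update_needed": False}
    return {
        "update_needed": True,
        "map_version": current_version,
        "area": requested_area,
        "update_method": "incremental" if supports_incremental else "full",
    }

plan = plan_map_update("v1.2", "v1.4", "intersection-A", supports_incremental=True)
```

Keeping updates area-scoped and incremental ties in with the data-compression concern noted above, since only the changed map tiles cross the roadside-vehicle link.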
  • when driving under real road conditions, if the traffic conditions of the road section ahead can be known in advance, transportation means (such as vehicles) can be better assisted in path planning.
  • Roadside-perception-based traffic status recognition means that, in a mixed traffic environment, the roadside sensing device and the on-board sensing device continuously perceive the surrounding road traffic information; after processing by the roadside subsystem, the traffic flow and congestion of the current road section can be identified in real time, and the recognized traffic condition is sent to the vehicle as a perception message to assist the vehicle in making correct decision control.
  • the vehicle-mounted subsystem can determine the best travel path of the vehicle according to the traffic condition information, so as to realize the safe and efficient driving of the vehicle.
  • the driving control module 214 of the roadside subsystem 112 generates a perception message to include traffic condition information of the road on which the vehicle is located.
  • The traffic status information includes at least one of the following: the density of traffic participants on a section of the road, the average speed of traffic participants on the road section, a description of the lane-level congestion of the road section, the starting position of the road section, and the expected duration of the current traffic condition.
  • the sensing message may be broadcast through the RSU 212 of the roadside subsystem 112 or provided to the vehicle 130 through other communication connections.
  • the vehicle-mounted subsystem 132 of the vehicle 130 can determine the driving plan and control strategy of the vehicle 130 based on the traffic condition information and other perception information or perception results available to the vehicle-mounted subsystem 132. For example, if the in-vehicle subsystem 132 learns from the roadside subsystem 112 that the road section ahead is congested, the driving plan may be determined to change lanes in advance to avoid the congestion ahead.
  • some examples of the perception messages provided by the roadside subsystem 112 to the on-board subsystem 132 are given in Table 11 below.
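The lane-change decision sketched above (change lanes in advance when the section ahead is congested) can be illustrated as follows. The 0-to-1 congestion scale, the threshold, and the function names are assumptions for illustration; the actual lane-level congestion description is defined by the perception message in Table 11.

```python
def pick_lane(congestion_by_lane: dict, current_lane: str,
              threshold: float = 0.7) -> str:
    """Vehicle-side sketch: keep the current lane unless its congestion level
    (assumed scale: 0 = free-flowing, 1 = jammed) exceeds the threshold, in
    which case move to the least congested lane of the section ahead."""
    if congestion_by_lane.get(current_lane, 0.0) <= threshold:
        return current_lane
    return min(congestion_by_lane, key=congestion_by_lane.get)

chosen = pick_lane({"lane-1": 0.9, "lane-2": 0.3, "lane-3": 0.6}, "lane-1")
```

Here the vehicle in the jammed lane-1 would plan an early lane change toward lane-2, while a vehicle in a free-flowing lane would keep its lane.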
  • the vehicle 130 capable of autonomous driving can be remotely monitored during operation, and can be remotely driven by a person or another operator when needed. This is useful when the vehicle needs to complete complex tasks that, due to the limitations of the vehicle's own perception and processing capabilities, require manual remote driving instead of automatic driving. In such an implementation, the vehicle 130 can support switching between remote driving and automatic driving.
  • a remote actuator can be provided, which simulates an actuator used for driving on a general vehicle.
  • the actuator may include components in one or more systems such as the power system, transmission system, braking system, steering system, and feedback system of the vehicle.
  • a remote actuator may be implemented based on software, for example, operated by a driving simulator or a graphical interface.
  • Such a remote actuator can also be partially or completely implemented by hardware.
  • the driving operation on the remote actuator is synchronized to the local actuator of the vehicle 130. That is, the driving operation of the vehicle 130 is initiated and controlled by the remote actuator, and is mapped to the local actuator.
  • the execution operation on the remote actuator may be performed by a human or by another operator.
  • Such remote control may be initiated when, for example, the vehicle 130 cannot perform driving control or is in a stagnant state.
  • the remote actuator or the terminal device at the remote actuator may initiate a remote driving request message for the vehicle 130.
  • the remote driving request message may indicate the start and/or end time and/or location of the remote driving, and the message also includes time information.
  • the remote driving request message may be sent to the roadside device 110, such as the roadside subsystem 112, for example.
  • the RSU 212 of the roadside subsystem 112 may send the remote driving request message to the OBU 232 of the on-board subsystem 132.
  • the roadside device 110 may initiate a control message 206 based on the remote driving.
  • the roadside subsystem 112 determines the remote execution operation on the remote actuator associated with the vehicle 130, and can convert the remote execution operation into the control message 206 based on parameters related to the local actuator of the vehicle 130, so as to control the local actuator to perform the same operation as the remote execution operation.
  • the parameters related to the local actuator of the vehicle 130 may be obtained from the storage device of the remote device 120 or directly from the on-board subsystem 132, for example.
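The remote-to-local conversion above can be sketched for the steering case. The parameter names (steering ratio, road-wheel limit) are assumed examples of "parameters related to the local actuator"; real conversion would cover every actuator group and include rate limiting and safety checks.

```python
def map_remote_to_local(remote_wheel_deg: float, remote_brake_pct: float,
                        steering_ratio: float, max_road_wheel_deg: float) -> dict:
    """Sketch of converting a remote driving operation into a control message
    using vehicle-specific local-actuator parameters: the remote steering-wheel
    angle is divided by the vehicle's steering ratio and clamped to the local
    actuator's mechanical limit."""
    road_wheel = remote_wheel_deg / steering_ratio
    road_wheel = max(-max_road_wheel_deg, min(max_road_wheel_deg, road_wheel))
    return {"steering_angle_deg": road_wheel, "brake_pct": remote_brake_pct}

cmd = map_remote_to_local(remote_wheel_deg=160.0, remote_brake_pct=0.2,
                          steering_ratio=16.0, max_road_wheel_deg=35.0)
```

Clamping to the local limit matters because the remote simulator may allow inputs the vehicle's actuator cannot physically reproduce.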
  • the roadside subsystem 112 may provide an indication of the end of the remote driving to the onboard subsystem 132 of the vehicle 130. At this time, the vehicle 130 returns to the normal driving mode.
  • a device external to the vehicle 130, such as the roadside device 110 or the on-board subsystem 132 on another vehicle 130, determines, based on the perception information related to the environment 100, in particular the perception information related to a target object in the environment 100, whether the target object is in an abnormal motion state.
  • one or more of the driving-related messages provided to the vehicle 130, such as the perception message 202, the decision planning message 204, and the control message 206, will indicate the abnormal motion state of the target object. Based on such driving-related messages, the driving of the corresponding vehicle 130 can be controlled so that the vehicle 130 runs on a moving track that does not collide with the target object.
  • target objects that may threaten the driving safety of the vehicle 130 may include continuously stationary target objects, target objects whose moving direction conflicts with that of the vehicle 130, target objects whose movement speed significantly mismatches that of the vehicle 130, and so on.
  • the sensing and/or computing capabilities of the roadside device 110 may be used to assist the driving control of the vehicle 130, and driving-related messages may be provided to the vehicle 130 to indicate the target object in an abnormal motion state.
  • the target object of interest may be various types of objects that may exist in the environment 100, especially objects on or near the driving road of the vehicle 130, such as various types of vehicles, people, animals, and other objects.
  • an abnormal stationary state refers to remaining stationary continuously for a period of time, which may be a period longer than a certain threshold. Abnormally stationary objects, especially objects stopped on the road, need to be notified to the driving control system of the driving vehicle 130 in a timely and accurate manner in order to make correct driving decisions and control.
  • Part or all of the multiple information sets at multiple time points may be sensed by the roadside sensing device 107 and reported periodically, or may come from the on-board sensing devices 236 of one or more other vehicles 130 passing by the stationary target object.
  • if the on-board subsystem 132 of a certain vehicle performs the judgment of the motion state of the target object, the on-board subsystem 132 can obtain the multiple information sets at multiple time points from the roadside subsystem 112 or from the remote device 120.
  • the roadside subsystem 112 or the related on-board subsystem 132 can determine whether the target object has been stationary. If the target object has been stationary for a period of time, it can be determined that the target object is abnormally stationary.
  • the target object in an abnormally stationary state may be a vehicle in the environment 100, such as a vehicle parked on both sides of the road for a long time.
  • Such long-term parking vehicles are sometimes called "zombie vehicles.”
  • the length of the time period for judging the abnormal static state can be set to a longer time, such as several weeks, months, etc.
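The stationarity judgment above can be sketched over the multiple information sets at multiple time points. The `(timestamp, position)` sample format, the movement tolerance, and the function name are illustrative assumptions; in practice the samples would be merged from periodic roadside reports and passing-vehicle reports, and the threshold could be hours, weeks, or months as noted above.

```python
import math

def is_abnormally_stationary(observations, threshold_s: float,
                             move_eps_m: float = 0.5) -> bool:
    """Sketch of the check: `observations` is a time-ordered list of
    (timestamp_s, (x, y)) samples for one target object. The object is judged
    abnormally stationary if it has not moved more than `move_eps_m` from its
    first observed position over a span of at least `threshold_s` seconds."""
    if len(observations) < 2:
        return False
    t0, p0 = observations[0]
    tn, _ = observations[-1]
    moved = any(math.dist(p, p0) > move_eps_m for _, p in observations[1:])
    return (tn - t0) >= threshold_s and not moved

# An object observed at essentially the same spot over two hours.
obs = [(0.0, (10.0, 5.0)), (3600.0, (10.1, 5.0)), (7200.0, (10.0, 5.1))]
```

A positive result would then be written into the perception message as the "zombie vehicle" indication for other vehicles to plan around.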
  • the roadside subsystem 112 or the on-board subsystems 132 of other vehicles 130 may generate a perception message, which may include all or part of the content listed in Table 1.1 to Table 1.4 above.
  • the generated perception message may include a description related to the target object, in particular an indication of the abnormally stationary state of the target object.
  • the perception message may also specifically indicate a vehicle (ie, a target object) that has been stationary for a long time as a "zombie vehicle.”
  • the roadside subsystem 112 or the in-vehicle subsystem 132 of other vehicles 130 may further generate decision planning messages or control messages for specific vehicles 130.
  • the decision planning message or the control message may also indicate the abnormally stationary state of the target object, which may be achieved by planning the driving trajectory of the vehicle 130, or by controlling the driving actions of the vehicle 130, relative to the target object in the abnormally stationary state. In this way, the vehicle 130 can be prevented from traveling on a trajectory that may collide with the stationary target object.
  • the decision planning message or the control message may guide or control the moving direction, moving speed, or overall driving route of the vehicle 130. For example, if it is determined that there is a target object in an abnormally stationary state in a lane of the road, the decision planning message or the control message can guide the vehicle 130 to change lanes in advance to drive to other lanes, thereby avoiding collision with the target object.
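  • The advance lane-change guidance just described can be sketched as a minimal selection rule; the lane identifiers and the function name are assumptions for illustration, not a prescribed message format:

```python
def plan_lane_change(current_lane, blocked_lanes, available_lanes):
    """Pick a lane for an advance lane change when the current lane
    contains an abnormally stationary target object.

    Returns the current lane unchanged when it is not blocked, otherwise
    the first available lane that is not blocked, or None when every
    lane is blocked (higher-level planning must then slow or stop).
    """
    if current_lane not in blocked_lanes:
        return current_lane
    for lane in available_lanes:
        if lane not in blocked_lanes:
            return lane
    return None
```

A decision planning message could then carry the selected lane so the vehicle 130 changes lanes before reaching the stationary object.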
  • the on-board subsystem 132 of the vehicle 130 may also use the information contained in the perception message received from the outside (for example, the description related to the target object) as input for decision planning and/or control, to determine how to perform driving planning and control, such as controlling the moving direction, speed, and route of the vehicle 130 to avoid a collision with the stationary target object.
  • the roadside subsystem 112 or the on-board subsystem 132 of the other vehicle 130 may broadcast the generated perception message after determining the target object in an abnormally stationary state.
  • the vehicle 130 within the communication range of the roadside subsystem 112 or the onboard subsystem 132 of other vehicles 130 may receive such perception messages.
  • the decision planning message and/or control message generated by the roadside subsystem 112 or the on-board subsystem 132 of the other vehicle 130 may be specific to a particular vehicle 130, and thus may be transmitted to that particular vehicle 130 through the established communication connection.
  • FIG. 3E shows an example scene 340 in which the roadside subsystem 112 detects a target object in an abnormally stationary state in the environment 100.
  • the roadside device 110 obtains perception information related to the environment from the roadside sensor device 107 and/or the vehicle-mounted sensor devices 236 of the vehicles 130, especially perception information related to the vehicles 130-2 and 130-3 in the environment.
  • the roadside sensing device 107 may periodically detect the sensing information within its sensing range 102 and report it to the roadside subsystem 112 (for example, the RSU 212 therein).
  • the roadside subsystem 112 may also obtain the perception information collected by the on-board sensor device 236 from the on-board subsystem 132 on the other vehicles 130 passing by the vehicles 130-2 and 130-3.
  • the roadside device 110 (for example, the driving control module 214) may determine perception results from the original perception information, and record the perception results of the vehicles 130-2 and 130-3 at multiple time points, for example, the location and speed of the vehicles 130-2 and 130-3 at each time point.
  • the roadside subsystem 112 (for example, the driving control module 214) can determine that the vehicles 130-2 and 130-3 are in an abnormally stationary state, and may generate driving-related messages (such as perception messages, and/or decision planning messages and/or control messages for a specific vehicle 130, for example, the vehicle 130-1).
  • the RSU 212 of the roadside subsystem 112 may broadcast sensing messages, or may send decision planning messages and/or control messages to the vehicle-mounted subsystem 132 of the specific vehicle 130-1.
  • after the vehicle-mounted subsystem 132 of the vehicle 130-1 receives the driving-related message, it can merge the driving-related message with the sensing result detected by the vehicle 130-1, determine the driving strategy of the vehicle 130-1, and transmit the driving strategy to the wire control system of the vehicle 130-1 to realize real-time control of the vehicle 130-1. For example, when it is known that the vehicles 130-2 and 130-3 are in an abnormally stationary state, the vehicle 130-1 can be controlled to change lanes in advance so as to drive in a lane where no collision with the vehicles 130-2 and 130-3 will occur.
  • it may also be determined that there is a target object in an abnormally moving state in the environment 100, such as a target object whose moving direction conflicts with that of the vehicle 130, or a target object whose moving speed does not significantly match that of the vehicle 130. In a traffic environment, a target object in an abnormally moving state may threaten the driving safety of the vehicle 130. Therefore, such a target object needs to be detected in time so that correct driving decisions and control can be made for the vehicle 130.
  • the judgment of the target object in an abnormally moving state can also be performed by the roadside subsystem 112 or by the on-board subsystem 132 on a certain vehicle 130.
  • when determining whether the target object is in an abnormal motion state, the movement of the target object, including at least the movement direction and/or the movement speed, can also be determined from the perception information related to the target object.
  • the acquired perception information also includes perception information related to a reference object in the environment 100, and the movement of the reference object, including at least the movement direction and/or the movement speed, is determined.
  • the reference object is usually selected as an object with a greater probability of being in a normal movement state, such as the vehicle 130 in the environment 100. The selection of the reference object depends on the motion state of the target object to be judged, which will be described below. By comparing the movement of the target object with the movement of the reference object, it can be determined whether the target object is in an abnormal movement state.
  • abnormal movement states can include: a retrograde state, in which the movement direction of the target object differs from that of most objects in the environment or does not conform to the movement direction prescribed by traffic rules; and an abnormal speed state, in which the difference between the moving speed of the target object and that of the reference object is too large (for example, exceeds a speed difference threshold).
  • the abnormal speed state can be further divided into an abnormally slow state and an abnormally fast state. In the abnormally slow state, the moving speed of the target object is less than that of the reference object, and in the abnormally fast state, the moving speed of the target object is greater than that of the reference object.
  • the moving direction of the target object may be compared with the moving direction of the reference object.
  • the reference object may be selected as an object on the same traffic lane as the target object; for example, the target object is a target vehicle, and the reference object is selected as a reference vehicle on the same traffic lane as the target vehicle (the reference object in this case is sometimes also called the "first reference object").
  • a traffic lane refers to the part of a strip-shaped road that allows vehicles to line up longitudinally and drive safely, and it can consist of one or more lanes. There may be one-way traffic lanes on a driving road, in which the vehicles all drive in substantially the same direction.
  • the driving road may also have two-way traffic lanes, in which vehicles on the two traffic lanes drive in different directions. If it is detected that the movement direction of the target object is opposite to the movement direction of the reference object, it can be determined that the target object is in a retrograde state. In order to accurately detect whether the target object is in a retrograde state, the reference object should have a high confidence of being in the correct driving direction.
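  • The direction comparison underlying the retrograde judgment can be sketched as an angle test; the 150° "near-opposite" threshold is an illustrative assumption, not a value from the described system:

```python
def heading_difference(a_deg, b_deg):
    """Smallest absolute difference between two headings, in degrees."""
    d = abs(a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)

def is_retrograde(target_heading_deg, reference_heading_deg,
                  opposite_threshold_deg=150.0):
    """A target object on the same traffic lane as the reference object is
    judged retrograde when its movement direction is (near-)opposite to
    the reference object's movement direction."""
    return heading_difference(target_heading_deg,
                              reference_heading_deg) >= opposite_threshold_deg
```

In practice the reference heading would come from a "first reference object" with high confidence of driving in the correct direction, as noted above.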
  • the moving speed of the target object is compared with the moving speed of the reference object.
  • the reference object used to determine the abnormal speed may be an object with the same expected moving direction as the target object.
  • for example, the reference object and the target object may be a reference vehicle and a target vehicle on traffic lanes in the same direction (the reference object in this case is sometimes referred to as the "second reference object").
  • the speed difference threshold may be set according to the relevant traffic rules of the road, or may be set based on the safe distance between objects (especially vehicles).
  • the reference object may be selected as an object having a high probability of being in a normal moving or stationary state, such as a vehicle 130 in the environment 100 that is on the same traffic lane as the target vehicle (target object). In the case that the speed difference between the target object and the reference object exceeds the speed difference threshold, if it is also determined that the moving speed of the target object is less than the moving speed of the reference object, it can be determined that the target object is in an abnormally slow state.
  • if it is instead determined that the moving speed of the target object is greater than the moving speed of the reference object, it can be determined that the target object is in an abnormally fast state.
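  • The threshold-based classification can be sketched as follows, consistent with the FIG. 3F example in which the abnormally slow vehicle 130-3 moves slower than the reference vehicle 130-2. The threshold value and state names are illustrative assumptions:

```python
def classify_speed_state(target_speed, reference_speed,
                         speed_diff_threshold=8.0):
    """Classify the target object's speed relative to a reference object
    travelling in the same expected direction (the "second reference
    object"). Speeds are in metres/second; the threshold is a placeholder
    that could instead be derived from traffic rules or safe distances.
    """
    diff = target_speed - reference_speed
    if abs(diff) <= speed_diff_threshold:
        return "normal"
    return "abnormally_fast" if diff > 0 else "abnormally_slow"
```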
  • the speed of vehicles needs to meet specific speed requirements. Vehicles exceeding the prescribed upper speed limit or falling below the lower speed limit may threaten the safety of other vehicles.
  • special planning and control of driving operations, such as decelerating, accelerating, turning, or changing the driving route, are also required to avoid collisions.
  • the roadside subsystem 112 or the on-board subsystem 132 of the other vehicle 130 may generate a perception message after determining the target object in an abnormally moving state.
  • the perception message may include descriptions related to the target object, especially including descriptions of the abnormal movement state of the target object and other aspects of the target object, such as all or part of the content listed in Table 1.1 to Table 1.4 above.
  • the description related to the target object, such as the movement state of the target object in Table 1.1, can also specifically indicate the particular abnormal movement state of the target object, such as the retrograde state, the abnormally slow state, or the abnormally fast state.
  • the roadside subsystem 112 or the in-vehicle subsystem 132 of other vehicles 130 may further generate decision planning messages or control messages for specific vehicles 130.
  • the decision planning message or the control message can also indicate the abnormal motion state of the target object. This can be achieved by planning the driving trajectory of the vehicle 130, or by controlling the driving actions of the vehicle 130, with respect to the target object in the abnormally moving state. In this way, the vehicle 130 can be prevented from traveling on a trajectory that may collide with the abnormally moving target object.
  • the decision planning message or the control message may guide or control the moving direction, moving speed, or overall driving route of the vehicle 130. For example, if it is determined that there is a target object in a retrograde state in a lane of the road, the decision planning message or control message can guide the vehicle 130 to avoid changing into that lane, or to change lanes out of that lane. For another example, if it is determined that there is an abnormally slow target object in the same lane in front of the vehicle 130, the planning message or control message can guide the vehicle 130 to slow down and/or change lanes to avoid a collision. Alternatively, if there is an abnormally fast target object in the same lane behind the vehicle 130, the planning message or the control message can guide the vehicle 130 to accelerate and/or change lanes.
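  • The mapping from a detected abnormal state to the kind of guidance such a message might carry can be sketched as follows; the state names and return values are illustrative assumptions, not a prescribed message format:

```python
def suggest_maneuver(abnormal_state, same_lane, ahead_of_vehicle):
    """Map a target object's abnormal state, plus its position relative
    to the controlled vehicle, to a guidance category for a decision
    planning or control message."""
    if abnormal_state == "retrograde":
        # Avoid the retrograde object's lane entirely.
        return "avoid_target_lane"
    if abnormal_state == "abnormally_slow" and same_lane and ahead_of_vehicle:
        return "decelerate_or_change_lane"
    if abnormal_state == "abnormally_fast" and same_lane and not ahead_of_vehicle:
        return "accelerate_or_change_lane"
    return "keep_course"
```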
  • the roadside subsystem 112 or the on-board subsystem 132 of the other vehicle 130 may broadcast the generated perception message after determining the target object in an abnormally moving state.
  • the vehicle 130 within the communication range of the roadside subsystem 112 or the onboard subsystem 132 of other vehicles 130 may receive such perception messages.
  • the decision planning message and/or control message generated by the roadside subsystem 112 or the on-board subsystem 132 of the other vehicle 130 may be specific to a particular vehicle 130, and thus may be transmitted to that particular vehicle 130 through the established communication connection.
  • FIG. 3F shows an example scene 350 in which the roadside subsystem 112 detects a target object in an abnormally moving state in the environment 100.
  • the vehicles 130-1 to 130-5 are driving on the traffic lane in the same direction.
  • the roadside sensor device 107 and/or the vehicle-mounted sensor device 236 of the vehicle 130 obtain the perception information within the respective perception ranges 102 and 302, especially the perception information related to each vehicle therein.
  • the roadside sensor device 107 and the vehicle-mounted sensor device 236 can transmit the sensing information to the roadside subsystem 112 or the vehicle-mounted subsystem 132 of a certain vehicle 130.
  • based on the perception information, the roadside subsystem 112 or the on-board subsystem 132 can determine that the movement direction of the vehicle 130-3 is different from (opposite to) the movement direction of other vehicles on the lane (for example, the vehicle 130-5), so it can be determined that the vehicle 130-3 is in a retrograde state.
  • the roadside subsystem 112 or the on-board subsystem 132 may broadcast perception messages, or may send decision planning messages and/or control messages to the on-board subsystem 132 of a specific vehicle 130-1, to indicate the vehicle 130-3 in a retrograde state.
  • after the vehicle-mounted subsystem 132 of the vehicle 130-1 receives the driving-related message, it can merge the driving-related message with the sensing result detected by the vehicle 130-1 to determine the driving strategy of the vehicle 130-1, and control the wire control system of the vehicle 130-1 to drive according to the driving strategy. For example, the vehicle 130-1 may be controlled not to change into the lane in which the vehicle 130-3 is traveling in the wrong direction.
  • the roadside subsystem 112 or the vehicle-mounted subsystem 132 of the vehicle 130-2 may determine, based on the perception information, that the moving speed of the vehicle 130-3 is less than that of the vehicle 130-2 and that the difference between the two is too large (greater than the speed difference threshold); it can therefore be determined that the vehicle 130-3 is in an abnormally slow state.
  • the roadside subsystem 112 or the on-board subsystem 132 may broadcast perception messages, or may send decision planning messages and/or control messages to the on-board subsystem 132 of a specific vehicle 130-1, to indicate the vehicle 130-3 in an abnormally slow state.
  • after the vehicle-mounted subsystem 132 of the vehicle 130-1 receives the driving-related message, it can merge the driving-related message with the sensing result detected by the vehicle 130-1 to determine the driving strategy of the vehicle 130-1, and control the wire control system of the vehicle 130-1 to drive according to the driving strategy. For example, the vehicle 130-1 may be controlled to reduce its speed and/or change lanes in advance.
  • FIG. 4A shows a flowchart of a method 400 for assisting driving control according to an embodiment of the present disclosure.
  • the method 400 may be implemented at the roadside subsystem 112 in FIGS. 1 and 2. It should be understood that although shown in a specific order, some steps in method 400 may be performed in a different order than shown or in a parallel manner. The embodiments of the present disclosure are not limited in this respect.
  • perceptual information related to the environment in which the vehicle is located is acquired, and the perceptual information includes at least information related to objects existing in the environment.
  • a driving-related message for the vehicle is generated, and the driving-related message includes at least one of a perception message, a decision planning message, and a control message.
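  • The acquisition and generation steps of method 400, together with the providing step discussed further below, can be sketched as a skeleton; all callables are placeholders for the subsystem components described above:

```python
def assist_driving_control(acquire_perception, generate_message, provide_message):
    """Skeleton of method 400 at the roadside subsystem: acquire
    environment perception, generate a driving-related message (a
    perception, decision planning, and/or control message), and provide
    it to the vehicle."""
    perception = acquire_perception()          # e.g. from roadside sensors
    message = generate_message(perception)     # e.g. by driving control module
    provide_message(message)                   # e.g. via the RSU
    return message
```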
  • acquiring the perception information includes acquiring at least one of the following: roadside perception information sensed by a sensor device that is arranged in the environment and independent of the vehicle; vehicle perception information sensed by a sensor device arranged in association with the vehicle; and vehicle perception information sensed by a sensor device arranged in association with another vehicle.
  • the perception message includes at least one of the following: a description related to a target object existing in the environment, indicating at least one of the type, position information, speed information, direction information, physical appearance description information, historical trajectory, and predicted trajectory of the target object; information related to the physical condition of the road in the environment, indicating at least one of the physical condition of the road surface and the structured information of the road; information related to the traffic facilities in the environment, indicating at least one of the state of signal lights and the traffic signs on the road; road traffic condition information in the environment, indicating at least one of signs, traffic flow, and traffic incidents on the road and/or lanes in the road; information related to meteorological conditions in the environment; auxiliary information related to the positioning of the vehicle; diagnostic information for vehicle faults; information related to over-the-air (OTA) upgrades of software systems in the vehicle; map information, indicating at least one of the map's identifier, the map update method, and the area and location information of the map to be updated; information related to the parking area of the vehicle; and time information.
  • the decision planning message includes at least one of the following: an indication of the road to which the decision plan is to be applied, start time information and/or end time information of the decision plan, start position information and/or end position information of the decision plan, the identification of the vehicle targeted by the decision plan, decision information related to the driving behavior of the vehicle, planning information related to the driving actions of the vehicle, information on the trajectory points of the path planning, the expected time of reaching the trajectory points of the path planning, information related to other vehicles involved in the decision planning, map information indicating at least one of the map's identifier, the map update method, and the area and location information to be updated, and time information.
  • the control message includes at least one of the following: kinematics control information related to the movement of the vehicle, dynamics control information related to at least one of the power system, transmission system, braking system, and steering system of the vehicle, control information related to the riding experience of passengers in the vehicle, control information related to the traffic warning system of the vehicle, and time information.
  • generating a driving-related message includes: in response to detecting that the map information used by the vehicle is to be updated and/or in response to obtaining a map update request message from the vehicle, generating at least one of a perception message and a decision planning message including the map information, wherein the map update request message indicates at least one of the following: time information, identification information of the vehicle, the area for which a map update is requested, and the map update method.
  • generating a driving-related message further includes: acquiring at least one of real-time operation information, auxiliary planning information, body information, and special vehicle indication information of the vehicle; and generating at least one of a decision planning message and a control message based on the acquired information.
  • the real-time operating information includes at least one of location information, driving direction information, driving route information, driving speed information, operating status information, and component status information of the vehicle.
  • the auxiliary planning information includes at least one of an indication of the driving intention of the vehicle, planned driving route information, and speed limit information.
  • the vehicle body information includes at least one of the identification, type, descriptive information, current driving route information, and fault-related information of the vehicle.
  • the special vehicle indication information includes at least one of a special vehicle type and a right of way requirement level.
  • providing driving-related messages includes: providing driving-related messages to the vehicle in response to at least one of a time-based trigger, a location-based trigger, and an event-based trigger.
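  • The three trigger kinds named here can be sketched as a combined check; the representation of schedules, zones (as 1-D road-coordinate intervals), and events is an illustrative assumption:

```python
def should_provide_message(now, schedule_times, vehicle_pos, trigger_zones,
                           events, time_tolerance=1.0):
    """Evaluate time-based, location-based, and event-based triggers for
    providing a driving-related message to the vehicle."""
    time_trigger = any(abs(now - t) < time_tolerance for t in schedule_times)
    location_trigger = any(start <= vehicle_pos <= end
                           for start, end in trigger_zones)
    event_trigger = bool(events)  # e.g. a newly detected abnormal object
    return time_trigger or location_trigger or event_trigger
```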
  • providing driving-related messages includes: receiving a takeover request message for the vehicle, the takeover request message indicating a request to at least partially take over driving control of the vehicle; and, in response to the takeover request message, providing at least one of a decision planning message and a control message to the vehicle.
  • the takeover request message indicates at least one of the following: identification information of the vehicle, travel plan information of the vehicle, reason for requesting takeover, and time information.
  • the method 400 further includes: in response to determining that the driving control of the vehicle is to be at least partially taken over, providing a takeover notification message to the vehicle, the takeover notification message indicating at least one of the following: the start time of the takeover, the end time of the takeover, the type of the takeover, the reason for the takeover, and the takeover strategy.
  • the method 400 further includes: in response to determining that the driving control of the vehicle requires the cooperation of another vehicle, generating at least one of another decision planning message and another control message based on the perception information; and providing to the other vehicle at least one of the generated other decision planning message and other control message for driving control of the other vehicle.
  • generating a driving-related message includes: determining a remote execution operation on a remote actuator associated with the vehicle; and converting the remote execution operation into a control message related to parameters of the local actuator of the vehicle, so as to control the local actuator to perform the same operation as the remote actuator.
  • FIG. 4B shows a flowchart of a method 402 for assisting driving control according to an embodiment of the present disclosure.
  • the method 402 may be implemented at the roadside subsystem 112 in FIGS. 1 and 2. It should be understood that although shown in a specific order, some steps in method 402 may be performed in a different order than shown or in a parallel manner. The embodiments of the present disclosure are not limited in this respect.
  • perceptual information related to the environment in which the vehicle is located is obtained.
  • a description related to a target object existing in the environment is determined based on the perceptual information, the target object being located outside the vehicle.
  • a driving-related message is generated based at least on the description related to the target object, the driving-related message including at least one of a perception message, a decision planning message, and a control message.
  • driving related information is provided to the vehicle for controlling the travel of the vehicle relative to the target object.
  • acquiring the perception information includes acquiring at least one of the following: roadside perception information sensed by a sensor device that is arranged in the environment and independent of the vehicle; vehicle perception information sensed by a sensor device arranged in association with the vehicle; and vehicle perception information sensed by a sensor device arranged in association with another vehicle.
  • generating a driving-related message includes: generating a perception message including at least a description related to the target object, the description including at least one of the following items of the target object: classification, motion state, retention time of the motion state, physical appearance description information, type, location information, movement speed, movement direction, acceleration, acceleration direction, historical movement trajectory information, and predicted movement trajectory information.
  • generating a driving-related message further includes: generating a perception message including traffic condition information of the road on which the vehicle is located, the traffic condition information including at least one of the following: the density of traffic participants on a section of the road, the average speed of traffic participants on the road section, a description of the lane-level congestion condition of the road section, the starting position of the road section, and the duration of the current traffic condition.
  • the driving-related message is specific to a lane in the environment, and providing the driving-related message to the vehicle includes: broadcasting the driving-related message so that a vehicle in the lane can obtain the driving-related message, or, in response to determining that the vehicle is in the lane, sending the driving-related message to the vehicle.
  • the driving-related message includes a decision planning message, and the decision planning message includes at least one of the following: the lane identification, lane right-of-way information, the start time of vehicle control, and the start position of vehicle control.
  • driving-related messages are specific to the vehicle.
  • providing driving-related messages includes: receiving a takeover request message for the vehicle, the takeover request message indicating a request to at least partially take over driving control of the vehicle; providing a takeover response message to the vehicle to indicate whether driving control of the vehicle is to be at least partially taken over; and, in response to determining that driving control of the vehicle is at least partially taken over, providing at least one of a decision planning message and a control message to the vehicle.
  • the takeover request message indicates at least one of the following: identification information of the vehicle, the type of the vehicle, the location information of the vehicle, the travel plan information of the vehicle, the reason for requesting the takeover, and time information, where the travel plan information includes at least one of the following: the driving direction of the vehicle, the driving destination, the planned driving route, the maximum allowed speed, and the maximum allowed acceleration.
  • the takeover response message indicates at least one of the following: whether to take over the vehicle at least partially and the start time of taking over the vehicle.
  • FIG. 5A shows a flowchart of a method 500 for driving control according to an embodiment of the present disclosure.
  • the method 500 may be implemented at the on-board subsystem 132 in FIGS. 1 and 2. It should be understood that although shown in a specific order, some steps in method 500 may be performed in a different order than shown or in a parallel manner. The embodiments of the present disclosure are not limited in this respect.
  • a driving-related message is received from a device external to the vehicle; the driving-related message is generated based on perception information related to the environment in which the vehicle is located, and includes at least one of a perception message, a decision planning message, and a control message.
  • the driving of the vehicle in the environment is controlled based on the driving-related message.
  • the method 500 further includes: providing vehicle perception information sensed by a sensor device arranged in association with the vehicle to an external device as at least a part of the perception information.
  • the method 500 further includes: providing at least one of real-time operation information, auxiliary planning information, body information, and special vehicle indication information of the vehicle to the external device for use in the generation of at least one of a decision planning message and a control message.
  • receiving a driving-related message includes: providing a takeover request message for the vehicle to the external device, the takeover request message indicating a request to at least partially take over driving control of the vehicle; and receiving at least one of a decision planning message and a control message in response to the takeover request message.
  • the takeover request message indicates at least one of the following: identification information of the vehicle, travel plan information of the vehicle, reason for requesting takeover, and time information.
  • the method 500 further includes: receiving a takeover notification message responsive to the takeover request message from the external device, the takeover notification message indicating at least one of the following: the start time of the takeover, the end time of the takeover, the type of the takeover, the reason for the takeover, and the takeover strategy.
  • receiving a driving-related message includes: providing a remote driving request message for the vehicle to the external device; and receiving a control message in response to the remote driving request message, the control message including a control instruction executable by a local actuator of the vehicle, the control message being obtained by converting an execution operation on a remote actuator based on parameters related to the local actuator.
  • FIG. 5B shows a flowchart of a method 502 for driving control according to an embodiment of the present disclosure.
  • the method 502 may be implemented at the on-board subsystem 132 in FIGS. 1 and 2. It should be understood that although shown in a specific order, some steps in method 502 may be performed in a different order than shown or in a parallel manner. The embodiments of the present disclosure are not limited in this respect.
  • a driving-related message is received from an external device of the vehicle, the driving-related message being generated based on perception information related to the environment in which the vehicle is located and indicating a description related to a target object outside the vehicle, and the driving-related message includes at least one of a perception message, a decision planning message, and a control message.
  • the driving of the vehicle in the environment is controlled with respect to the target object.
  • the perception information includes at least one of the following: roadside perception information sensed by a sensing device arranged in the environment and independent of the vehicle; vehicle perception information sensed by a sensor device arranged in association with the vehicle; and vehicle perception information sensed by a sensor device arranged in association with another vehicle.
  • receiving a driving-related message includes: receiving a perception message that includes at least a description related to the target object, the description including at least one of the following items of the target object: classification, movement state, retention time of the movement state, description information of the physical appearance, type, location information, movement speed, movement direction, acceleration, acceleration direction, historical movement trajectory information, and predicted movement trajectory information.
  • receiving a driving-related message further includes: receiving a perception message that includes traffic condition information of the road on which the vehicle is located, the traffic condition information including at least one of the following: the density of traffic participants on a section of the road, the average speed of traffic participants on the road section, a lane-level description of the congestion condition of the road section, the starting position of the road section, and the duration of the current traffic condition.
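The perception message fields listed in the two items above can be pictured as a data structure. The Python encoding below is a hypothetical illustration: the field names and types are assumptions, and only a subset of the listed items is shown.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class TargetObject:
    """Description of one target object carried in a perception message."""
    classification: str                  # e.g. "pedestrian", "vehicle"
    position: Tuple[float, float]        # map or geographic coordinates
    speed_mps: float                     # movement speed
    heading_deg: float                   # movement direction
    acceleration_mps2: float = 0.0
    predicted_trajectory: List[Tuple[float, float]] = field(default_factory=list)

@dataclass
class TrafficCondition:
    """Lane-level traffic condition information for one road section."""
    segment_start: Tuple[float, float]   # starting position of the section
    participant_density: float           # traffic participants per kilometre
    average_speed_mps: float
    congestion_level: str                # e.g. "free", "slow", "jammed"
    duration_s: float                    # duration of the current condition

@dataclass
class PerceptionMessage:
    targets: List[TargetObject]
    traffic: Optional[TrafficCondition] = None

# Example: one pedestrian plus slow traffic on the current road section.
msg = PerceptionMessage(
    targets=[TargetObject(classification="pedestrian",
                          position=(116.31, 39.99),
                          speed_mps=1.2, heading_deg=90.0)],
    traffic=TrafficCondition(segment_start=(116.30, 39.98),
                             participant_density=35.0,
                             average_speed_mps=4.5,
                             congestion_level="slow",
                             duration_s=120.0))
```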
  • the driving-related message is specific to the lane in which the vehicle is located, the driving-related message includes a decision planning message, and the decision planning message includes at least one of the following: lane identification, lane right-of-way information, the start time of the control of the vehicle, and the start position of the control of the vehicle.
  • driving-related messages are specific to the vehicle.
  • the method 502 further includes: sending an escape request message to at least one other vehicle, the escape request message including at least one of the following: identification information of the vehicle, the type of the vehicle, location information of the vehicle, driving direction information, driving route information, driving speed information, and location information of surrounding vehicles that affect the driving of the vehicle; and receiving an escape response message from the other vehicle, the escape response message including at least one of the following: identification information of the other vehicle, an indication of whether the other vehicle will avoid the vehicle, the planned route of the other vehicle, and the planned speed of the other vehicle.
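The escape request/response exchange above can be sketched as a pair of message types plus a toy decision rule on the receiving vehicle. All names, and the overlap-based yield rule, are assumptions for illustration, not the disclosed behavior.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Waypoint = Tuple[float, float]

@dataclass
class EscapeRequest:
    """Message a vehicle (e.g. an emergency vehicle) sends asking others to make way."""
    vehicle_id: str
    vehicle_type: str
    position: Waypoint
    heading_deg: float
    route: List[Waypoint]      # planned driving route
    speed_mps: float

@dataclass
class EscapeResponse:
    vehicle_id: str
    will_yield: bool
    planned_route: Optional[List[Waypoint]] = None
    planned_speed_mps: Optional[float] = None

def handle_escape_request(req: EscapeRequest, my_id: str,
                          my_route: List[Waypoint]) -> EscapeResponse:
    """Toy rule: yield (stop) whenever the requester's route crosses ours."""
    will_yield = any(p in my_route for p in req.route)
    return EscapeResponse(vehicle_id=my_id,
                          will_yield=will_yield,
                          planned_route=my_route if will_yield else None,
                          planned_speed_mps=0.0 if will_yield else None)

# An ambulance whose route crosses ours: our sketch vehicle stops to yield.
req = EscapeRequest(vehicle_id="amb-1", vehicle_type="ambulance",
                    position=(0.0, 0.0), heading_deg=0.0,
                    route=[(0.0, 1.0), (0.0, 2.0)], speed_mps=15.0)
resp = handle_escape_request(req, my_id="car-7",
                             my_route=[(0.0, 2.0), (0.0, 3.0)])
```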
  • the method 502 further includes: providing another perception message to an on-board subsystem associated with another vehicle, the perception message including a description of another target object in the environment, the description including at least one of the following items of the other target object: classification, movement state, retention time of the movement state, description information of the physical appearance, type, position information, movement speed, movement direction, acceleration, acceleration direction, historical movement trajectory information, and predicted movement trajectory information.
  • receiving a driving-related message includes: providing the external device with a takeover request message for the vehicle, the takeover request message indicating a request to take over at least part of the driving control of the vehicle; and receiving at least one of a decision planning message and a control message in response to the takeover request message. The takeover request message indicates at least one of the following: identification information of the vehicle, the type of the vehicle, location information of the vehicle, travel plan information of the vehicle, the reason for requesting the takeover, and time information, where the travel plan information includes at least one of the following: the driving direction of the vehicle, the driving destination, the planned driving route, the maximum allowed speed, and the maximum allowed acceleration.
  • FIG. 6A shows a schematic block diagram of an apparatus 600 for assisting driving control according to an embodiment of the present disclosure.
  • the device 600 may be included in the roadside subsystem 112 in FIGS. 1 and 2 or implemented as the roadside subsystem 112.
  • the apparatus 600 includes: a perception acquisition module 610 configured to obtain, at an external device of the vehicle, perception information related to the environment in which the vehicle is located, the perception information including at least information related to objects existing in the environment; a message generation module 620 configured to generate driving-related messages for the vehicle based at least on the perception information, the driving-related messages including at least one of a perception message, a decision planning message, and a control message; and a message supply module 630 configured to provide the driving-related messages to the vehicle for driving control of the vehicle.
  • the perception acquisition module is configured to acquire at least one of the following: roadside perception information sensed by a sensor device arranged in the environment and independent of the vehicle; vehicle perception information sensed by a sensor device arranged in association with the vehicle; and vehicle perception information sensed by a sensor device arranged in association with another vehicle.
  • the message generation module includes: a map-update-based message generation module configured to generate at least one of a perception message and a decision planning message including map information in response to detecting that the map information used by the vehicle is to be updated and/or in response to obtaining a map update request message from the vehicle.
  • the map update request message indicates at least one of the following: time information, identification information of a vehicle, an area for which a map update is requested, and a method of map update.
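As a concrete encoding of the map update request message fields above, one could imagine the following. The field names, the bounding-box encoding of the area, and the two update methods are illustrative assumptions only.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Tuple

class UpdateMethod(Enum):
    FULL = "full"            # replace the whole map area
    INCREMENTAL = "delta"    # send only changed map elements

@dataclass
class MapUpdateRequest:
    """Hypothetical encoding of a map update request message."""
    timestamp_s: float                          # time information
    vehicle_id: str                             # identification information
    area: Tuple[float, float, float, float]     # bounding box of the requested area
    method: UpdateMethod                        # method of map update

req = MapUpdateRequest(timestamp_s=1_700_000_000.0,
                       vehicle_id="veh-42",
                       area=(116.30, 39.98, 116.32, 40.00),
                       method=UpdateMethod.INCREMENTAL)
```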
  • the message generation module further includes: an auxiliary information acquisition module configured to acquire at least one of real-time operation information of the vehicle, auxiliary planning information, vehicle body information, and special vehicle indication information; and an auxiliary-information-based message generation module configured to generate at least one of a decision planning message and a control message based on the acquired information.
  • the message supply module includes: a trigger-based message supply module configured to provide driving-related messages to the vehicle in response to at least one of a time-based trigger, a location-based trigger, and an event-based trigger.
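A trigger-based supply module of the kind described above could, for example, combine the three trigger types as sketched below. The thresholds, the one-dimensional position model, and the method signature are illustrative assumptions, not part of the disclosed embodiments.

```python
from typing import List

class TriggerBasedSupplier:
    """Sketch of a supply module that sends a driving-related message when a
    time-based, location-based, or event-based trigger fires."""

    def __init__(self, trigger_radius_m: float = 10.0):
        self.trigger_radius_m = trigger_radius_m
        self.sent: List[str] = []

    def maybe_send(self, message: str, now_s: float, send_at_s: float,
                   vehicle_pos_m: float, trigger_pos_m: float,
                   event_fired: bool) -> bool:
        time_trigger = now_s >= send_at_s
        location_trigger = abs(vehicle_pos_m - trigger_pos_m) <= self.trigger_radius_m
        if time_trigger or location_trigger or event_fired:
            self.sent.append(message)   # in practice: transmit to the vehicle
            return True
        return False

supplier = TriggerBasedSupplier()
# No trigger fires yet: too early and too far from the trigger point.
supplier.maybe_send("lane closed ahead", now_s=3.0, send_at_s=5.0,
                    vehicle_pos_m=0.0, trigger_pos_m=200.0, event_fired=False)
# The time-based trigger fires and the message is supplied.
supplier.maybe_send("lane closed ahead", now_s=6.0, send_at_s=5.0,
                    vehicle_pos_m=0.0, trigger_pos_m=200.0, event_fired=False)
```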
  • the message supply module includes: a takeover request receiving module configured to receive a takeover request message for the vehicle, the takeover request message indicating a request to take over at least part of the driving control of the vehicle; and a takeover-request-based message supply module configured to provide at least one of a decision planning message and a control message to the vehicle in response to the takeover request message.
  • the apparatus 600 further includes: a notification supply module configured to provide a takeover notification message to the vehicle in response to determining that the driving control of the vehicle is to be at least partially taken over, the takeover notification message indicating at least one of the following: the start time of the takeover, the end time of the takeover, the type of the takeover, the reason for the takeover, and the takeover strategy.
  • the apparatus 600 further includes: another message generation module configured to generate at least one of another decision planning message and another control message based on the perception information in response to determining that the driving control of the vehicle requires the cooperation of another vehicle; and another message supply module configured to provide at least one of the generated other decision planning message and other control message to the other vehicle for driving control of the other vehicle.
  • the message generation module includes: an operation determination module configured to determine a remote execution operation on a remote actuator associated with the vehicle; and a conversion-based message generation module configured to convert the remote execution operation into a control message based on parameters related to a local actuator of the vehicle, so as to control the local actuator to perform the same operation as the remote execution operation.
  • FIG. 6B shows a schematic block diagram of a device 602 for assisting driving control according to an embodiment of the present disclosure.
  • the device 602 may be included in the roadside subsystem 112 in FIGS. 1 and 2 or implemented as the roadside subsystem 112.
  • the device 602 includes: a perception acquisition module 680 configured to obtain, at an external device of the vehicle, perception information related to the environment in which the vehicle is located; a target determination module 685 configured to determine, based on the perception information, a description related to a target object existing in the environment, the target object being located outside the vehicle; a message generation module 690 configured to generate driving-related messages based at least on the description related to the target object, the driving-related messages including at least one of a perception message, a decision planning message, and a control message; and a message supply module 695 configured to provide the driving-related messages to the vehicle for controlling the driving of the vehicle relative to the target object.
  • the perception acquisition module is configured to acquire at least one of the following: roadside perception information sensed by a sensor device arranged in the environment and independent of the vehicle; vehicle perception information sensed by a sensor device arranged in association with the vehicle; and vehicle perception information sensed by a sensor device arranged in association with another vehicle.
  • the message generation module includes: a first perception message generation module configured to generate a perception message that includes at least a description related to the target object, the description including at least one of the following items of the target object: classification, movement state, retention time of the movement state, description information of the physical appearance, type, position information, movement speed, movement direction, acceleration, acceleration direction, historical movement trajectory information, and predicted movement trajectory information.
  • the message generation module includes: a second perception message generation module configured to generate a perception message that includes traffic condition information of the road on which the vehicle is located, the traffic condition information including at least one of the following: the density of traffic participants on a section of the road, the average speed of traffic participants on the road section, a lane-level description of the congestion condition of the road section, the starting position of the road section, and the duration of the current traffic condition.
  • the driving-related message is specific to a lane in the environment, and the message supply module includes: a message broadcasting module configured to broadcast the driving-related message so that vehicles in the lane can obtain the driving-related message; or a message sending module configured to send the driving-related message toward the vehicle in response to determining that the vehicle is in the lane.
  • the driving-related message includes a decision planning message, and the decision planning message includes at least one of the following: lane identification, lane right-of-way information, the start time of the control of the vehicle, and the start position of the control of the vehicle.
  • driving-related messages are specific to the vehicle.
  • the message supply module includes: a request receiving module configured to receive a takeover request message for the vehicle, the takeover request message indicating a request to take over at least part of the driving control of the vehicle; a response supply module configured to provide a takeover response message to the vehicle to indicate whether the driving control of the vehicle is to be at least partially taken over; and a decision planning and/or control message supply module configured to provide at least one of a decision planning message and a control message to the vehicle in response to determining to at least partially take over the driving control of the vehicle.
  • the takeover request message indicates at least one of the following: identification information of the vehicle, the type of the vehicle, location information of the vehicle, travel plan information of the vehicle, the reason for requesting the takeover, and time information, where the travel plan information includes at least one of the following: the driving direction of the vehicle, the driving destination, the planned driving route, the maximum allowed speed, and the maximum allowed acceleration.
  • the takeover response message indicates at least one of the following: whether to take over the vehicle at least partially and the start time of taking over the vehicle.
  • FIG. 7A shows a schematic block diagram of an apparatus 700 for driving control according to an embodiment of the present disclosure.
  • the apparatus 700 may be included in the on-board subsystem 132 in FIGS. 1 and 2 or implemented as the on-board subsystem 132.
  • the apparatus 700 includes: a message receiving module 710 configured to receive driving-related messages from an external device of the vehicle, the driving-related messages being generated based on perception information related to the environment in which the vehicle is located and including at least one of a perception message, a decision planning message, and a control message; and a driving-related control module 720 configured to control the driving of the vehicle in the environment based on the driving-related messages.
  • the apparatus 700 further includes a perception supply module configured to provide vehicle perception information sensed by a sensing device arranged in association with the vehicle to an external device as at least a part of the perception information .
  • the apparatus 700 further includes: an auxiliary information supply module configured to provide at least one of real-time operation information of the vehicle, auxiliary planning information, vehicle body information, and special vehicle indication information to the external device for use in generating at least one of the decision planning message and the control message.
  • the message receiving module includes: a takeover request supply module configured to provide the external device with a takeover request message for the vehicle, the takeover request message indicating a request to take over at least part of the driving control of the vehicle; and a takeover-based message receiving module configured to receive at least one of a decision planning message and a control message in response to the takeover request message.
  • the takeover request message indicates at least one of the following: identification information of the vehicle, travel plan information of the vehicle, reason for requesting takeover, and time information.
  • the apparatus 700 further includes: a notification receiving module configured to receive a takeover notification message for the takeover request message from the external device, the takeover notification message indicating at least one of the following: the start time of the takeover, the end time of the takeover, the type of the takeover, the reason for the takeover, and the takeover strategy.
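The takeover request and takeover notification messages discussed in the surrounding items can be pictured as follows. The message layout, the enum values, and the accept-everything handler are hypothetical illustrations, not the claimed protocol.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class TakeoverType(Enum):
    PERCEPTION_ONLY = "perception"
    DECISION_PLANNING = "decision_planning"
    FULL_CONTROL = "full_control"

@dataclass
class TakeoverRequest:
    vehicle_id: str        # identification information of the vehicle
    planned_route: str     # travel plan information
    reason: str            # reason for requesting the takeover
    timestamp_s: float     # time information

@dataclass
class TakeoverNotification:
    start_time_s: float
    end_time_s: Optional[float]   # None: open-ended takeover
    takeover_type: TakeoverType
    reason: str
    strategy: str

def acknowledge(request: TakeoverRequest) -> TakeoverNotification:
    """Toy roadside-side handler that accepts every request with full control
    starting one second after the request was stamped."""
    return TakeoverNotification(start_time_s=request.timestamp_s + 1.0,
                                end_time_s=None,
                                takeover_type=TakeoverType.FULL_CONTROL,
                                reason=request.reason,
                                strategy="follow roadside decision planning")

notice = acknowledge(TakeoverRequest(vehicle_id="veh-42",
                                     planned_route="route-A",
                                     reason="sensor degradation",
                                     timestamp_s=100.0))
```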
  • the message receiving module includes: a remote driving request module configured to provide the external device with a remote driving request message for the vehicle; and a control message receiving module configured to receive a control message in response to the remote driving request message, the control message including a control command executable by the local actuator of the vehicle and being obtained by converting an execution operation on a remote actuator based on parameters related to the local actuator.
  • FIG. 7B shows a schematic block diagram of a device 702 for driving control according to an embodiment of the present disclosure.
  • the device 702 may be included in the on-board subsystem 132 in FIGS. 1 and 2 or implemented as the on-board subsystem 132.
  • the device 702 includes: a message receiving module 750 configured to receive driving-related messages from an external device of the vehicle, the driving-related messages being generated based on perception information related to the environment in which the vehicle is located and indicating a description related to a target object outside the vehicle, and including at least one of a perception message, a decision planning message, and a control message; and a driving control module 760 configured to control, based on the driving-related messages, the driving of the vehicle in the environment relative to the target object.
  • the perception information includes at least one of the following: roadside perception information sensed by a sensing device arranged in the environment and independent of the vehicle; vehicle perception information sensed by a sensor device arranged in association with the vehicle; and vehicle perception information sensed by a sensor device arranged in association with another vehicle.
  • the message receiving module includes: a first perception message receiving module configured to receive a perception message that includes at least a description related to the target object, the description including at least one of the following items of the target object: classification, movement state, retention time of the movement state, description information of the physical appearance, type, position information, movement speed, movement direction, acceleration, acceleration direction, historical movement trajectory information, and predicted movement trajectory information.
  • the message receiving module includes: a second perception message receiving module configured to receive a perception message that includes traffic condition information of the road on which the vehicle is located, the traffic condition information including at least one of the following: the density of traffic participants on a section of the road, the average speed of traffic participants on the road section, a lane-level description of the congestion condition of the road section, the starting position of the road section, and the duration of the current traffic condition.
  • the driving-related message is specific to the lane in which the vehicle is located, the driving-related message includes a decision planning message, and the decision planning message includes at least one of the following: lane identification, lane right-of-way information, the start time of the control of the vehicle, and the start position of the control of the vehicle.
  • driving-related messages are specific to the vehicle.
  • the device 702 further includes: an escape request module configured to send an escape request message to at least one other vehicle, the escape request message including at least one of the following: identification information of the vehicle, the type of the vehicle, location information of the vehicle, driving direction information, driving route information, driving speed information, and location information of surrounding vehicles that affect the driving of the vehicle; and a response receiving module configured to receive an escape response message from the other vehicle, the escape response message including at least one of the following: identification information of the other vehicle, an indication of whether the other vehicle will avoid the vehicle, the planned route of the other vehicle, and the planned speed of the other vehicle.
  • the apparatus 702 further includes: a perception message supply module configured to provide another perception message to an on-board subsystem associated with another vehicle, the perception message including a description of another target object in the environment and including at least one of the following items of the other target object: classification, movement state, retention time of the movement state, description information of the physical appearance, type, position information, movement speed, movement direction, acceleration, acceleration direction, historical movement trajectory information, and predicted movement trajectory information.
  • the message receiving module includes: a takeover request module configured to provide the external device with a takeover request message for the vehicle, the takeover request message indicating a request to take over at least part of the driving control of the vehicle; and a decision planning and/or control message receiving module configured to receive at least one of a decision planning message and a control message in response to the takeover request message.
  • the takeover request message indicates at least one of the following: identification information of the vehicle, the type of the vehicle, location information of the vehicle, travel plan information of the vehicle, the reason for requesting the takeover, and time information, where the travel plan information includes at least one of the following: the driving direction of the vehicle, the driving destination, the planned driving route, the maximum allowed speed, and the maximum allowed acceleration.
  • FIG. 8 shows a schematic block diagram of an example device 800 that can be used to implement embodiments of the present disclosure.
  • the device 800 may be used to implement the roadside subsystem 112 or the on-board subsystem 132 of FIGS. 1 and 2.
  • the device 800 includes a computing unit 801, which can perform various appropriate actions and processing based on computer program instructions stored in a read-only memory (ROM) 802 or loaded from a storage unit 808 into a random access memory (RAM) 803.
  • the computing unit 801, the ROM 802, and the RAM 803 are connected to each other through a bus 804.
  • An input/output (I/O) interface 805 is also connected to the bus 804.
  • a plurality of components in the device 800 are connected to the I/O interface 805, including: an input unit 806, such as a keyboard, a mouse, etc.; an output unit 807, such as various types of displays, speakers, etc.; a storage unit 808, such as a magnetic disk, an optical disk, etc.; and a communication unit 809, such as a network card, a modem, a wireless communication transceiver, etc.
  • the communication unit 809 allows the device 800 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
  • the computing unit 801 may be various general-purpose and/or special-purpose processing components with processing and computing capabilities. Some examples of the computing unit 801 include, but are not limited to, central processing units (CPU), graphics processing units (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units that run machine learning model algorithms, digital signal processors (DSP), and any appropriate processors, controllers, microcontrollers, etc.
  • the computing unit 801 executes the various methods and processes described above, for example, the method 400 or the method 500.
  • the method 400 or the method 500 may be implemented as a computer software program, which is tangibly contained in a computer-readable medium, such as the storage unit 808.
  • part or all of the computer program may be loaded and/or installed on the device 800 via the ROM 802 and/or the communication unit 809.
  • the computer program When the computer program is loaded into the RAM 803 and executed by the computing unit 801, one or more steps of the method 400 or the method 500 described above can be executed.
  • the computing unit 801 may be configured to execute the method 400 or the method 500 in any other suitable manner (for example, by means of firmware).
  • exemplary types of hardware logic components include: field programmable gate arrays (FPGA), application specific integrated circuits (ASIC), application specific standard products (ASSP), systems on chip (SOC), complex programmable logic devices (CPLD), and so on.
  • the program code used to implement the methods of the present disclosure can be written in any combination of one or more programming languages. These program codes can be provided to the processors or controllers of general-purpose computers, special-purpose computers, or other programmable data processing devices, so that when the program codes are executed by the processor or controller, the functions/operations specified in the flowcharts and/or block diagrams are implemented.
  • the program code can be executed entirely on the machine, partly executed on the machine, partly executed on the machine and partly executed on the remote machine as an independent software package, or entirely executed on the remote machine or server.
  • a computer-readable medium may be a tangible medium, which may contain or store a program for use by or in combination with the instruction execution system, apparatus, or device.
  • the computer-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • the computer-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • machine-readable storage media would include electrical connections based on one or more wires, portable computer disks, hard drives, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the above.


Abstract

A method, apparatus, device, medium, and system for driving control. The driving control method comprises: acquiring, by a device external to a vehicle, perception information related to the environment in which the vehicle is located (480); determining, based on the perception information, a description related to a target object present in the environment, the target object being located outside the vehicle (485); generating a driving-related message at least based on the description related to the target object (490), the driving-related message comprising at least one of a perception message, a decision-planning message, and a control message; and providing the driving-related message to the vehicle for controlling travel of the vehicle relative to the target object (495). Objects in the environment that may affect travel of the vehicle can thereby be detected more accurately and promptly, guiding the travel of the vehicle.

Description

Method, apparatus, device, medium, and system for driving control
Cross-Reference to Related Applications
This application claims priority to PCT International Patent Application No. PCT/CN2019/074989, filed on February 13, 2019 and entitled "Method, Apparatus, Device, Medium, and System for Driving Control", which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of the present disclosure relate generally to the field of computers, and more particularly to autonomous driving control technology.
Background
With the development and progress of technology, controlling the driving of a vehicle through an automatic control system can bring convenience to people's travel. In recent years, autonomous driving (also known as driverless driving), as an application scenario of artificial intelligence, has become a new development direction for various vehicles, and for the automotive industry in particular. The autonomous driving capability of vehicles is increasingly anticipated. In general, implementing autonomous driving may include perceiving the environment surrounding the vehicle with sensing devices, performing driving decision planning based on the perception results, and controlling the specific driving operations of the vehicle according to the decision planning. Accordingly, the accuracy and efficiency of any one of environment perception, decision planning, and control will affect the driving safety and comfort of the vehicle as well as the operating efficiency of the traffic system as a whole.
Summary
According to embodiments of the present disclosure, a solution for driving control is provided.
In a first aspect of the present disclosure, a method of assisting driving control is provided. The method comprises: acquiring, by a device external to a vehicle, perception information related to an environment in which the vehicle is located; determining, based on the perception information, a description related to a target object present in the environment, the target object being located outside the vehicle; generating a driving-related message at least based on the description related to the target object, the driving-related message comprising at least one of a perception message, a decision-planning message, and a control message; and providing the driving-related message to the vehicle for controlling travel of the vehicle relative to the target object.
In a second aspect of the present disclosure, a method of driving control is provided. The method comprises: receiving a driving-related message from a device external to a vehicle, the driving-related message being generated based on perception information related to an environment in which the vehicle is located and indicating a description related to a target object outside the vehicle, the driving-related message comprising at least one of a perception message, a decision-planning message, and a control message; and controlling, based on the driving-related message, driving of the vehicle in the environment relative to the target object.
In a third aspect of the present disclosure, an apparatus for assisting driving control is provided. The apparatus comprises: a perception acquisition module configured to acquire, by a device external to a vehicle, perception information related to an environment in which the vehicle is located; a target determination module configured to determine, based on the perception information, a description related to a target object present in the environment, the target object being located outside the vehicle; a message generation module configured to generate a driving-related message at least based on the description related to the target object, the driving-related message comprising at least one of a perception message, a decision-planning message, and a control message; and a message supply module configured to provide the driving-related message to the vehicle for controlling travel of the vehicle relative to the target object.
In a fourth aspect of the present disclosure, an apparatus for driving control is provided. The apparatus comprises: a message receiving module configured to receive a driving-related message from a device external to a vehicle, the driving-related message being generated based on perception information related to an environment in which the vehicle is located and indicating a description related to a target object outside the vehicle, the driving-related message comprising at least one of a perception message, a decision-planning message, and a control message; and a driving control module configured to control, based on the driving-related message, driving of the vehicle in the environment relative to the target object.
In a fifth aspect of the present disclosure, an electronic device is provided. The electronic device comprises one or more processors, and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to the first aspect of the present disclosure.
In a sixth aspect of the present disclosure, an electronic device is provided. The electronic device comprises one or more processors, and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to the second aspect of the present disclosure.
In a seventh aspect of the present disclosure, a non-transitory computer-readable storage medium storing computer instructions is provided, the computer instructions being used to cause a computer to perform the method according to the first aspect of the present disclosure.
In an eighth aspect of the present disclosure, a non-transitory computer-readable storage medium storing computer instructions is provided, the computer instructions being used to cause a computer to perform the method according to the second aspect of the present disclosure.
In a ninth aspect of the present disclosure, a system for cooperative driving control is provided, comprising: a roadside subsystem comprising the apparatus according to the third aspect of the present disclosure; and an on-board subsystem comprising the apparatus according to the fourth aspect of the present disclosure.
It should be understood that the content described in this Summary is not intended to identify key or important features of embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become readily understood from the following description.
Brief Description of the Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent with reference to the following detailed description taken in conjunction with the accompanying drawings. In the drawings, the same or similar reference numerals denote the same or similar elements, in which:
Fig. 1 shows a schematic diagram of an example environment in which various embodiments of the present disclosure can be implemented;
Fig. 2 shows a block diagram of a cooperative driving control system according to some embodiments of the present disclosure;
Figs. 3A to 3F show schematic diagrams of example scenarios of cooperative driving control according to some embodiments of the present disclosure;
Fig. 4A is a flowchart of a method for assisting driving control on the roadside according to some embodiments of the present disclosure;
Fig. 4B is a flowchart of a method for assisting driving control on the roadside according to some embodiments of the present disclosure;
Fig. 5A is a flowchart of a method for driving control on the vehicle side according to some embodiments of the present disclosure;
Fig. 5B is a flowchart of a method for driving control on the vehicle side according to some embodiments of the present disclosure;
Fig. 6A is a schematic block diagram of an apparatus for driving control on the roadside according to some embodiments of the present disclosure;
Fig. 6B is a schematic block diagram of an apparatus for driving control on the roadside according to some embodiments of the present disclosure;
Fig. 7A is a schematic block diagram of an apparatus for driving control on the vehicle side according to some embodiments of the present disclosure;
Fig. 7B is a schematic block diagram of an apparatus for driving control on the vehicle side according to some embodiments of the present disclosure; and
Fig. 8 shows a block diagram of a device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described below in more detail with reference to the accompanying drawings. Although certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure can be implemented in various forms and should not be construed as limited to the embodiments set forth here; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. Accordingly, those of ordinary skill in the art should recognize that various changes and modifications can be made to the embodiments described here without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and structures are omitted from the following description for clarity and conciseness.
In describing embodiments of the present disclosure, the term "include" and similar terms should be understood as open-ended inclusion, i.e., "including but not limited to". The term "based on" should be understood as "at least partly based on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first", "second", and so on may refer to different or identical objects. Other explicit and implicit definitions may also be included below.
As mentioned above, environment perception, decision planning, and/or control are important aspects of driving control of a vehicle, especially autonomous driving control. In the current autonomous driving field, the vehicle itself is required to perceive the surrounding environment and to have the computing power to process perception results and generate operation instructions. However, driving based on single-vehicle perception has significant limitations. For example, the sensing and computing devices installed on a vehicle are costly to build and maintain and have a low reuse rate. Constrained by the vehicle's position and field of view, the perception range of the vehicle's sensing devices is limited and easily occluded, making a global perception capability difficult to obtain. Moreover, performing driving decision planning and control from the perspective of a single vehicle can neither improve overall traffic efficiency nor solve vehicle passage problems in special situations.
According to embodiments of the present disclosure, an improved driving control solution is proposed. This solution implements driving control of a vehicle based on multi-side cooperation. Specifically, a device external to the vehicle generates at least one of a perception message, a decision-planning message, and a control message based on perception information related to the environment. The generated perception message, decision-planning message, and/or control message is provided to the vehicle, and the vehicle uses the received message to implement driving control. In some embodiments, a description related to a target object present in the environment, in particular a target object located outside the vehicle, can be determined based on the perception information, and the driving-related message is generated based on the description related to the target object. The autonomous driving of the vehicle can thereby be controlled relative to the target object, achieving more flexible and efficient driving control.
According to some example embodiments, an improved cooperative driving control solution is further proposed. Specifically, a remote device determines, based on perception information related to the environment, whether a target object is in an abnormal motion state. Upon determining that the target object is in an abnormal motion state, a driving-related message is generated to indicate the abnormal motion state of the target object. The generated driving-related message is provided to the vehicle for controlling the vehicle to travel on a motion trajectory that does not collide with the target object. Abnormal objects in the environment that may affect travel of the vehicle can thereby be detected more accurately and promptly, guiding the safe travel of the vehicle.
Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
Example Environment
Fig. 1 shows a schematic diagram of an example traffic environment 100 in which various embodiments of the present disclosure can be implemented. The example environment 100 includes one or more vehicles 130-1, 130-2, 130-3, collectively referred to as vehicles 130 for ease of description. As used here, a vehicle refers to any type of movable tool capable of carrying people and/or things. In Fig. 1 and the other drawings and description, the vehicles 130 are illustrated as road vehicles. A road vehicle may be a motor vehicle or a non-motor vehicle, examples of which include but are not limited to cars, sedans, trucks, buses, electric vehicles, motorcycles, bicycles, and so on. However, it should be understood that a road vehicle is merely one example of a vehicle; embodiments of the present disclosure are equally applicable to other vehicles, such as ships, trains, airplanes, and so on.
One or more vehicles 130 in the environment 100 may be vehicles with a certain autonomous driving capability, also referred to as driverless vehicles. Of course, one or more other vehicles 130 in the environment 100 may be vehicles without autonomous driving capability; such vehicles may be controlled by a driver. An integrated device or a removable device in one or more vehicles 130 may be capable of communicating with other devices based on one or more communication technologies, for example communicating with other vehicles or other devices via vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), vehicle-to-network (V2N), vehicle-to-everything (V2X), or any other communication technology.
A vehicle 130 may be equipped with a positioning device to determine its own position. The positioning device may, for example, implement positioning based on any of the following technologies: positioning based on laser point-cloud data, Global Positioning System (GPS) technology, Global Navigation Satellite System (GLONASS) technology, BeiDou navigation system technology, Galileo positioning system technology, Quasi-Zenith Satellite System (QZSS) technology, base-station positioning technology, Wi-Fi positioning technology, and so on.
In addition to the vehicles 130, other objects may be present in the environment 100, such as animals, plants 105-1, people 105-2, and movable or immovable objects such as traffic infrastructure. Traffic infrastructure includes objects that guide traffic and indicate traffic rules, such as traffic lights 150-3, traffic signs (not shown), street lamps, and so on. Objects located outside a vehicle are collectively referred to as off-vehicle objects 105. During driving control of a given vehicle 130, attention must be paid to off-vehicle objects that may affect the travel of the vehicle; such objects are referred to as target objects.
One or more external devices of the vehicles 130, such as a roadside device 110 and a remote device 120, are also deployed in the environment 100. External devices are generally arranged independently of the vehicles 130 and may be located remotely from them. Here, a device "independent of" a vehicle 130 refers at least to a device that does not move along with the vehicle 130. The roadside device 110 and the remote device 120 may be any devices, nodes, units, facilities, etc. with computing capability. By way of example, the remote device may be a general-purpose computer, a server, a mainframe server, a network node such as an edge computing node, a cloud computing device such as a virtual machine (VM), and any other device providing computing capability.
In embodiments of the present disclosure, a vehicle 130 may have a wired and/or wireless communication connection with the roadside device 110. For example, one or more vehicles 130 may have a communication connection 101 with the roadside device 110 and a communication connection 103 with the remote device 120, and a communication connection 105 may also exist between the devices 110 and 120. Note that each vehicle 130 may establish a communication connection with one of the devices 110 and 120, or may have communication connections with both.
Via the communication connections, the roadside device 110 and/or the remote device 120 can, through signaling exchange, take charge of or assist in controlling the driving of one or more vehicles 130. For example, a roadside subsystem 112 (also referred to as a "road subsystem") is integrated/installed/fixed in the roadside device 110, and an on-board subsystem 132 is integrated/installed/fixed in the vehicle 130-1. The roadside subsystem 112 and the on-board subsystem 132 communicate with each other to implement driving control of the vehicle 130. Different on-board subsystems 132 may also communicate with each other to implement cooperative driving control. Although not shown, in addition to the roadside device 110 and the vehicle 130-1, other roadside devices and vehicles may also be equipped with roadside subsystems 112 and on-board subsystems 132, respectively.
In some embodiments, the roadside device 110 may be deployed near a geographic area where vehicles 130 travel and/or park; for example, roadside devices may be deployed at certain intervals along both sides of a road, or at a predetermined distance from locations where vehicles 130 may appear. In some implementations, the communication connection 101 between a vehicle 130 and the roadside device 110 may be based on a short-range communication technology. Of course, the communication connection 101 may also be based on technologies enabling communication over other distances; the scope of the present disclosure is not limited in this respect.
In some embodiments, the remote device 120 may be networked computing infrastructure. For example, the remote device 120 may be deployed as a computing node in the cloud or in another network environment, such as a remote computing node, a server, or an edge computing device. In a cloud environment, the remote device 120 may sometimes also be referred to as a cloud device. The remote device 120 can provide relatively high computing capability, storage capability, and/or communication capability. The communication connection 103 between the remote device 120 and the roadside device 110 and/or the communication connection 105 with the vehicle 130 may be based on a long-range communication technology. Although it may not be physically located near the driving area of the vehicles 130, the remote device 120 can still obtain real-time and/or non-real-time control of the vehicles 130 through high-speed communication connections.
One or more sensing devices 107 are also arranged in the environment 100. The sensing devices 107 are arranged independently of the vehicles 130 and are used to monitor the conditions of the environment 100 to obtain perception information related to the environment 100. For example, a sensing device 107 can sense traffic participants in the environment 100, including but not limited to target objects such as vehicles 130, pedestrians, and cyclists, and detect information such as the type, position, speed, and direction of the target objects. The sensing devices 107 may also be referred to as roadside sensing devices. The roadside device 110, especially a roadside device 110 deployed near the driving environment, may have a wired and/or wireless communication connection with one or more sensing devices 107. Where the communication rate permits, the more distant remote device 120 may also communicate with one or more sensing devices 107, or relay via the roadside device 110. Perception information collected by sensing devices 107 arranged along the road may also be referred to as roadside perception information. The perception information of a sensing device 107 can be provided to a roadside device 110 with which it has a communication connection. Although shown as mutually independent devices, in some implementations the sensing devices 107 may also be partly or fully integrated with the devices 110, 120.
Multiple sensing devices 107 may be arranged at certain intervals to monitor a specific geographic range of the environment 100. In some examples, in addition to fixing sensing devices 107 at specific positions, movable sensing devices 107, such as movable sensing stations, may also be provided. Depending on its sensing capability, the perception range of a sensing device 107 is limited. Fig. 1 schematically shows the perception range 102 of a sensing device 107; objects or phenomena appearing within the perception range 102 can be sensed by that sensing device 107. In the example of Fig. 1, the off-vehicle objects 105-1 to 105-3 and the vehicles 130-1 and 130-2 are all located within the perception range 102 of the sensing device 107. The vehicle 130-3 is outside the perception range 102; it may be outside the perception range of any sensing device, or may be within the perception range of another sensing device not shown. In some cases, the perception ranges of multiple adjacently deployed sensing devices 107 may partially overlap.
To monitor the environment 100 related to the vehicles 130 more comprehensively, sensing devices 107 may be arranged near areas where the vehicles 130 travel and/or park. As needed, sensing devices 107 may be arranged at the roadside, on the road surface, or deployed at a certain height, for example fixed at some height by a support pole. A sensing device 107 may include one or more sensor units, which may be of the same type or different types and may be distributed at the same position or different positions within the perception range 102.
Examples of sensor units in a sensing device 107 may include, but are not limited to: image sensors (e.g., cameras), lidar, millimeter-wave radar, infrared sensors, positioning sensors, illumination sensors, pressure sensors, temperature sensors, humidity sensors, wind speed sensors, wind direction sensors, air quality sensors, and so on. Image sensors can collect image information; lidar and millimeter-wave radar can collect laser point-cloud data; infrared sensors can use infrared rays to detect environmental conditions; positioning sensors can collect position information of objects; illumination sensors can collect metrics indicating light intensity in the environment; pressure, temperature, and humidity sensors can collect metrics indicating pressure, temperature, and humidity, respectively; wind speed and wind direction sensors can collect metrics indicating wind speed and wind direction, respectively; air quality sensors can collect indicators related to air quality, such as oxygen concentration, carbon dioxide concentration, dust concentration, and pollutant concentration in the air. It should be understood that only some examples of sensor units are listed above; other types of sensors may also be present according to actual needs.
It should be understood that the facilities and objects shown in Fig. 1 are merely examples. The type, number, and relative arrangement of objects appearing in different environments may vary; the scope of the present disclosure is not limited in this respect. For example, the environment 100 may have more roadside devices 110 and sensing devices 107 deployed along the road to monitor additional geographic locations. Embodiments of the present disclosure may involve multiple remote devices 120, or may involve no remote device 120.
Example System
According to embodiments of the present disclosure, regardless of a vehicle's autonomous driving capability, devices external to the vehicle, including roadside devices or remote devices, are configured to work cooperatively with the vehicle to provide partial or full driving control of the vehicle. Partial or full driving control is implemented by providing perception messages, decision-planning messages, and/or control messages to the vehicle. The roadside subsystem of the external device and the on-board subsystem on the vehicle thereby implement cooperative driving control. Implementing the driving control functions of the vehicle with the assistance of external devices enables effective and safe autonomous driving.
Fig. 2 shows a schematic block diagram of a cooperative driving control system 200 according to some embodiments of the present disclosure. The system 200 involves the vehicle 130, the roadside device 110, and the roadside sensing device 107 of Fig. 1. In some embodiments, the remote device 120 may also perform driving control together with, or instead of, the roadside device 110, especially where the communication rate between the remote device 120 and the vehicle 130 permits. Therefore, the functions described below for the roadside device 110, in particular the roadside subsystem 112, may correspondingly be implemented at the remote device 120. In some embodiments, cooperative driving control may also involve traffic infrastructure in the driving environment, such as the traffic light 150-3.
As shown in Fig. 2, the roadside subsystem 112 of the roadside device 110 includes a roadside unit (RSU) 212 and a driving control module 214; the on-board subsystem 132 in the vehicle 130 includes an on-board unit (OBU) 232, a driving control module 234, and optionally one or more on-board sensing devices 236 arranged in association with the vehicle 130. The RSU 212 is configured to communicate with devices and/or components external to the roadside subsystem 112, such as other roadside subsystems 112, the remote device 120, and/or on-board subsystems 132; the OBU 232 is configured to communicate with devices and/or components external to the on-board subsystem 132, such as roadside subsystems 112, other on-board subsystems 132, and/or the remote device 120.
The driving control module 214 in the roadside subsystem 112 is configured to process information related to driving control and to generate driving-related messages to be transmitted by the RSU 212 for driving control of the vehicle 130. The driving control module 234 in the on-board subsystem 132 is configured to process information related to driving control, obtain driving-related messages, and control the driving of the vehicle 130 based on the information processing results and/or the obtained driving-related messages.
A driving-related message may have any format conforming to the communication technology used between the roadside subsystem 112 and the on-board subsystem 132. The driving-related message may include at least one of: a perception message 202 indicating environment perception results; a decision-planning message 204 indicating decision planning for the vehicle 130 while driving; and a control message 206 indicating specific driving operations of the vehicle 130 while driving. Taken in order, the information indicated by each of these three message types depends on the information contained in the preceding type. For example, decision planning is based at least on the environment perception results, and a control message is usually made on the basis of decision planning or of environment perception results. This means that if the perception message 202 and/or the decision-planning message 204 are sent to the on-board subsystem 132, the on-board subsystem 132 needs further processing to obtain a control message, thereby implementing driving control of the vehicle 130. The vehicle 130 may be equipped with a drive-by-wire system, including control of the vehicle's braking system, steering system, transmission system, body control, and so on, capable of controlling vehicle acceleration, deceleration, steering, lights, hazard lights, and so on.
From perception messages, to decision-planning messages, to control messages, the driving control of the vehicle 130 becomes progressively more explicit, and the requirements on the autonomous driving control capability on the side of the vehicle 130 decrease progressively. In interaction with the on-board subsystem 132, the specific type of driving-related message provided may depend on various triggers, such as at least one of a time-based trigger, a position-based trigger, and an event-based trigger.
A time-based trigger may be, for example, broadcasting periodically or sending one or more predetermined types of driving-related messages within a specific time period, for example sending perception messages. All vehicles 130 that have established a communication connection with the roadside device 110 and are within communication range can receive the broadcast driving-related messages, in particular the perception message 202.
A position-based trigger may, for example, trigger one or more predetermined types of driving-related messages based on the position of the vehicle 130. It may be determined whether the vehicle 130 is within a predetermined area (e.g., a road intersection), within a specific road section, and/or within a predetermined threshold distance from a reference object (e.g., the roadside device 110), and a predetermined type of driving-related message is sent when the condition is determined to be met. Under such trigger conditions, the perception message 202, the decision-planning message 204, and/or the control message 206 may be sent.
An event-based trigger may include, for example, a request from the on-board subsystem 132; the on-board subsystem 132 may be sent the specific type of driving-related message indicated by the request. Alternatively or additionally, an event-based trigger may also include detecting the occurrence of a predetermined event related to the vehicle 130. For example, upon detecting that the vehicle 130 is in a dangerous state, a fault state, a low-power state, etc., the decision-planning message 204 and the control message 206 may be sent to deal with the dangerous or fault state and reduce computing and power consumption at the vehicle 130. Other events triggering the transmission of one or more driving-related messages may also be defined.
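As a rough illustration, the three trigger types above can be folded into a single dispatch decision at the roadside subsystem. The following Python sketch is illustrative only: the message names, state fields, and broadcast period are assumptions for this example, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    position: tuple          # (x, y) in meters, roadside coordinate frame
    in_intersection: bool    # position-based condition: inside a predetermined area
    fault_detected: bool     # event-based condition: a predetermined vehicle event
    requested: set           # event-based condition: message types the OBU requested

def messages_to_send(state: VehicleState, tick: int, period: int = 10) -> set:
    """Return the set of driving-related message types to send on this tick."""
    out = set()
    if tick % period == 0:           # time-based: periodic broadcast
        out.add("perception")
    if state.in_intersection:        # position-based: predefined area
        out.update({"perception", "decision_planning"})
    if state.fault_detected:         # event-based: dangerous/fault state
        out.update({"decision_planning", "control"})
    out |= state.requested           # event-based: explicit OBU request
    return out
```

In a real deployment the conditions would come from the perception pipeline rather than pre-computed booleans; the point here is only that the three trigger families compose into one per-vehicle decision.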
The generation and specific content of each driving-related message will be described in detail below.
Perception Message
Specifically, the RSU 212 in the roadside subsystem 112 is configured to obtain detected perception information related to the environment 100 from one or more environment perception sources and to generate the driving-related messages 202, 204, 206 based on the perception information. The perception information may indicate one or more aspects of the environment 100 and of objects present in it, depending on the capability of the environment perception sources. For driving control, the perception information includes at least information related to objects present in the environment 100.
The environment perception sources may include one or more roadside sensing devices 107, which provide roadside perception information 220 to the roadside subsystem 112. Such roadside sensing devices can provide a better perception viewing angle and more comprehensive and accurate environment perception. Thanks to the roadside sensing devices 107, even when a vehicle 130 has no perception capability, has only limited perception capability, and/or has perception blind zones due to occlusion by other objects, it can still rely on roadside perception to accomplish autonomous driving functions, thereby achieving low-cost, safe autonomous driving.
Alternatively or additionally, the environment perception sources may also include on-board sensing devices 236 on vehicles 130, including the target vehicle 130 to be controlled and other vehicles 130, which provide vehicle perception information 224 to the roadside subsystem 112. The type, number, and/or sensing capability of the on-board sensing devices 236 may be the same as or different from those of the one or more roadside sensing devices 107. In some embodiments, the environment perception sources may also include the remote device 120, which can obtain perception information related to the environment 100 from a database and/or from other devices. By integrating perception information from different environment perception sources, cooperative perception of the environment 100 can be achieved, perceiving the environment 100 more comprehensively. Such cooperative perception can compensate for limitations of individual environment perception sources in one or more respects, such as perception accuracy, viewing angle, and perception type.
One or more environment perception sources can use corresponding types of sensing devices to monitor static and/or dynamic objects in the environment 100, such as pedestrians, cyclists, vehicles, and objects protruding from the road surface, and can also detect traffic facilities related to traffic passage, such as traffic lights and traffic signs. Alternatively or additionally, the environment perception sources may also monitor road surface conditions, road traffic conditions, meteorological conditions in the environment, geographic areas, monitoring and diagnosis of the target vehicle, and so on. The perception information obtained by the RSU 212 from the environment perception sources may be raw perception data collected by the sensing devices, partially processed perception data, and/or perception results. For example, perception information from the on-board subsystem 132 may be raw perception data, or perception data or perception results partially or fully processed by the driving control module 234.
The obtained perception information is provided to the driving control module 214, which is configured to perform analysis based on the perception information. Specifically, the driving control module 214 may be configured to identify, based on the perception information, objects that may appear in the environment 100 and determine information related to the identified objects. The driving control module 214 may also be configured to determine, based on the perception information, perception results for one or more other aspects of the environment 100 and/or of the vehicles 130 in the environment 100. The driving control module 214 may employ various data analysis techniques, such as data fusion techniques, to process the perception information. In some embodiments, by analyzing the perception information, the driving control module 214 can generate the perception message 202, which includes the analysis results of the perception information. The perception message 202 may be provided via the RSU 212 to the OBU 232 in the on-board subsystem 132.
At the vehicle 130, the driving control module 234 in the on-board subsystem 132 can perform driving control of the vehicle 130 based on the perception message 202. For example, the driving control module 234 may generate decision planning for its own vehicle based on the perception message 202 and control the driving operations of its own vehicle based on the generated decision planning. In some embodiments, if the vehicle 130 has environment perception capability, for example by integrating on-board sensing devices 236, and/or can obtain perception information from devices other than the roadside device 110 (for example other vehicles 130), the driving control module 234 may perform driving control of the vehicle 130 based on the perception message 202 received from the RSU 212 together with the other perception information. Perception messages/information from different sources may be processed through techniques such as data fusion. In some embodiments, if the vehicle 130 has no environment perception capability, the driving control module 234 may directly use the received perception message 202 as input to the decision control of the vehicle 130.
Depending on the source and specific content of the perception information, the perception message 202 may include indication information related to one or more other aspects of the environment 100 and/or of the vehicles 130 in the environment 100. In some embodiments, the perception message 202 may include information on one or more of the following aspects: a description related to a target object present in the environment 100, which may include, for example, at least one of the target object's classification, position information, speed information, direction information, physical appearance description information, specific type, motion state, duration of the motion state, perception confidence, historical motion trajectory information, predicted motion trajectory information, and state tracking information; information related to the physical condition of roads in the environment 100, indicating at least one of the road surface physical condition and the structured information of the road; information related to traffic facilities in the environment 100, indicating at least one of signal light states and traffic signs on the road; road traffic condition information in the environment 100, indicating at least one of lane markings, traffic flow, and traffic events related to the road and/or lanes in the road; and information related to meteorological conditions in the environment 100. In some embodiments, the perception message 202 may also include auxiliary information related to positioning of the vehicle 130, diagnostic information on faults of the vehicle 130, information related to over-the-air (OTA) upgrades of the software system in the vehicle 130, and/or time information.
In some embodiments, the perception message 202 may also include map information, which may indicate, for example, at least one of the identification of a map, the update mode of the map, the area of the map to be updated, and position information. The map information may indicate a high-precision map meeting the needs of autonomous driving scenarios. The identification of the map may include, for example, the version number and update frequency of the map. The update mode of the map may indicate, for example, the update source of the map (from the remote device 120 or from another external source), an update link, an update time, and so on.
In some embodiments, the perception message 202 may also include information related to a parking area of the vehicle 130. When the vehicle 130 has a parking intention, for example when it is detected that the vehicle 130 approaches a predetermined parking geographic area or initiates a parking request, the perception message 202 may include information related to the parking area of the vehicle 130. Of course, in some embodiments, information related to the parking area may always be included in the perception message 202 without judging parking intention. The information related to the parking area may include information about parking spots (e.g., parking spaces) in the parking area, target object information in the parking area, and so on. In some embodiments, the information related to the parking area may include, for example, the predetermined numbers of parking spots in the parking area, position information of the parking spots, descriptions of the physical state of the parking spots, the vacancy status of the parking spots, and/or entrance and exit information of the parking area.
In some embodiments, the information related to the parking area may, for example, be provided to the roadside device 110 by the remote device 120. The roadside device 110 includes this information in the perception message 202 provided to the vehicle 130, to assist the vehicle 130 in finding a suitable parking spot as soon as possible. In some embodiments, the roadside device 110 may also not include the parking-area-related information in the perception message 202, but instead generate subsequent decision-planning messages 204 and/or control messages 206 based on such information, to control the driving and/or parking of the vehicle 130 in the parking area.
For ease of understanding, some specific examples and descriptions of the different types of information in the perception message 202 are given in Table 1.1 below.
Table 1.1 Perception message (RSU→OBU)
Figure PCTCN2019103876-appb-000001
Figure PCTCN2019103876-appb-000002
Figure PCTCN2019103876-appb-000003
Figure PCTCN2019103876-appb-000004
Table 1.2 Types of target objects

| Type | Remarks |
| --- | --- |
| All objects | |
| Dynamic objects | Obstacles not marked in the high-precision map |
| Static objects | Obstacles marked in the high-precision map |
Table 1.3 Motion states of target objects

| Motion state | Remarks |
| --- | --- |
| Stationary | Speed is zero or below a threshold |
| Moving | Has a certain speed (above a threshold) |
| …… | Other motion states may be further subdivided by moving speed |
Table 1.4 Description information of the physical appearance of target objects
Figure PCTCN2019103876-appb-000005
Although some examples of the content of the perception message 202 are listed above, it should be understood that the perception message 202 may also include other content, including less or more content. In each communication, the perception message 202 provided by the roadside subsystem 112 to the on-board subsystem 132 may be all or part of the content in Tables 1.1 to 1.4 above, or may include other content. During transmission, the perception message 202 or its constituent elements may undergo, for example, data compression, to further reduce the amount of data transmitted between the roadside and the vehicle.
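Because the bodies of Tables 1.1 to 1.4 are published as images, only the field set described in the prose can be reconstructed here. The sketch below models a plausible subset of a perception message in Python; all field and type names are illustrative assumptions, not the message layout defined in the tables.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class TargetObject:
    obj_id: int
    obj_type: str                  # e.g. "dynamic" / "static", per Table 1.2
    position: tuple                # local (x, y) in meters, or (lon, lat)
    speed_mps: float
    heading_deg: float
    confidence: float              # perception confidence in [0, 1]
    motion_state: str = "moving"   # e.g. "stationary" / "moving", per Table 1.3

@dataclass
class PerceptionMessage:
    timestamp_ms: int
    targets: list = field(default_factory=list)

def encode(msg: PerceptionMessage) -> dict:
    """Serialize the message to a plain dict, e.g. for RSU -> OBU transmission."""
    return asdict(msg)  # asdict recurses into nested dataclasses
```

A real RSU→OBU message would be encoded into whatever wire format the V2X stack uses; the dict form here only makes the nested structure of "message → list of target-object descriptions" concrete.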
In some embodiments, the on-board subsystem 132 on one vehicle 130 may also provide perception data to on-board subsystems 132 on other vehicles 130 (for example, through communication connections between OBUs), for example in the format of the perception message 202. In one embodiment, the perception message 202 exchanged between on-board subsystems 132 may mainly include descriptions related to target objects, for example the target-object-related descriptions designed in Tables 1.1 to 1.4 above. The perception message exchanged between on-board subsystems 132 may, for example, be listed separately in Table 1.5 below.
Table 1.5 Perception message (OBU→OBU)
Figure PCTCN2019103876-appb-000006
Figure PCTCN2019103876-appb-000007
Decision-Planning Message
The decision-planning message 204 may be generated by the driving control module 214 of the roadside subsystem 112 based on the perception information. After processing the perception information and determining the perception results, the driving control module 214 generates the decision-planning message 204 based on the perception results. When performing decision planning, the driving control module 214 may determine to perform planning at the road or lane level, performing unified planning for the vehicles 130 on the same road or lane. In addition, the driving control module 214 may also perform planning at the vehicle level. With more comprehensive perception information of the environment 100, the roadside subsystem 112 can take into account the conditions of all vehicles or traffic participants within a geographic area and determine more reasonable decision planning.
In some embodiments, when it is determined that the roadside needs to intervene in the driving decision or planning of the vehicle 130, or when the roadside subsystem 112 takes over driving control of the vehicle 130 (for example, by geographic area, at the request of the vehicle 130, or after obtaining permission from the vehicle 130), the roadside subsystem 112 provides the decision-planning message 204 to the on-board subsystem 132. The decision-planning message 204 may, for example, be provided to the on-board subsystem 132 in real time or at small time intervals (which may depend on the moving speed of the vehicle 130).
At the vehicle 130, the driving control module 234 in the on-board subsystem 132 can perform driving control of the vehicle 130 based on the decision-planning message 204. For example, the driving control module 234 can, based on the decision-planning message 204, generate its own vehicle control information indicating specific driving operations of the vehicle 130. In some embodiments, the on-board subsystem 132, such as its driving control module 234, may determine whether to use the decision-planning message 204 to control the vehicle 130. If the decision-planning message 204 is not used, it may be discarded. If it is to be used, the driving control module 234 may fully comply with the decision planning for the vehicle 130 indicated in the decision-planning message 204, or alternatively may formulate local decision planning in combination with local policies and with reference to the decision planning indicated by the message 204. Embodiments of the present disclosure are not limited in this respect.
The roadside subsystem 112 can obtain global environment perception through the various environment perception sources, including but not limited to road traffic condition information, road physical state information, infrastructure information, and map information, and can thus use road resources more reasonably and effectively when performing driving planning for the vehicle 130, ensuring safer and more effective driving.
In some embodiments, the decision-planning message 204 may include information on one or more of the following aspects: an indication of the travel road to which the decision is to be applied; start time information and/or end time information of the decision planning; start position information and/or end position information of the decision planning; the identification of the vehicle targeted by the decision planning; decision information related to the driving behavior of the vehicle; decision information related to the driving actions of the vehicle; information on path-planning trajectory points; expected times of arrival at the path-planning trajectory points; information related to other vehicles involved in the decision planning; map information; time information; and so on. The map information may indicate, for example, at least one of the identification of a map, the update mode of the map, the area of the map to be updated, and position information. The identification of the map may include, for example, the version number and update frequency of the map. The update mode of the map may indicate, for example, the update source of the map (from the remote device 120 or from another external source), an update link, an update time, and so on.
For ease of understanding, some specific examples and descriptions of the different types of information in the decision-planning message 204 are given in Table 2 below.
Table 2. Decision-planning message (RSU→OBU)
Figure PCTCN2019103876-appb-000008
Figure PCTCN2019103876-appb-000009
Although some examples of the content of the decision-planning message 204 are listed above, it should be understood that the decision-planning message 204 may also include other content, including less or more content. In each communication, the decision-planning message 204 provided by the roadside subsystem 112 to the on-board subsystem 132 may be all or part of the content in Table 2 above, or may include other content. During transmission, the decision-planning message 204 or its constituent elements may undergo, for example, data compression, to further reduce the amount of data transmitted between the roadside and the vehicle.
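A decision-planning message carries path-planning trajectory points together with expected arrival times. One plausible way an OBU could consume them is to look up, at each control cycle, the waypoint it is expected to reach next. The representation below (a time-sorted list of `(arrival_time_s, (x, y))` pairs) is an assumption for illustration only.

```python
import bisect

def target_waypoint(traj, t):
    """Return the waypoint the vehicle should currently steer toward.

    traj: list of (expected_arrival_time_s, (x, y)) pairs sorted by time,
          as might be carried in a decision-planning message.
    t:    current time in seconds.
    Returns the first waypoint whose expected arrival time is >= t,
    or the final waypoint if t is past the end of the plan.
    """
    times = [pt[0] for pt in traj]
    i = bisect.bisect_left(times, t)
    return traj[min(i, len(traj) - 1)][1]
```

In practice the OBU would also interpolate between waypoints and feed the result to a tracking controller; the lookup above is only the selection step.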
Control Message
The control message 206 may be generated by the driving control module 214 of the roadside subsystem 112 based on the decision-planning information. The control message 206 may specify specific driving operations of the vehicle 130.
In some embodiments, when it is determined that the roadside needs to intervene in the driving decision or planning of the vehicle 130, or when the roadside subsystem 112 takes over driving control of the vehicle 130 (for example, by geographic area, at the request of the vehicle 130, or after obtaining permission from the vehicle 130), the roadside subsystem 112 provides the control message 206 to the on-board subsystem 132. The control message 206 may, for example, be provided to the on-board subsystem 132 in real time or at small time intervals (which may depend on the moving speed of the vehicle 130).
At the vehicle 130, the driving control module 234 in the on-board subsystem 132 can perform driving control of the vehicle 130 based on the control message 206. For example, the driving control module 234 can determine specific driving operations of the vehicle 130 based on the control message 206. In some embodiments, the on-board subsystem 132, such as its driving control module 234, may determine whether to use the control message 206 to control the vehicle 130. If the control message 206 is not used, it may be discarded. If it is to be used, the driving control module 234 may fully comply with the control operations for the vehicle 130 indicated in the control message 206, or alternatively may determine local control operations in combination with local policies and with reference to the control operations indicated by the message 206. Embodiments of the present disclosure are not limited in this respect.
The roadside subsystem 112 can obtain global environment perception through the various environment perception sources, including but not limited to road traffic condition information, road physical state information, infrastructure information, and map information, and can thus use road resources more reasonably and effectively when formulating control operations for the vehicle 130, ensuring safe travel of autonomous vehicles in situations with special requirements.
In some embodiments, the control message 206 may include information related to one or more of the following aspects: kinematic control information related to the motion of the vehicle; dynamic control information related to at least one of the vehicle's power system, transmission system, braking system, and steering system; control information related to the riding experience of occupants in the vehicle; control information related to the vehicle's traffic warning system; and time information.
For ease of understanding, some specific examples and descriptions of the different types of information in the control message 206 are given in Table 3 below.
Table 3. Control message (RSU→OBU)
Figure PCTCN2019103876-appb-000010
Figure PCTCN2019103876-appb-000011
Although some examples of the content of the control message 206 are listed above, it should be understood that the control message 206 may also include other content, including less or more content. In each communication, the control message 206 provided by the roadside subsystem 112 to the on-board subsystem 132 may be all or part of the content in Table 3 above, or may include other content. During transmission, the control message 206 or its constituent elements may undergo, for example, data compression, to further reduce the amount of data transmitted between the roadside and the vehicle.
Auxiliary Information from the Vehicle
In some embodiments, as mentioned above, the on-board subsystem 132 at the vehicle 130 may also provide vehicle perception information 224 to the roadside subsystem 112 for the driving control module 214 to use in generating driving-related messages. The vehicle perception information 224 may be sent directly by the OBU 232 to the RSU 212, or may be processed and a perception message in a certain format generated.
The perception message provided by the on-board subsystem 132 includes environment-related information detected by the on-board sensing devices 236, and the types of content in it may include one or more of the content types in the perception message 202 sent from the roadside subsystem 112 to the on-board subsystem 132. In some embodiments, the perception message provided by the on-board subsystem 132 may include a description related to a target object present in the environment 100, indicating at least one of the target object's type, position information, speed information, direction information, physical appearance description information, historical trajectory, and predicted trajectory. In some embodiments, the perception message provided by the on-board subsystem 132 may alternatively or additionally include information such as road traffic condition information in the environment; information related to meteorological conditions in the environment; auxiliary information related to positioning of the vehicle 130; diagnostic information on faults of the vehicle 130; information related to over-the-air upgrades of the software system in the vehicle 130; and/or information related to a parking area of the vehicle 130.
The information related to the parking area of the vehicle 130 may be detected by the on-board sensing devices 236 when the vehicle 130 approaches the parking area. Given the characteristics of the parking area, the on-board subsystem 132 may, based on the perception information of the sensing devices 236, determine the predetermined numbers of parking spots in the parking area, position information of the parking spots, and/or the vacancy status of the parking spots, as the information related to the parking area. With the parking-area-related information provided by the on-board subsystem 132, the driving control module 214 in the roadside subsystem 112 can more accurately control the driving and/or parking of the vehicle 130 in the parking area.
For example types of the specific content contained in the perception message from the vehicle 130 (e.g., the OBU 232) to the roadside device 110 (the RSU 212), reference may be made to the perception message 202 shown in Tables 1.1 to 1.4. Of course, it should be understood that not all the types listed for the perception message 202 need be included, and in some embodiments other types of information may also be indicated.
In some embodiments, in addition or as an alternative to the vehicle perception information 224, the roadside subsystem 112 may also obtain other types of information that the vehicle 130 can provide (for example, received by the RSU 212 from the OBU 232). The obtained information can serve as auxiliary information for the driving control module 214 of the roadside subsystem 112 to determine the decision-planning message 204 and/or the control message 206. Such auxiliary information may be generated and transmitted in any message format.
In some embodiments, the roadside subsystem 112 may, for example, obtain real-time operation information of the vehicle 130. The real-time operation information is related to the current operating conditions of the vehicle 130 and may include, for example, at least one of position information, travel direction information, travel route information, travel speed information, operation state information, and component state information of the vehicle 130. In some embodiments, the auxiliary information obtained from the on-board subsystem 132 may also include auxiliary planning information of the vehicle 130, indicating the planning status of the vehicle 130. The auxiliary planning information may include, for example, at least one of an indication of the travel intention of the vehicle 130, planned travel route information, and speed limit information.
Alternatively or additionally, the auxiliary information obtained from the on-board subsystem 132 may also include body information of the vehicle 130, describing the physical state, appearance state, etc. of the vehicle 130 itself. The body information may include, for example, at least one of the identification, type, descriptive information, current travel route information, and fault-related information of the vehicle 130. In addition, the auxiliary information may, for example, also include at least one of a special vehicle type and a right-of-way requirement level.
For ease of understanding, some specific examples and descriptions of the content indicated by the auxiliary information are given in Table 4 below.
Table 4. Auxiliary information (OBU→RSU)
Figure PCTCN2019103876-appb-000012
Figure PCTCN2019103876-appb-000013
Figure PCTCN2019103876-appb-000014
Figure PCTCN2019103876-appb-000015
Although some examples of the content of the auxiliary information from the vehicle are listed above, it should be understood that the auxiliary information may also include other content, including less or more content. In each communication, the auxiliary information provided by the on-board subsystem 132 to the roadside subsystem 112 may be all or part of the content in Table 4 above, or may include other content. During transmission, the auxiliary information from the vehicle or its constituent elements may undergo, for example, data compression, to further reduce the amount of data transmitted between the roadside and the vehicle.
The cooperative driving control system and some examples of message interactions and message compositions therein have been described above. In different scenarios, the message interactions between the roadside subsystem 112 and the on-board subsystem 132 for implementing cooperative driving control may differ, and other information interactions may also exist. Implementations of cooperative driving control on the roadside, the vehicle side, and possibly also the cloud will be described below with reference to some example scenarios.
Example Scenario at an Intersection
Generally, where there are traffic lights, a vehicle with autonomous driving capability can obtain the current signal-light state by perception means and pass through the intersection according to the "stop on red, go on green" traffic rule. At an intersection without signal-light control, each autonomous vehicle may only be able to rely on its own decision control, which leads to low traffic efficiency, for example due to continuous "gaming" among the individual autonomous vehicles.
In some embodiments of the present disclosure, in environments with complex traffic, for example at road junctions, on road sections where congestion frequently occurs, and on road sections identified by traffic authorities as accident-prone, an external device, for example the roadside device 110, can perceive the road traffic conditions of the surrounding roads based on roadside perception information and possibly perception information from other sources, and can perform global driving control of vehicle passage according to the global road traffic conditions, thereby improving the passage strategy and achieving more effective, reasonable, and safe driving control.
In some embodiments, the roadside subsystem 112 of the roadside device 110 may send decision-planning messages and/or control messages to vehicles 130 on the road to guide their passage. Such a decision-planning message may be specific to one or more lanes in the environment 100, or specific to a vehicle 130. If a control message is to be sent, the control message will be specific to a vehicle 130.
Fig. 3A shows cooperative driving control in an example scenario 310 at an intersection. In this scenario, multiple vehicles 130-1 to 130-8 are driving toward the intersection, and the roadside subsystem 112 can implement lane-level decision planning.
In some examples, in such a scenario, the OBU 232 of the on-board subsystem 132 of one or more vehicles 130 may send vehicle perception information 224 to the roadside subsystem 112 of the roadside device 110, and may also send one or more items of auxiliary information such as real-time operation information, auxiliary planning information, body information, and special vehicle indication information. Within communication range, the RSU 212 of the roadside subsystem 112 receives the information sent by the OBUs 232 and also receives information from the roadside sensing devices 107 and possibly from the remote device 120, and can thereby obtain global environment information.
Since an external device, for example the roadside device, can obtain comprehensive environment perception information from roadside sensing devices and possibly from other environment perception sources, including perception data on pedestrians, vehicles, cyclists, and road surface information in this environment, the environment perception capability can be further improved, thereby improving the accuracy of subsequent decision planning and control. In some embodiments, the on-board subsystem 132 may also indicate specific needs of the vehicle 130 to the roadside subsystem 112 through a perception message. Such a perception message may include, for example, auxiliary information of the vehicle 130, such as the auxiliary information listed in Table 4.
The roadside subsystem 112 (e.g., the driving control module 214) can then determine driving-related messages based on the collected perception information. If the generated driving-related message includes a perception message 202, the environment perception results indicated in the perception message 202 are likely to be more accurate and comprehensive than those sensed by a single vehicle 130.
In some embodiments, since traffic passage at an intersection may involve global planning of multiple vehicles, the driving control module 214 of the roadside subsystem 112 may generate the decision-planning message 204 based on information from multiple sources, performing decision planning at the road, lane, or vehicle level, so as to achieve effective and reasonable intersection passage. In one embodiment, the driving control module 214 generates a lane-specific decision-planning message; for example, the decision-planning message may indicate different rights of way prescribed for different lanes, i.e., the passage rights of the vehicles 130 in a lane, and may also indicate the start time and end time of the corresponding right of way. Lane right-of-way information may be obtained, for example, from the remote device 120, for example from a system of a traffic management authority. For lanes that may change dynamically over a period of time, such as tidal (reversible) lanes, the latest right-of-way information of the lane needs to be obtained dynamically in order to perform correct decision planning and control.
As an example, in Fig. 3A, the roadside subsystem 112 determines that the lane in which the vehicles 130-1 and 130-2 are located has the right of way to go straight at the intersection, and also prescribes the start time and end time for going straight.
Some specific examples and descriptions of the information in a decision-planning message used to implement lane-level control are given in Table 5 below.
Table 5. Decision-planning message (RSU→OBU)
Figure PCTCN2019103876-appb-000016
Figure PCTCN2019103876-appb-000017
The decision-planning message may be delivered to the on-board subsystem 132 upon determining that the vehicle 130 is within the time and starting position of roadside control. In some embodiments, a lane-specific decision-planning message may be broadcast; after receiving it, a vehicle 130 determines whether to use the received decision-planning message according to the lane it is in. In some embodiments, the roadside subsystem 112 may first determine that a specific vehicle 130 is in a certain lane, and then send the lane-specific driving-related message toward that vehicle 130.
After receiving the decision-planning message 204, the vehicle 130 can be controlled according to the decision-planning strategy indicated in the decision-planning message 204, ensuring that the vehicle 130 drives in accordance with the decision-planning strategy of the roadside subsystem 112. For example, in Fig. 3A, after receiving the decision-planning message, the vehicles 130-1 and 130-2 can start going straight through the intersection at the start time indicated by the message. When the end time indicated in the decision-planning message is reached, the vehicles 130 can stop driving through the intersection.
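A lane-level right-of-way entry as described for Table 5 pairs a permitted movement with a start and end time. A minimal sketch of how an OBU might check whether its lane currently has the right to proceed is shown below; the field names are assumptions, since the table body is published as an image.

```python
from dataclasses import dataclass

@dataclass
class LaneRightOfWay:
    lane_id: int
    movement: str      # permitted movement, e.g. "straight" or "right_turn"
    start_s: float     # right-of-way start time, seconds
    end_s: float       # right-of-way end time, seconds

def may_proceed(plan: LaneRightOfWay, lane_id: int, movement: str, now_s: float) -> bool:
    """True if a vehicle in `lane_id` may perform `movement` at time `now_s`."""
    return (plan.lane_id == lane_id
            and plan.movement == movement
            and plan.start_s <= now_s < plan.end_s)
```

With a message like `LaneRightOfWay(2, "straight", 10.0, 40.0)`, a vehicle in lane 2 would begin going straight at t = 10 s and stop entering the intersection at t = 40 s, matching the start/end-time behavior described for Fig. 3A.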
In some embodiments, a driving-related message may be specific to a vehicle 130. For example, a perception message may include descriptions of target objects relevant to the vehicle 130, and decision-planning messages may each be determined in a customized manner for an individual vehicle 130. A control message is entirely used to control the driving of a specific vehicle 130. Thus, after receiving a specific driving-related message, the on-board subsystem 132 on the vehicle 130 can perform driving control accordingly. In some embodiments, the driving control module 214 may directly generate the control message 206 to indicate specific control operations for the vehicle 130, and the RSU 212 then provides the control message 206 to the OBU 232.
As an example, Fig. 3B shows cooperative driving control in an example scenario 312 at an intersection. In this scenario, multiple vehicles 130-1 to 130-4 are driving toward the intersection, and the roadside subsystem 112 can implement vehicle-level decision planning. Specifically, based on perception information from multiple sources, the driving control module 214 of the roadside subsystem 112 can plan for the vehicle 130-1 to turn right and for the vehicle 130-3 to go straight through the intersection. The driving control module 214 then generates a decision-planning message specific to the vehicle 130-1 to guide it to perform the right-turn operation. In addition, the driving control module 214 also generates a decision-planning message specific to the vehicle 130-3 to guide it to go straight.
In some embodiments, the roadside subsystem 112 may also control the roadside traffic light 150-3 to cooperate in implementing the passage strategy. In some embodiments, the decision planning of the roadside subsystem 112 can also enable safe and efficient passage of vehicles on roads without traffic lights.
Example Scenario of Taking Over a Vehicle
Application of the decision-planning message 204 or the control message 206 means that partial or full driving control of the vehicle 130 is to be performed by an external device, for example the roadside device 110. In some embodiments, such partial or full driving control is enabled only upon determining that the roadside device 110 can take over the vehicle 130. Takeover of the vehicle 130 may result from many causes; for example, the vehicle 130, especially a vehicle 130 in autonomous driving mode, may be unable to cope with the current driving scenario, for example due to a software and/or hardware fault, or may require the cooperative operation of other vehicles in order to drive normally (for example, stopping because it is hemmed in by surrounding vehicles), and so on. In such cases, intervention by the roadside device or other vehicles can help the vehicle 130 get out of its predicament without human intervention, which improves the automatic operation capability of autonomous vehicles. Of course, in some cases the vehicle 130 may request takeover by the roadside device 110 for any other reason, for example to implement automatic parking, and so on.
Fig. 3C shows cooperative driving control in an example scenario 320 of taking over a vehicle 130. In this scenario, the vehicle 130-1 is blocked by surrounding vehicles or has a fault, for example stopping autonomous driving in the road. In response to such an event, the on-board subsystem 132 in the vehicle 130-1 may send a takeover request message to the roadside subsystem 112 to request takeover of the driving control of the vehicle 130. The driving control module 214 of the roadside subsystem 112 can thereby provide the control message 206 to the vehicle 130-1. In some embodiments, if part of the driving control modules of the vehicle 130-1 fails, the takeover request message may request only partial takeover of the driving control of the vehicle. In response to such a takeover request message, the driving control module 214 may provide the decision-planning message 204 so that the vehicle 130-1 can drive according to the decision planning. Of course, in this case the driving control module 214 may also provide the control message 206 to provide full control of the driving of the vehicle 130-1.
Fig. 3D also shows cooperative driving control in an example scenario 330 of taking over a vehicle 130. In this scenario, the vehicle 130-1 is blocked by other surrounding vehicles 130-2 and 130-3 and cannot leave the predicament through its own autonomous driving decisions. At this point, the on-board subsystem 132 in the vehicle 130-1 may send a takeover request message to the roadside subsystem 112 to request partial or full takeover of the driving control of the vehicle 130-1. Upon determining to take over the driving control of the vehicle 130-1 partially or fully, the roadside subsystem 112 sends a takeover response message to the on-board subsystem 132, indicating whether the roadside at least partially takes over the driving control of the vehicle 130-1.
In some embodiments, the takeover request message indicates identification information of the vehicle, travel plan information of the vehicle, the reason for requesting takeover, and time information. The identification information of the vehicle is used to enable the roadside device 110 to identify and/or locate the vehicle 130; it may include, for example, the vehicle's identification, such as the VIN code or license plate number, and may also include the vehicle's type, size, descriptive information, position information, descriptions of surrounding objects, and so on. The travel plan information indicates the travel plan of the vehicle 130, including but not limited to the current travel direction, destination, planned travel route, and speed limit information (for example, maximum allowed speed and/or acceleration). The reason for requesting takeover may indicate, for example, why the vehicle 130 requests takeover, for example a fault of the vehicle itself, a specific need such as automatic parking, failure of the autonomous driving strategy, and so on. Some examples of the information contained in the takeover request message provided by the on-board subsystem 132 to the roadside subsystem 112 are shown in Table 6 below.
Table 6. Takeover request message (OBU→RSU)
Figure PCTCN2019103876-appb-000018
Figure PCTCN2019103876-appb-000019
In some embodiments, the takeover response message provided by the roadside subsystem 112 to the on-board subsystem 132 indicates whether the vehicle is at least partially taken over and the start time of taking over the vehicle, as listed in Table 7.1 below.
Table 7.1. Takeover response message (RSU→OBU)
Figure PCTCN2019103876-appb-000020
Upon determining to at least partially take over the driving control of the vehicle 130-1, the driving control module 214 of the roadside subsystem 112 can thereby provide the decision-planning message 204 and/or the control message 206 to the vehicle 130-1 for driving control. In addition, the driving control module 214 may determine, from the perception information received through the various perception sources, that the smooth driving of the vehicle 130-1 requires the cooperation of one or more other vehicles 130. For example, the driving control module 214 determines that the vehicle 130-1 cannot bypass the vehicle 130-2 to proceed and that the vehicle 130-2 needs to yield some space. The driving control module 214 thus also generates another decision-planning message and/or another control message based on the available information. The RSU 212 then provides the generated decision-planning message and/or control message to the vehicle 130-2. The vehicle 130-2 can perform corresponding driving actions according to the received message, cooperating with the vehicle 130-1 to get out of its currently blocked state.
In embodiments involving parking, the roadside subsystem 112 may send a perception message to the on-board subsystem 132 to indicate specific information within the parking area. The perception message may indicate descriptions related to target objects in the parking area, parking space numbers, parking space geographic locations, parking space descriptions, and so on. Some examples of the content of such a perception message are listed in Table 7.2 below.
Table 7.2. Perception message in a parking case (RSU→OBU)
Figure PCTCN2019103876-appb-000021
In embodiments involving parking, the decision-planning message sent by the roadside subsystem 112 may also specifically indicate driving decisions or planning for the vehicle 130 within the parking area. Some example content of the decision-planning message is given in Table 7.3 below.
Table 7.3. Decision-planning message in a parking case (RSU→OBU)
Figure PCTCN2019103876-appb-000022
Figure PCTCN2019103876-appb-000023
In some embodiments, upon detecting a need for takeover or when facing an emergency, the roadside device 110, or another traffic management node that may request or require the roadside device 110 to do so, can also actively take over one or more vehicles 130. In some embodiments, if, in response to a takeover request message or on its own initiative, the roadside subsystem 112 determines to at least partially take over the driving control of a vehicle 130, the RSU 212 provides a takeover notification message to the vehicle 130. The takeover notification message indicates information related to the takeover, which may include, for example but without limitation, one or more of the start time of the takeover, the end time of the takeover, the type of takeover, the reason for the takeover, and the takeover strategy.
In some embodiments, extrication of a vehicle 130 may also be achieved without relying on the roadside device 110 or the roadside subsystem 112, but through interaction between vehicles 130. Specifically, for example in the example of Fig. 3D, the vehicle 130-1 may send an extrication request message to surrounding vehicles, for example the vehicles 130-2 and 130-3. The extrication request message may include, but is not limited to: identification information of the vehicle 130-1, the type of the vehicle 130-1, position information of the vehicle 130-1, travel direction information, travel route information, travel speed information, and/or position information of surrounding vehicles affecting the travel of the vehicle 130-1, and so on.
Upon receiving the extrication request message, the vehicles 130-2 and 130-3 can determine whether their own vehicle would affect the vehicle 130-1. If the vehicle 130-2 or 130-3 determines that it may affect the vehicle 130-1, it can adjust its own travel planning strategy and give way to the vehicle 130-1 to assist it in getting out of the predicament. The extrication response message of the vehicle 130-2 or 130-3 includes at least one of: identification information of the vehicle 130-2 or 130-3, an indication of whether the vehicle 130-2 or 130-3 gives way to the requesting vehicle, the planned path of the vehicle 130-2 or 130-3, and the planned speed of the vehicle 130-2 or 130-3, and so on.
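In the simplest case, a surrounding vehicle's decision of whether it "would affect" the trapped vehicle can be reduced to a geometric check: is it within the trapped vehicle's requested clearance, roughly in the direction the trapped vehicle intends to travel? The geometry and the 5 m default below are illustrative assumptions, not criteria from the disclosure.

```python
import math

def should_yield(own_pos, trapped_pos, heading_deg, clearance_m=5.0):
    """Decide whether the receiving vehicle blocks the trapped vehicle.

    own_pos / trapped_pos: (x, y) positions in meters.
    heading_deg: the trapped vehicle's intended travel direction
                 (0 deg = +x axis, counterclockwise positive).
    Yield if we are within `clearance_m` of the trapped vehicle and
    roughly ahead of it along its intended direction.
    """
    dx = own_pos[0] - trapped_pos[0]
    dy = own_pos[1] - trapped_pos[1]
    if math.hypot(dx, dy) > clearance_m:
        return False
    # Project our offset onto the trapped vehicle's intended direction:
    hx = math.cos(math.radians(heading_deg))
    hy = math.sin(math.radians(heading_deg))
    return dx * hx + dy * hy > 0.0
```

A real on-board planner would additionally check lane geometry and its own feasible escape maneuvers before answering in the extrication response message.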
Some examples of the information contained in the extrication request message and the extrication response message are listed in Tables 8 and 9 below, respectively.
Table 8. Extrication request message (OBU→OBU)
Figure PCTCN2019103876-appb-000024
Table 9. Extrication response message (OBU→OBU)
Figure PCTCN2019103876-appb-000025
Figure PCTCN2019103876-appb-000026
Example Scenario of Map Update
As discussed above, the perception message 202 and/or the decision-planning message 204 may include map information, because when performing specific driving control the on-board subsystem 132 needs to rely on a map of the environment, in particular a high-precision map, to guide the driving of the vehicle. As mentioned above, the map information may indicate at least one of the identification of a map, the update mode of the map, the area of the map to be updated, and position information.
In some embodiments, the roadside subsystem 112 may broadcast a map version message, for example periodically. The map version message includes map version information, the geographic location targeted by the map version, a description of the map, and so on.
In some embodiments, a map update at the vehicle 130 may occur when the driving control module 234 on the vehicle side or the driving control module 214 on the roadside detects a need for a map update, and/or in response to a map update request message from the vehicle 130, for example from the OBU 232; the RSU 212 may then include map information in the generated perception message 202 and/or decision-planning message 204. For example, the on-board subsystem 132 may determine to request a map update after receiving the map version message. The map update request message may indicate time information, the identification of the vehicle 130, the type of the vehicle 130, descriptive information of the vehicle 130, position information of the vehicle 130, map version information, the area for which the map update is requested, and the map update mode.
After receiving the map update request message, the roadside subsystem 112 sends map data to the vehicle according to the corresponding needs of the vehicle 130. The map data may include the map version, the geographic location targeted by the map version, the area corresponding to the map data, the update mode corresponding to the map data, and so on.
Some examples of the content of the map version message, the map update request message, and the map data are listed in Tables 10.1, 10.2, and 10.3 below, respectively.
Table 10.1. Map version message (RSU→OBU)

| Indication information | Remarks |
| --- | --- |
| Time information | The moment the message is sent |
| Geographic location information | The geographic location targeted by the map version |
| Map version information | The version of the map |
| Map description | Description related to the map |
Table 10.2. Map update request message (OBU→RSU)
Figure PCTCN2019103876-appb-000027
Figure PCTCN2019103876-appb-000028
Table 10.3. Map data (RSU→OBU)
Figure PCTCN2019103876-appb-000029
Although some examples of the content of the map update request message are listed above, it should be understood that the map update request message may also include other content, including less or more content. In each communication, the map update request message provided by the on-board subsystem 132 to the roadside subsystem 112 may be all or part of the content in Table 10.2 above, or may include other content. During transmission, the map update request message or its constituent elements may undergo, for example, data compression, to further reduce the amount of data transmitted between the roadside and the vehicle.
Example Scenario of Traffic Condition Identification
In some embodiments, when driving in real road conditions, knowing the traffic conditions of the road section ahead in advance can better assist a vehicle in planning its path. Traffic condition identification based on roadside perception means that, in a mixed traffic environment, roadside sensing devices and on-board sensing devices continuously perceive surrounding road traffic information; after processing by the roadside subsystem, the traffic flow and congestion status of the current road section are identified in real time, and the identified traffic conditions are sent to vehicles as perception messages to assist them in making correct decision control. For example, based on the traffic condition information, the on-board subsystem can determine the best travel path for the vehicle, achieving safe and efficient travel.
In some embodiments, the driving control module 214 of the roadside subsystem 112 generates a perception message that includes traffic condition information of the road on which the vehicle is located. The traffic condition information includes at least one of the following: the density of traffic participants on a section of the road, the average speed of traffic participants on the section, a lane-level congestion description of the section, the starting position of the section, and the duration of the current traffic condition. The perception message may be broadcast through the RSU 212 of the roadside subsystem 112 or provided to vehicles 130 through other communication connections. The on-board subsystem 132 of the vehicle 130 can thereby combine the traffic condition information with other perception information or perception results available to it to determine the travel planning and control strategy of the vehicle 130. For example, if the on-board subsystem 132 learns from the roadside subsystem 112 that the road section ahead is congested, it can determine the travel plan as changing lanes in advance to avoid the congestion ahead.
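The lane-level congestion description above can feed directly into an early lane-change decision. The sketch below is an illustrative assumption: congestion is modeled as a number in [0, 1] per lane, and a small hysteresis margin avoids changing lanes for marginal gains. Neither the scale nor the 0.2 margin comes from the disclosure.

```python
def pick_lane(lane_congestion, current_lane, margin=0.2):
    """Choose a target lane from lane-level congestion information.

    lane_congestion: dict mapping lane_id -> congestion level
                     (0.0 = free-flowing .. 1.0 = fully jammed).
    current_lane: the lane the vehicle currently occupies.
    Stay in the current lane unless another lane is better by more
    than `margin`, to avoid needless lane changes.
    """
    best = min(lane_congestion, key=lane_congestion.get)
    if lane_congestion[current_lane] - lane_congestion[best] > margin:
        return best
    return current_lane
```

For example, a vehicle in a jammed lane (0.9) next to a free lane (0.3) would change lanes in advance, while a 0.4-vs-0.3 difference would not trigger a change.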
Some examples of the perception message provided by the roadside subsystem 112 to the on-board subsystem 132 in some embodiments are given in Table 11 below.
Table 11. Perception message (traffic condition information) (RSU→OBU)
Figure PCTCN2019103876-appb-000030
Figure PCTCN2019103876-appb-000031
Example Scenario of Remote Driving
In some embodiments, a vehicle 130 with autonomous driving capability can be monitored remotely during operation, and when needed the vehicle can be driven remotely by a person or another operator. This is required, for example, when the vehicle must complete complex tasks, or when, due to limitations in the vehicle's own perception and processing, remote human driving is needed. In such implementations, the vehicle 130 can support switching between remote driving and autonomous driving.
Typically, a remote actuator mechanism can be provided that simulates the actuator mechanisms used for driving on an ordinary vehicle. The actuator mechanisms may include components in one or more systems of the vehicle, such as the power system, transmission system, braking system, steering system, and feedback system. Such a remote actuator mechanism may be implemented in software, for example operated through a driving simulator or a graphical interface; it may also be partly or fully implemented in hardware. When the vehicle 130 enters remote control mode, driving operations on the remote actuator mechanism are synchronized to the local actuator mechanisms of the vehicle 130. That is, the driving operations of the vehicle 130 are initiated and controlled by the remote actuator mechanism and mapped to the local actuator mechanisms. The operations on the remote actuator mechanism may, for example, be made by a person or by another operator. Such remote control may be initiated, for example, when the vehicle 130 itself cannot perform driving control or is in a stalled state.
In some embodiments, when needed, the remote actuator mechanism or a terminal device at the remote actuator mechanism can initiate a remote driving request message for the vehicle 130. The remote driving request message may indicate the start and/or end time and/or position of remote driving, and the message also includes time information. The remote driving request message may, for example, be sent to the roadside device 110, for example the roadside subsystem 112. The RSU 212 of the roadside subsystem 112 can send the remote driving request message to the OBU 232 of the on-board subsystem 132.
Upon receiving permission or confirmation of remote driving from the on-board subsystem 132 of the vehicle 130, the roadside device 110 can initiate control messages 206 based on remote driving. Specifically, the roadside subsystem 112 determines the remote actuation operations on the remote actuator mechanism associated with the vehicle 130, and can convert the remote actuation operations into control messages 206 based on parameters related to the local actuator mechanisms of the vehicle 130, so as to control the local actuator mechanisms to perform the same operations as the remote actuation operations. The parameters related to the local actuator mechanisms of the vehicle 130 may be obtained, for example, from a storage device of the remote device 120 or directly from the on-board subsystem 132.
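The conversion step above — turning remote actuation operations into a control message using the local actuator's parameters — can be illustrated as a simple scaling from a normalized remote command into vehicle-specific actuator ranges. The normalized [-1, 1] convention and the example limits are assumptions for this sketch, not specified by the disclosure.

```python
def map_remote_command(remote_cmd, local_limits):
    """Convert a remote driving command into a local control message.

    remote_cmd:   dict of normalized axis values in [-1, 1] from the
                  remote driving station, e.g.
                  {"steer": 0.5, "throttle": 0.2, "brake": 0.0}.
    local_limits: per-vehicle actuator parameters, e.g.
                  {"steer": 470.0,     # max steering-wheel angle, deg
                   "throttle": 100.0,  # percent
                   "brake": 100.0}     # percent
    Returns a dict scaled to the local actuator's units.
    """
    out = {}
    for axis, value in remote_cmd.items():
        v = max(-1.0, min(1.0, value))   # clamp malformed remote input
        out[axis] = v * local_limits[axis]
    return out
```

In a real system the conversion would also account for actuator rate limits and latency compensation; the scaling above only shows why the local actuator parameters are needed to build the control message.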
In some embodiments, if it is determined that remote driving has ended, the roadside subsystem 112 can provide an indication of the end of remote driving to the on-board subsystem 132 of the vehicle 130. At this point, the vehicle 130 returns to normal driving mode.
Example Scenario of Detecting Abnormally Moving Objects
In autonomous driving scenarios, attention must be paid to various other types of objects in the environment of the vehicle, such as their positions, motion states, and other descriptions, to assist the vehicle in making correct driving decisions and control. According to embodiments of the present disclosure, in the case of cooperative driving control, a device external to the vehicle 130, such as the roadside device 110 or the on-board subsystem 132 on another vehicle 130, determines whether a target object is in an abnormal motion state based on perception information related to the environment 100, in particular perception information related to the target object in the environment 100. If a target object in an abnormal motion state is detected, one or more of the driving-related messages provided to the vehicle 130, such as the perception message 202, the decision-planning message 204, and the control message 206, will indicate the abnormal motion state of the target object. Based on such driving-related messages, the driving of the corresponding vehicle 130 can be controlled so that the vehicle 130 travels on a motion trajectory that does not collide with the target object.
In the driving environment of the vehicle 130, for example on the travel road, target objects that may threaten the driving safety of the vehicle 130 may include persistently stationary target objects, target objects whose moving direction conflicts with that of the vehicle 130, target objects whose moving speed significantly mismatches that of the vehicle 130, and so on. In a real driving environment, perception blind zones often exist due to occlusion by other objects, or the vehicle's own perception range is limited, so the vehicle 130 itself may find it difficult to quickly, accurately, and completely identify abnormal target objects in a mixed traffic environment. In some embodiments, the perception capability and/or computing capability of the roadside device 110 (e.g., the roadside subsystem 112) and of other vehicles 130 (e.g., their on-board subsystems 132) can assist the driving control of the vehicle 130, providing driving-related messages to the vehicle 130 to indicate target objects in abnormal motion states.
The target objects of interest may be various types of objects that may be present in the environment 100, especially objects on or near the travel road of the vehicle 130, such as various types of vehicles, people, animals, and other objects.
In some embodiments, when determining whether a target object in the environment 100 is in an abnormal motion state, it may be desirable to detect whether an object is in an abnormally stationary state. Here, "abnormally stationary state" refers to remaining stationary continuously for a period of time, which may be a period longer than a certain threshold. Abnormally stationary objects, especially objects stopped on the road, need to be promptly and accurately notified to the driving control systems of vehicles 130 in motion, so that correct driving decisions and control can be made.
In one embodiment, whether a target object has remained stationary during a period of time can be determined based on multiple sets of information related to the target object sensed at multiple time points within that period. Since perception information over a period of time is needed to judge the stationary state of the target object, the perception capability of a single vehicle 130 may be insufficient for such a judgment, so the perception capability and/or computing capability of the roadside or of other vehicles is needed.
Some or all of the multiple information sets at the multiple time points may be sensed and periodically reported by roadside sensing devices 107, or may come partly or wholly from on-board sensing devices 236 on one or more other vehicles 130 passing by the stationary target object. In this case, if the judgment of the target object's motion state is performed by the on-board subsystem 132 of a certain vehicle, that on-board subsystem 132 can obtain the multiple information sets at the multiple time points from the roadside subsystem 112 or from the remote device 120. By collecting information about the target object over a period of time, the roadside subsystem 112 or the relevant on-board subsystem 132 can determine whether the target object has remained stationary throughout. If the target object has been stationary for the whole period of time, it can be determined that the target object is in an abnormally stationary state.
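The stationarity judgment above reduces to checking a time series of observations for a tracked object. A minimal sketch in Python, assuming each observation carries a timestamp and an estimated speed (the 0.1 m/s speed tolerance is an illustrative threshold, matching the "zero or below a threshold" definition of stationary in Table 1.3):

```python
def is_abnormally_stationary(observations, min_duration_s, speed_eps=0.1):
    """Judge persistent stillness from timestamped perception reports.

    observations: list of (timestamp_s, speed_mps) for one tracked
                  object, sorted by time; entries may come from
                  roadside sensors and/or passing vehicles' sensors.
    Flag the object as abnormally stationary if every observed speed
    is below `speed_eps` and the observations span at least
    `min_duration_s` seconds.
    """
    if len(observations) < 2:
        return False        # a single snapshot cannot establish persistence
    if any(speed > speed_eps for _, speed in observations):
        return False        # the object moved at some point
    return observations[-1][0] - observations[0][0] >= min_duration_s
```

For long-term parked "zombie cars", `min_duration_s` would be set to weeks; for an object stopped in a live lane, minutes or less would be appropriate.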
In one embodiment, a target object in an abnormally stationary state may be a vehicle in the environment 100, such as a vehicle parked for a long time at the roadside. Such long-term parked vehicles are sometimes also called "zombie cars". For such vehicles, the length of the period used to judge the abnormally stationary state may be set to a longer time, for example several weeks or months.
In one embodiment, after determining that a target object is in an abnormally stationary state, the roadside subsystem 112 or the on-board subsystem 132 of another vehicle 130 can generate a perception message, which may include all or part of the content listed in Tables 1.1 to 1.4 above. The generated perception message may include a description related to the target object, in particular including the abnormally stationary state of the target object. In some examples, the perception message may also specifically indicate a vehicle in a long-term stationary state (i.e., the target object) as a "zombie car".
Alternatively or additionally, the roadside subsystem 112 or the on-board subsystem 132 of another vehicle 130 may further generate a decision-planning message or a control message for a specific vehicle 130. The decision-planning message or control message may also indicate the abnormally stationary state of the target object; this can be achieved by planning the travel trajectory of the vehicle 130 or controlling the driving actions of the vehicle 130 relative to the abnormally stationary target object. The vehicle 130 can thereby be kept from traveling on a motion trajectory that might collide with the stationary target object.
Specifically, to avoid collision with the target object, the decision-planning message or control message can guide or control the moving direction, moving speed, or overall travel route of the vehicle 130. For example, if it is determined that a target object in an abnormally stationary state exists in one lane of the road, the decision-planning message or control message can guide the vehicle 130 to change lanes in advance to travel in another lane, avoiding collision with the target object. Without a decision-planning message or control message, the on-board subsystem 132 of the vehicle 130 can also use the information contained in the externally received perception message (for example, the description related to the target object) as input to decision planning and/or control, thereby determining how to perform driving planning and control, for example controlling the moving direction, speed, and route of the vehicle 130 to avoid collision with the stationary target object.
In one embodiment, after determining a target object in an abnormally stationary state, the roadside subsystem 112 or the on-board subsystem 132 of another vehicle 130 can broadcast the generated perception message; vehicles 130 within its communication range can receive such a perception message. In another embodiment, the decision-planning message and/or control message generated by the roadside subsystem 112 or the on-board subsystem 132 of another vehicle 130 may be specific to a particular vehicle 130 and can thus be transmitted to that vehicle 130 through an established communication connection.
Fig. 3E shows an example scenario 340 in which the roadside subsystem 112 detects target objects in an abnormally stationary state in the environment 100. As shown in Fig. 3E, the roadside device 110 obtains environment-related perception information from the roadside sensing devices 107 and/or the on-board sensing devices 236 of vehicles 130, in particular perception information related to the vehicles 130-2 and 130-3 in the environment. For example, a roadside sensing device 107 can periodically detect perception information within its perception range 102 and report it to the roadside subsystem 112 (e.g., its RSU 212). The roadside subsystem 112 (e.g., its RSU 212) can also obtain perception information collected by on-board sensing devices 236 from the on-board subsystems 132 of other vehicles 130 passing the vehicles 130-2 and 130-3. The roadside device 110 (e.g., the driving control module 214) can determine perception results from the raw perception information and record the perception results for the vehicles 130-2 and 130-3 at multiple time points, for example the position and speed of the vehicles 130-2 and 130-3 at each time point.
If it judges that the vehicles 130-2 and 130-3 have been stationary for a period of time, the roadside subsystem 112 (e.g., its driving control module 214) can determine that the vehicles 130-2 and 130-3 are in an abnormally stationary state and can generate driving-related messages (such as a perception message, and/or a decision-planning message and/or control message for a specific vehicle 130, for example the vehicle 130-1). The RSU 212 of the roadside subsystem 112 can broadcast the perception message, or can send the decision-planning message and/or control message to the on-board subsystem 132 of the specific vehicle 130-1. After receiving the driving-related message, the on-board subsystem 132 of the vehicle 130-1 can fuse it with the perception results detected by the vehicle 130-1, determine the travel strategy of the vehicle 130-1, and pass the travel strategy to the drive-by-wire system of the vehicle 130-1, implementing real-time control of the vehicle 130-1. For example, upon learning that the vehicles 130-2 and 130-3 are in an abnormally stationary state, the vehicle 130-1 can be controlled to change lanes in advance to travel in a lane where it will not collide with the vehicles 130-2 and 130-3.
In addition to target objects in an abnormally stationary state, target objects in an abnormally moving state may also exist in the environment 100, such as target objects whose moving direction conflicts with that of the vehicle 130 and target objects whose moving speed significantly mismatches that of the vehicle 130. In a traffic environment, target objects in an abnormally moving state may threaten the driving safety of the vehicle 130, so such target objects need to be detected promptly in order to make correct travel decisions and control for the vehicle 130. The judgment of target objects in an abnormally moving state may also be performed by the roadside subsystem 112 or by the on-board subsystem 132 on a certain vehicle 130.
In some embodiments, when determining whether a target object is in an abnormal motion state, the movement of the target object, including at least its moving direction and/or moving speed, can also be determined from the perception information related to the target object. In addition, the obtained perception information also includes perception information related to a reference object in the environment 100, and the movement of the reference object, including at least its moving direction and/or moving speed, is determined. The reference object is usually chosen as an object with a high probability of being in a normal moving state, such as a vehicle 130 in the environment 100. The choice of reference object depends on the motion state of the target object to be judged, as described below. By comparing the movement of the target object with the movement of the reference object, it can be determined whether the target object is in an abnormally moving state.
Abnormal moving states may include: a wrong-way state, in which the moving direction of the target object differs from that of most objects in the environment or does not conform to the moving direction prescribed by traffic rules; and a speed anomaly state, in which the difference between the moving speed of the target object and that of the reference object is excessive (for example, exceeds a speed-difference threshold). The speed anomaly state can be further divided into an abnormally slow state and an abnormally fast state: in the abnormally slow state, the moving speed of the target object is less than that of the reference object, while in the abnormally fast state, the moving speed of the target object is greater than that of the reference object.
In one embodiment, when determining whether the target object is in a wrong-way state, the moving direction of the target object can be compared with that of the reference object. In some examples, the reference object may be chosen as an object in the same carriageway as the target object; for example, the target object is a target vehicle, and the reference object is chosen as a reference vehicle in the same carriageway as the target vehicle (the reference object in this case is sometimes also called a "first reference object"). Here, a carriageway refers to the strip-shaped part of a road that allows vehicles to line up longitudinally and travel safely; it may consist of one or more lanes. A travel road may have a one-way carriageway, in which all vehicles travel in roughly the same direction; it may also have two-way carriageways, in which the vehicles in the two carriageways travel in different directions. If it is detected that the moving direction of the target object is opposite to that of the reference object, it can be determined that the target object is in a wrong-way state. To accurately detect whether the target object is in a wrong-way state, the reference object may be an object with high confidence of traveling in the correct direction.
In one embodiment, when determining whether the target object is in a speed anomaly state, the moving speed of the target object is compared with that of the reference object. In some examples, the reference object used to judge a speed anomaly may be an object with the same expected moving direction as the target object; for example, the reference object and the target object may be a reference vehicle and a target vehicle in carriageways of the same direction. The reference object in this case is sometimes also called a "second reference object".
If it is determined that the difference between the moving speed of the target object and that of the reference object exceeds the speed-difference threshold, the target object is determined to be in a speed anomaly state. The speed-difference threshold may be set according to the relevant traffic rules of the road, or may be set based on safe distances between objects (especially vehicles). The reference object may be chosen as an object with a high probability of being in a normal moving or stationary state, such as a vehicle 130 in the environment 100 in the same carriageway as the target vehicle (the target object). Where the speed difference between the target object and the reference object exceeds the speed-difference threshold, if it is further determined that the moving speed of the target object is less than that of the reference object, the target object can be determined to be in an abnormally slow state; otherwise, if the moving speed of the target object is greater than that of the reference object, the target object can be determined to be in an abnormally fast state. On some roads, the travel speed of vehicles must meet specific speed requirements; vehicles exceeding the prescribed upper speed limit or falling below the lower speed limit may both threaten the safety of other vehicles. Moreover, in a moving state, if the speed difference between a vehicle and other vehicles or objects is too large, driving operations (such as deceleration, acceleration, steering, or changing the travel route) also need to be specially planned and controlled to avoid collisions.
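The wrong-way and speed-anomaly checks above can be combined into one classification against a reference vehicle in the same carriageway. In the sketch below, the 150-degree heading threshold and the 8 m/s speed-gap threshold are illustrative assumptions standing in for the disclosure's unspecified speed-difference threshold.

```python
def classify_motion(target_heading_deg, target_speed,
                    ref_heading_deg, ref_speed, speed_gap=8.0):
    """Classify a target's motion against a same-carriageway reference.

    Headings are in degrees, speeds in m/s. Returns one of
    "wrong_way", "abnormally_slow", "abnormally_fast", or "normal".
    """
    # Smallest absolute angle between the two headings:
    diff = abs(target_heading_deg - ref_heading_deg) % 360.0
    diff = min(diff, 360.0 - diff)
    if diff > 150.0:                       # near-opposite headings
        return "wrong_way"
    if abs(target_speed - ref_speed) > speed_gap:
        return "abnormally_slow" if target_speed < ref_speed else "abnormally_fast"
    return "normal"
```

A production system would first confirm the reference vehicle's own state with high confidence (per the "first/second reference object" discussion above) before trusting the comparison.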
在一个实施例中,路侧子系统112或其他交通工具130的车载子系统132在确定处于异常移动状态的目标物体之后,可以生成感知消息。感知消息可以包括与目标物体相关的描述,特别是包括目标物体的异常移动状态以及目标物体的其他方面的描述,诸如以上表1.1至表1.4中列出的全部或部分内容。在一些实施例中,目标物体相关的描述,诸如表1.1中目标物体的运动状态还可以特别指示目标物体的具体异常移动状态,诸如逆行状态、异常慢速状态、异常快速状态等。
备选地或附加地,路侧子系统112或其他交通工具130的车载子系统132可以进一步生成针对特定交通工具130的决策规划消息或控制消息。决策规划消息或控制消息也可以指示目标物体的异常运动状态。这可以通过相对于异常移动状态的目标物体来规划交通工具130的行驶轨迹或者控制交通工具130的驾驶动作来实现。由此,可以使交通工具130避免行驶在可能与正在异常移动的目标物体碰撞的运动轨迹上。
具体地,为了避免与目标物体碰撞,决策规划消息或控制消息可以引导或控制交通工具130的移动方向、移动速度、或者整体行驶路线。例如,如果确定在道路的一个车道上存在处于逆行状态的目标物体,决 策规划消息或控制消息可以引导交通工具130避免变换到那个车道上,或者从那个车道上变道。又例如,如果确定在相同车道上、在交通工具130的前方存在异常慢速状态的目标物体,策规划消息或控制消息可以引导交通工具130进行减速和/或变换车道等操作,以避免发生碰撞。或者,如果在相同车道上、在交通工具130的后方存在异常快速状态的目标物体,策规划消息或控制消息可以引导交通工具130进行加速和/或变换车道等操作。
在一个实施例中,路侧子系统112或其他交通工具130的车载子系统132在确定处于异常移动状态的目标物体之后,可以广播所生成的感知消息。处于路侧子系统112或其他交通工具130的车载子系统132的通信范围内的交通工具130可以接收到这样的感知消息。在另一实施例中,路侧子系统112或其他交通工具130的车载子系统132所生成的决策规划消息和/或控制消息可以是针对特定交通工具130的,并且由此可以通过已建立的通信连接被传送给特定交通工具130。
图3F示出了由路侧子系统112检测环境100中处于异常移动状态的目标物体的示例场景350。在该场景中,交通工具130-1至130-5正在同一方向的行车道上行驶。路侧传感设备107和/或交通工具130的车载传感设备236获得各自感知范围102和302内的感知信息,特别是与其中的各个交通工具相关的感知信息。路侧传感设备107和车载传感设备236可以将感知信息传递给路侧子系统112或某个交通工具130的车载子系统132。
路侧子系统112或车载子系统132(例如,驾驶控制模块214或驾驶控制234)可以基于感知信息确定交通工具130-3的移动方向与行车道上的其他交通工具(例如,交通工具130-2或交通工具130-5)的移动方向不同(相反),由此可以确定交通工具130-3处于逆行状态。路侧子系统112或车载子系统132(例如,RSU 212或RSU 232)可以广播感知消息,或者可以向特定交通工具130-1的车载子系统132发送决策规划消息和/或控制消息,以指示处于逆行状态的交通工具130-3。交通工具130-1的车载子系统132接收到驾驶相关消息后,可以将驾驶相关消息与交通工具130-1检测到的感知结果融合,确定交通工具130-1的行驶策略,以控制交通工具130-1的线控系统按行驶策略行驶。例如,交通工具130-1可以被控制为不向逆行的交通工具130-3所处的车道变道。
在一些情况下,路侧子系统112或交通工具130-2的车载子系统132(例如,驾驶控制模块214或驾驶控制234)可以基于感知信息确定交通工具130-3的移动速度小于交通工具130-2的移动速度,并且两者之间的差异过大(大于速度差阈值),由此可以确定交通工具130-3处于异常慢速状态。路侧子系统112或车载子系统132(例如,RSU 212或RSU 232)可以广播感知消息,或者可以向特定交通工具130-1的车载子系统132发送决策规划消息和/或控制消息,以指示处于异常慢速状态的交通工具130-3。交通工具130-1的车载子系统132接收到驾驶相关消息后,可以将驾驶相关消息与交通工具130-1检测到的感知结果融合,确定交通工具130-1的行驶策略,以控制交通工具130-1的线控系统按行驶策略行驶。例如,交通工具130-1可以被控制为提前降低速度和/或变换车道。
示例方法
图4A示出了根据本公开实施例的协助驾驶控制的方法400的流程图。方法400可以被实现在图1和图2中的路侧子系统112处。应当理解,虽然以特定顺序示出,方法400中的一些步骤可以以与所示出的不同顺序或者以并行方式执行。本公开的实施例在此方面不受限制。
在410,获取与交通工具所处的环境相关的感知信息,感知信息至少包括与环境中存在的物体相关的信息。在420,至少基于感知信息,生成针对交通工具的驾驶相关消息,驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个。在430,向交通工具提供驾驶相关消息,以用于交通工具的驾驶控制。
在一些实施例中,获取感知信息包括获取以下至少一项:由被布置在环境中并且独立于交通工具的传感设备感测到的路侧感知信息;由与交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
在一些实施例中,感知消息包括以下至少一项:与环境中存在的目标物体相关的描述,以指示目标物体的类型、位置信息、速度信息、方向信息、物理外观的描述信息、历史轨迹和预测轨迹中的至少一项;与环境中的道路的物理状况相关的信息,以指示道路的路面物理状况和道路的结构化信息中的至少一项;与环境中的交通设施相关的信息,以指示道路上的信号灯状态和交通标志中的至少一项;与环境中的道路交通状况相关的信息,以指示道路和/或道路中的车道的标志、交通流量和交通事件中的至少一项;与环境中的气象状况相关的信息;与交通工具的定位相关的辅助信息;对交通工具的故障的诊断信息;与交通工具中的软件系统的空中升级相关的信息;地图信息,指示地图的标识、地图的更新方式、要更新地图的区域和位置信息中的至少一项;与交通工具的停泊区域相关的信息;以及时间信息。
在一些实施例中,决策规划消息包括以下至少一项:与要应用决策的行驶道路的指示,决策规划的起始时间信息和/或终止时间信息,决策规划的起始位置信息和/或终点位置信息,决策规划所针对的交通工具的标识,与交通工具的驾驶行为相关的决策信息,与交通工具的驾驶动作相关的决策信息,路径规划轨迹点的信息,到达路径规划轨迹点的预期时间,与决策规划所涉及的其他交通工具相关的信息,地图信息,指示地图的标识、地图的更新方式、要更新地图的区域和位置信息中的至少一项,以及时间信息。
在一些实施例中,控制消息包括以下至少一项:对交通工具的运动相关的运动学控制信息,与交通工具的动力系统、传动系统、制动系统和转向系统中的至少一项相关的动力学控制信息,与交通工具中的乘坐者的乘坐体验相关的控制信息,与交通工具的交通警示系统相关的控制信息,以及时间信息。
在一些实施例中,生成驾驶相关消息包括:响应于检测到交通工具所使用的地图信息要更新和/或响应于获取来自交通工具的地图更新请求消息,生成包括地图信息的感知消息和决策规划消息中的至少一项,其中地图更新请求消息指示以下至少一项:时间信息、交通工具的标识信息、请求更新地图的区域、以及地图更新的方式。
在一些实施例中,生成驾驶相关消息还包括:获取交通工具的实时运行信息、辅助规划信息、车身信息以及特殊交通工具指示信息中的至少一项;以及还基于所获取的信息,生成决策规划消息和控制消息中的至少一项。
在一些实施例中,实时运行信息包括交通工具的位置信息、行驶方向信息、行驶路线信息、行驶速度信息、操作状态信息以及部件状态信息中的至少一项。在一些实施例中,辅助规划信息包括交通工具的行驶意图的指示、计划行驶路线信息以及速度限制信息中的至少一项。在一些实施例中,车身信息包括交通工具的标识、类型、描述性信息、当前行驶路线信息以及故障相关信息中的至少一项。在一些实施例中,特殊交通工具指示信息包括特殊交通工具类型和路权要求等级中的至少一项。
在一些实施例中,提供驾驶相关消息包括:响应于基于时间的触发、基于位置的触发和基于事件的触发中的至少一个,向交通工具提供驾驶相关消息。
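三类触发条件的组合判断可以示意如下(其中的时间周期、触发位置与半径等参数均为假设取值,仅用于说明任一触发满足即下发消息的逻辑):

```python
def should_provide_message(now_s: float, ego_position_m: float,
                           pending_events: list, cfg: dict) -> bool:
    """基于时间的触发、基于位置的触发和基于事件的触发中任一满足,即向交通工具提供驾驶相关消息。"""
    time_trigger = now_s - cfg["last_sent_s"] >= cfg["period_s"]          # 周期性下发
    position_trigger = abs(ego_position_m - cfg["trigger_position_m"]) <= cfg["radius_m"]  # 进入触发区域
    event_trigger = len(pending_events) > 0                                # 存在待通知事件
    return time_trigger or position_trigger or event_trigger
```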
在一些实施例中,提供驾驶相关消息包括:接收针对交通工具的请求接管消息,请求接管消息指示请求至少部分接管对交通工具的驾驶控制;以及响应于请求接管消息,向交通工具提供决策规划消息和控制消息中的至少一个。
在一些实施例中,请求接管消息指示以下至少一项:交通工具的标识信息、交通工具的行驶计划信息,请求接管的原因以及时间信息。
在一些实施例中,方法400还包括:响应于确定要至少部分接管对交通工具的驾驶控制,向交通工具提供接管通知消息,接管通知消息指示以下至少一项:接管的开始时间、接管的结束时间、接管的类型、接管的原因以及接管的策略。
在一些实施例中,方法400还包括:响应于确定对交通工具的驾驶控制需要另一交通工具的协作,基于感知信息来生成另一决策规划消息和另一控制消息中的至少一个;以及向另一交通工具提供所生成的另一决策规划消息和另一控制消息中的至少一个,以用于另一交通工具的驾驶控制。
在一些实施例中,生成驾驶相关消息包括:确定与交通工具相关联的远端执行机构上的远端执行操作;以及基于与交通工具的本地执行机构相关的参数,将远端执行操作转换为控制消息,以控制本地执行机构执行与远端执行操作相同的操作。
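一种可能的换算方式可以示意如下(仅以转向角为例,传动比参数为假设值,并非本公开规定的具体换算方法):

```python
def convert_remote_steering(remote_wheel_angle_deg: float,
                            remote_steering_ratio: float,
                            local_steering_ratio: float) -> float:
    """将远端执行机构(如远程驾驶舱方向盘)的转角换算为本地执行机构的等效转向指令。

    先将远端方向盘转角按远端传动比换算为期望的车轮转角,
    再按本地传动比换算为本地方向盘指令,
    从而使本地执行机构执行与远端执行操作相同的操作。
    """
    road_wheel_angle_deg = remote_wheel_angle_deg / remote_steering_ratio
    return road_wheel_angle_deg * local_steering_ratio
```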
图4B示出了根据本公开实施例的协助驾驶控制的方法402的流程图。方法402可以被实现在图1和图2中的路侧子系统112处。应当理解,虽然以特定顺序示出,方法402中的一些步骤可以以与所示出的不同顺序或者以并行方式执行。本公开的实施例在此方面不受限制。
在框480,获取与交通工具所处的环境相关的感知信息。在框485,基于感知信息确定与环境中存在的目标物体相关的描述,目标物体位于交通工具外部。在框490,至少基于与目标物体相关的描述来生成驾驶相关消息,驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个。在框495,向交通工具提供驾驶相关消息,以用于相对于目标物体来控制交通工具的行驶。
在一些实施例中,获取感知信息包括获取以下至少一项:由被布置在环境中并且独立于交通工具的传感设备感测到的路侧感知信息;由与交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
在一些实施例中,生成驾驶相关消息包括:生成感知消息以至少包括目标物体相关的描述,包括目标物体的以下各项中的至少一项:分类,运动状态,运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
在一些实施例中,生成驾驶相关消息还包括:生成感知消息以包括交通工具所处的道路的交通状况信息,交通状况信息包括以下至少一项:道路的一个路段上的交通参与者密度,路段上的交通参与者的平均速度,路段的车道级别拥堵状况描述,路段的起始位置,以及当前交通状况的持续时间。
在一些实施例中,驾驶相关消息特定于环境中的车道,并且向交通工具提供驾驶相关消息包括:广播驾驶相关消息,以使处于车道上的交通工具获得驾驶相关消息,或者响应于确定交通工具处于车道上,朝向交通工具发送驾驶相关消息。
在一些实施例中,驾驶相关消息包括决策规划消息,决策规划消息包括以下中的至少一项:车道的标识,车道的路权信息,针对交通工具的控制的起始时间,以及针对交通工具的控制的起始位置。
在一些实施例中,驾驶相关消息特定于交通工具。
在一些实施例中,提供驾驶相关消息包括:接收针对交通工具的请求接管消息,请求接管消息指示请求至少部分接管对交通工具的驾驶控制;向交通工具提供接管响应消息,以指示是否至少部分接管对交通工具的驾驶控制;以及响应于确定至少部分接管对交通工具的驾驶控制,向交通工具提供决策规划消息和控制消息中的至少一个。
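上述接管握手流程可以概括为如下示意性草图(消息字段名为假设,并非本公开定义的正式消息格式):

```python
def handle_takeover_request(request: dict, can_take_over: bool, now_s: float) -> dict:
    """由外部设备(例如路侧子系统)处理请求接管消息并生成接管响应消息。"""
    response = {
        "vehicle_id": request["vehicle_id"],  # 回应请求方交通工具的标识信息
        "take_over": can_take_over,           # 是否至少部分接管对交通工具的驾驶控制
    }
    if can_take_over:
        response["start_time_s"] = now_s + 1.0  # 接管交通工具的起始时间(示例取值)
    return response
```

在确定接管之后,外部设备可以进一步向该交通工具提供决策规划消息和/或控制消息。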
在一些实施例中,请求接管消息指示以下至少一项:交通工具的标识信息、交通工具的类型、交通工具的位置信息、交通工具的行驶计划信息,请求接管的原因以及时间信息,其中行驶计划信息包括以下中的至少一项:交通工具的行驶方向、行驶目的地、计划行驶路线、所允许的最大速度和所允许的最大加速度。在一些实施例中,接管响应消息指示以下至少一项:是否至少部分接管交通工具和接管交通工具的起始时间。
图5A示出了根据本公开实施例的用于驾驶控制的方法500的流程图。方法500可以被实现在图1和图2中的车载子系统132处。应当理解,虽然以特定顺序示出,方法500中的一些步骤可以以与所示出的不同顺序或者以并行方式执行。本公开的实施例在此方面不受限制。
在510,从交通工具的外部设备接收驾驶相关消息,驾驶相关消息基于与交通工具所处的环境相关的感知信息生成,并且包括感知消息、决策规划消息和控制消息中的至少一个。在520,基于驾驶相关消息来控制交通工具在环境中的驾驶。
在一些实施例中,方法500还包括:向外部设备提供由与交通工具相关联地布置的传感设备感测到的交通工具感知信息,以作为感知信息的至少一部分。
在一些实施例中,方法500还包括:向外部设备提供交通工具的实时运行信息、辅助规划信息、车身信息以及特殊交通工具指示信息中的至少一个,以用于决策规划消息和控制消息中的至少一项的生成。
在一些实施例中,接收驾驶相关消息包括:向外部设备提供针对交通工具的请求接管消息,请求接管消息指示请求至少部分接管对交通工具的驾驶控制;以及响应于请求接管消息,接收决策规划消息和控制消息中的至少一个。在一些实施例中,请求接管消息指示以下至少一项:交通工具的标识信息、交通工具的行驶计划信息,请求接管的原因以及时间信息。
在一些实施例中,方法500还包括:从外部设备接收针对请求接管消息的接管通知消息,接管通知消息指示以下至少一项:接管的开始时间、接管的结束时间、接管的类型、接管的原因以及接管的策略。
在一些实施例中,接收驾驶相关消息包括:向外部设备提供对交通工具的远程驾驶请求消息;以及响应于远程驾驶请求消息来接收控制消息,控制消息包括由交通工具的本地执行机构可执行的控制指令,控制消息通过基于与本地执行机构相关的参数将远端执行机构上的执行操作转换而得到。
图5B示出了根据本公开实施例的用于驾驶控制的方法502的流程图。方法502可以被实现在图1和图2中的车载子系统132处。应当理解,虽然以特定顺序示出,方法502中的一些步骤可以以与所示出的不同顺序或者以并行方式执行。本公开的实施例在此方面不受限制。
在框560,从交通工具的外部设备接收驾驶相关消息,驾驶相关消息基于与交通工具所处的环境相关的感知信息生成并且指示与交通工具外部的目标物体相关的描述,并且驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个。在框570,基于驾驶相关消息,相对于目标物体来控制交通工具在环境中的驾驶。
在一些实施例中,感知信息包括以下中的至少一项:由被布置在环境中并且独立于交通工具的传感设备感测到的路侧感知信息;由与交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
在一些实施例中,接收驾驶相关消息包括:接收感知消息以至少包括目标物体相关的描述,包括目标物体的以下各项中的至少一项:分类,运动状态,运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
在一些实施例中,接收驾驶相关消息还包括:接收感知消息以包括交通工具所处的道路的交通状况信息,交通状况信息包括以下至少一项:道路的一个路段上的交通参与者密度,路段上的交通参与者的平均速度,路段的车道级别拥堵状况描述,路段的起始位置,以及当前交通状况的持续时间。
在一些实施例中,驾驶相关消息特定于交通工具所处的车道,并且驾驶相关消息包括决策规划消息,决策规划消息包括以下中的至少一项:车道的标识,车道的路权信息,针对交通工具的控制的起始时间,以及针对交通工具的控制的起始位置。
在一些实施例中,驾驶相关消息特定于交通工具。
在一些实施例中,方法502还包括:向至少另一个交通工具发送请求脱困消息,请求脱困消息包括以下至少一项:交通工具的标识信息,交通工具的类型,交通工具的位置信息,行驶方向信息,行驶路线信息,行驶速度信息,以及影响交通工具的行驶的周围交通工具的位置信息;以及从另一交通工具接收脱困响应消息,脱困响应消息包括以下至少一项:另一交通工具的标识信息,另一交通工具是否避让交通工具的指示,另一交通工具的规划路径,以及另一交通工具的规划速度。
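请求脱困与脱困响应的消息交互可以示意如下(字段名与示例值均为假设):

```python
def respond_to_escape_request(request: dict, own_id: str, can_yield: bool) -> dict:
    """另一交通工具针对请求脱困消息生成脱困响应消息的示意逻辑。"""
    response = {
        "vehicle_id": own_id,     # 另一交通工具的标识信息
        "will_yield": can_yield,  # 是否避让请求脱困的交通工具
    }
    if can_yield:
        # 避让时附上另一交通工具的规划路径与规划速度(示例取值)
        response["planned_path"] = ["向右侧偏移0.5米", "减速让行"]
        response["planned_speed_mps"] = 5.0
    return response
```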
在一些实施例中,方法502还包括:向与另一交通工具相关联的车载子系统提供另一感知消息,该另一感知消息包括环境中的另一目标物体的描述,并且包括该另一目标物体的以下各项中的至少一项:分类,运动状态,运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
在一些实施例中,接收驾驶相关消息包括:向外部设备提供针对交通工具的请求接管消息,请求接管消息指示请求至少部分接管对交通工具的驾驶控制;响应于请求接管消息,接收决策规划消息和控制消息中的至少一个;以及其中请求接管消息指示以下至少一项:交通工具的标识信息、交通工具的类型、交通工具的位置信息、交通工具的行驶计划信息,请求接管的原因以及时间信息,其中行驶计划信息包括以下中的至少一项:交通工具的行驶方向、行驶目的地、计划行驶路线、所允许的最大速度和所允许的最大加速度。
示例装置和设备
图6A示出了根据本公开实施例的用于协助驾驶控制的装置600的示意性框图。装置600可以被包括在图1和图2中的路侧子系统112中或者被实现为路侧子系统112。如图6A所示,装置600包括感知获取模块610,被配置为由交通工具的外部设备获取与交通工具所处的环境相关的感知信息,感知信息至少包括与环境中存在的物体相关的信息;消息生成模块620,被配置为至少基于感知信息,生成针对交通工具的驾驶相关消息,驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个;以及消息供应模块630,被配置为向交通工具提供驾驶相关消息,以用于交通工具的驾驶控制。
在一些实施例中,感知获取模块被配置为获取以下至少一项:由被布置在环境中并且独立于交通工具的传感设备感测到的路侧感知信息;由与交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
在一些实施例中,消息生成模块包括:基于地图更新的消息生成模块,被配置为响应于检测到交通工具所使用的地图信息要更新和/或响应于获取来自交通工具的地图更新请求消息,生成包括地图信息的感知消息和决策规划消息中的至少一项。在一些实施例中,地图更新请求消息指示以下至少一项:时间信息、交通工具的标识信息、请求更新地图的区域、以及地图更新的方式。
在一些实施例中,消息生成模块还包括:辅助信息获取模块,被配置为获取交通工具的实时运行信息、辅助规划信息、车身信息以及特殊交通工具指示信息中的至少一项;以及基于辅助信息的消息生成模块,被配置为还基于所获取的信息,生成决策规划消息和控制消息中的至少一项。
在一些实施例中,消息供应模块包括:基于触发的消息供应模块,被配置为响应于基于时间的触发、基于位置的触发和基于事件的触发中的至少一个,向交通工具提供驾驶相关消息。
在一些实施例中,消息供应模块包括:接管请求接收模块,被配置为接收针对交通工具的请求接管消息,请求接管消息指示请求至少部分接管对交通工具的驾驶控制;以及基于接管请求的消息供应模块,被配置为响应于请求接管消息,向交通工具提供决策规划消息和控制消息中的至少一个。
在一些实施例中,装置600还包括:通知供应模块,被配置为响应于确定要至少部分接管对交通工具的驾驶控制,向交通工具提供接管通知消息,接管通知消息指示以下至少一项:接管的开始时间、接管的结束时间、接管的类型、接管的原因以及接管的策略。
在一些实施例中,装置600还包括:另一消息生成模块,被配置为响应于确定对交通工具的驾驶控制需要另一交通工具的协作,基于感知信息来生成另一决策规划消息和另一控制消息中的至少一个;以及另一消息供应模块,被配置为向另一交通工具提供所生成的另一决策规划消息和另一控制消息中的至少一个,以用于另一交通工具的驾驶控制。
在一些实施例中,消息生成模块包括:操作确定模块,被配置为确定与交通工具相关联的远端执行机构上的远端执行操作;基于转换的消息生成模块,被配置为还基于与交通工具的本地执行机构相关的参数,将远端执行操作转换为控制消息,以控制本地执行机构执行与远端执行操作相同的操作。
图6B示出了根据本公开实施例的用于协助驾驶控制的装置602的示意性框图。装置602可以被包括在图1和图2中的路侧子系统112中或者被实现为路侧子系统112。如图6B所示,装置602包括感知获取模块680,被配置为由交通工具的外部设备获取与交通工具所处的环境相关的感知信息;目标物确定模块685,被配置为基于感知信息确定与环境中存在的目标物体相关的描述,目标物体位于交通工具外部;消息生成模块690,被配置为至少基于与目标物体相关的描述来生成驾驶相关消息,驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个;以及消息供应模块695,被配置为向交通工具提供驾驶相关消息,以用于相对于目标物体来控制交通工具的行驶。
在一些实施例中,感知获取模块被配置为获取以下至少一项:由被布置在环境中并且独立于交通工具的传感设备感测到的路侧感知信息;由与交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
在一些实施例中,消息生成模块包括:第一感知消息生成模块,被配置为生成感知消息以至少包括目标物体相关的描述,包括目标物体的以下各项中的至少一项:分类,运动状态,运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
在一些实施例中,消息生成模块包括:第二感知消息生成模块,被配置为生成感知消息以包括交通工具所处的道路的交通状况信息,交通状况信息包括以下至少一项:道路的一个路段上的交通参与者密度,路段上的交通参与者的平均速度,路段的车道级别拥堵状况描述,路段的起始位置,以及当前交通状况的持续时间。
在一些实施例中,驾驶相关消息特定于环境中的车道,并且消息供应模块包括:消息广播模块,被配置为广播驾驶相关消息,以使处于车道上的交通工具获得驾驶相关消息;或者消息发送模块,被配置为响应于确定交通工具处于车道上,朝向交通工具发送驾驶相关消息。
在一些实施例中,驾驶相关消息包括决策规划消息,决策规划消息包括以下中的至少一项:车道的标识,车道的路权信息,针对交通工具的控制的起始时间,以及针对交通工具的控制的起始位置。
在一些实施例中,驾驶相关消息特定于交通工具。
在一些实施例中,消息供应模块包括:请求接收模块,被配置为接收针对交通工具的请求接管消息,请求接管消息指示请求至少部分接管对交通工具的驾驶控制;响应供应模块,被配置为向交通工具提供接管响应消息,以指示是否至少部分接管对交通工具的驾驶控制;以及决策规划和/或控制消息供应模块,被配置为响应于确定至少部分接管对交通工具的驾驶控制,向交通工具提供决策规划消息和控制消息中的至少一个。
在一些实施例中,请求接管消息指示以下至少一项:交通工具的标识信息、交通工具的类型、交通工具的位置信息、交通工具的行驶计划信息,请求接管的原因以及时间信息,其中行驶计划信息包括以下中的至少一项:交通工具的行驶方向、行驶目的地、计划行驶路线、所允许的最大速度和所允许的最大加速度。在一些实施例中,接管响应消息指示以下至少一项:是否至少部分接管交通工具和接管交通工具的起始时间。
图7A示出了根据本公开实施例的用于驾驶控制的装置700的示意性框图。装置700可以被包括在图1和图2中的车载子系统132中或者被实现为车载子系统132。如图7A所示,装置700包括消息接收模块710,被配置为从交通工具的外部设备接收驾驶相关消息,驾驶相关消息基于与交通工具所处的环境相关的感知信息生成,并且包括感知消息、决策规划消息和控制消息中的至少一个;以及驾驶相关控制模块720,被配置为基于驾驶相关消息来控制交通工具在环境中的驾驶。
在一些实施例中,装置700还包括:感知供应模块,被配置为向外部设备提供由与交通工具相关联地布置的传感设备感测到的交通工具感知信息,以作为感知信息的至少一部分。
在一些实施例中,装置700还包括:辅助信息供应模块,被配置为向外部设备提供交通工具的实时运行信息、辅助规划信息、车身信息以及特殊交通工具指示信息中的至少一个,以用于决策规划消息和控制消息中的至少一项的生成。
在一些实施例中,消息接收模块包括:请求接管供应模块,被配置为向外部设备提供针对交通工具的请求接管消息,请求接管消息指示请求至少部分接管对交通工具的驾驶控制;以及基于接管的消息接收模块,被配置为响应于请求接管消息,接收决策规划消息和控制消息中的至少一个。请求接管消息指示以下至少一项:交通工具的标识信息、交通工具的行驶计划信息,请求接管的原因以及时间信息。
在一些实施例中,装置700还包括:通知接收模块,被配置为从外部设备接收针对请求接管消息的接管通知消息,接管通知消息指示以下至少一项:接管的开始时间、接管的结束时间、接管的类型、接管的原因以及接管的策略。
在一些实施例中,消息接收模块包括:远程驾驶请求模块,被配置为向外部设备提供对交通工具的远程驾驶请求消息;以及控制消息接收模块,被配置为响应于远程驾驶请求消息来接收控制消息,控制消息包括由交通工具的本地执行机构可执行的控制指令,控制消息通过基于与本地执行机构相关的参数将远端执行机构上的执行操作转换而得到。
图7B示出了根据本公开实施例的用于驾驶控制的装置702的示意性框图。装置702可以被包括在图1和图2中的车载子系统132中或者被实现为车载子系统132。如图7B所示,装置702包括消息接收模块750,被配置为从交通工具的外部设备接收驾驶相关消息,驾驶相关消息基于与交通工具所处的环境相关的感知信息生成并且指示与交通工具外部的目标物体相关的描述,并且驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个;以及驾驶控制模块760,被配置为基于驾驶相关消息,相对于目标物体来控制交通工具在环境中的驾驶。
在一些实施例中,感知信息包括以下中的至少一项:由被布置在环境中并且独立于交通工具的传感设备感测到的路侧感知信息;由与交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
在一些实施例中,消息接收模块包括:第一感知消息接收模块,被配置为接收感知消息以至少包括目标物体相关的描述,包括目标物体的以下各项中的至少一项:分类,运动状态,运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
在一些实施例中,消息接收模块包括:第二感知消息接收模块,被配置为接收感知消息以包括交通工具所处的道路的交通状况信息,交通状况信息包括以下至少一项:道路的一个路段上的交通参与者密度,路段上的交通参与者的平均速度,路段的车道级别拥堵状况描述,路段的起始位置,以及当前交通状况的持续时间。
在一些实施例中,驾驶相关消息特定于交通工具所处的车道,并且驾驶相关消息包括决策规划消息,决策规划消息包括以下中的至少一项:车道的标识,车道的路权信息,针对交通工具的控制的起始时间,以及针对交通工具的控制的起始位置。
在一些实施例中,驾驶相关消息特定于交通工具。
在一些实施例中,装置702还包括:请求脱困模块,被配置为向至少另一个交通工具发送请求脱困消息,请求脱困消息包括以下至少一项:交通工具的标识信息,交通工具的类型,交通工具的位置信息,行驶方向信息,行驶路线信息,行驶速度信息,以及影响交通工具的行驶的周围交通工具的位置信息;以及响应接收模块,被配置为从另一交通工具接收脱困响应消息,脱困响应消息包括以下至少一项:另一交通工具的标识信息,另一交通工具是否避让交通工具的指示,另一交通工具的规划路径,以及另一交通工具的规划速度。
在一些实施例中,装置702还包括:感知消息供应模块,被配置为向与另一交通工具相关联的车载子系统提供另一感知消息,该另一感知消息包括环境中的另一目标物体的描述,并且包括该另一目标物体的以下各项中的至少一项:分类,运动状态,运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
在一些实施例中,消息接收模块包括:请求接管模块,被配置为向外部设备提供针对交通工具的请求接管消息,请求接管消息指示请求至少部分接管对交通工具的驾驶控制;以及决策规划消息和/或控制消息接收模块,被配置为响应于请求接管消息,接收决策规划消息和控制消息中的至少一个。在一些实施例中,请求接管消息指示以下至少一项:交通工具的标识信息、交通工具的类型、交通工具的位置信息、交通工具的行驶计划信息,请求接管的原因以及时间信息,其中行驶计划信息包括以下中的至少一项:交通工具的行驶方向、行驶目的地、计划行驶路线、所允许的最大速度和所允许的最大加速度。
图8示出了可以用来实施本公开的实施例的示例设备800的示意性框图。设备800可以用于实现图1和图2的路侧子系统112或车载子系统132。如图所示,设备800包括计算单元801,其可以根据存储在只读存储器(ROM)802中的计算机程序指令或者从存储单元808加载到随机访问存储器(RAM)803中的计算机程序指令,来执行各种适当的动作和处理。在RAM 803中,还可存储设备800操作所需的各种程序和数据。计算单元801、ROM 802以及RAM 803通过总线804彼此相连。输入/输出(I/O)接口805也连接至总线804。
设备800中的多个部件连接至I/O接口805,包括:输入单元806,例如键盘、鼠标等;输出单元807,例如各种类型的显示器、扬声器等;存储单元808,例如磁盘、光盘等;以及通信单元809,例如网卡、调制解调器、无线通信收发机等。通信单元809允许设备800通过诸如因特网的计算机网络和/或各种电信网络与其他设备交换信息/数据。
计算单元801可以是各种具有处理和计算能力的通用和/或专用处理组件。计算单元801的一些示例包括但不限于中央处理单元(CPU)、图形处理单元(GPU)、各种专用的人工智能(AI)计算芯片、各种运行机器学习模型算法的计算单元、数字信号处理器(DSP)、以及任何适当的处理器、控制器、微控制器等。计算单元801执行上文所描述的各个方法和处理,例如方法400或方法500。例如,在一些实施例中,方法400或方法500可被实现为计算机软件程序,其被有形地包含于计算机可读介质,例如存储单元808。在一些实施例中,计算机程序的部分或者全部可以经由ROM 802和/或通信单元809而被载入和/或安装到设备800上。当计算机程序加载到RAM 803并由计算单元801执行时,可以执行上文描述的方法400或方法500的一个或多个步骤。备选地,在其他实施例中,计算单元801可以通过其他任何适当的方式(例如,借助于固件)而被配置为执行方法400或方法500。
本文中以上描述的功能可以至少部分地由一个或多个硬件逻辑部件来执行。例如,非限制性地,可以使用的示范类型的硬件逻辑部件包括:场可编程门阵列(FPGA)、专用集成电路(ASIC)、专用标准产品(ASSP)、芯片上系统的系统(SOC)、负载可编程逻辑设备(CPLD)等等。
用于实施本公开的方法的程序代码可以采用一个或多个编程语言的任何组合来编写。这些程序代码可以提供给通用计算机、专用计算机或其他可编程数据处理装置的处理器或控制器,使得程序代码当由处理器或控制器执行时使流程图和/或框图中所规定的功能/操作被实施。程序代码可以完全在机器上执行、部分地在机器上执行,作为独立软件包部分地在机器上执行且部分地在远程机器上执行或完全在远程机器或服务器上执行。
在本公开的上下文中,计算机可读介质可以是有形的介质,其可以包含或存储以供指令执行系统、装置或设备使用或与指令执行系统、装置或设备结合地使用的程序。计算机可读介质可以是机器可读信号介质或机器可读储存介质。计算机可读介质可以包括但不限于电子的、磁性的、光学的、电磁的、红外的、或半导体系统、装置或设备,或者上述内容的任何合适组合。机器可读存储介质的更具体示例会包括基于一个或多个线的电气连接、便携式计算机盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦除可编程只读存储器(EPROM或快闪存储器)、光纤、便捷式紧凑盘只读存储器(CD-ROM)、光学储存设备、磁储存设备、或上述内容的任何合适组合。
此外,虽然采用特定次序描绘了各操作,但是这不应当理解为要求这样的操作以所示出的特定次序或以顺序次序执行,或者要求所有图示的操作均应被执行,以取得期望的结果。在一定环境下,多任务和并行处理可能是有利的。同样地,虽然在上面论述中包含了若干具体实现细节,但是这些不应当被解释为对本公开的范围的限制。在单独的实施例的上下文中描述的某些特征还可以组合地实现在单个实现中。相反地,在单个实现的上下文中描述的各种特征也可以单独地或以任何合适的子组合的方式实现在多个实现中。
尽管已经采用特定于结构特征和/或方法逻辑动作的语言描述了本主题,但是应当理解所附权利要求书中所限定的主题未必局限于上面描述的特定特征或动作。相反,上面所描述的特定特征和动作仅仅是实现权利要求书的示例形式。

Claims (41)

  1. 一种协助驾驶控制的方法,包括:
    由交通工具的外部设备获取与所述交通工具所处的环境相关的感知信息;
    基于所述感知信息确定与所述环境中存在的目标物体相关的描述,所述目标物体位于所述交通工具外部;
    至少基于与所述目标物体相关的所述描述来生成驾驶相关消息,所述驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个;以及
    向所述交通工具提供所述驾驶相关消息,以用于相对于所述目标物体来控制所述交通工具的行驶。
  2. 根据权利要求1所述的方法,其中获取所述感知信息包括获取以下至少一项:
    由被布置在所述环境中并且独立于交通工具的传感设备感测到的路侧感知信息;
    由与所述交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及
    由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
  3. 根据权利要求1所述的方法,其中生成所述驾驶相关消息包括:
    生成所述感知消息以至少包括所述目标物体相关的所述描述,包括所述目标物体的以下各项中的至少一项:分类,运动状态,所述运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
  4. 根据权利要求1所述的方法,其中生成所述驾驶相关消息还包括:
    生成所述感知消息以包括所述交通工具所处的道路的交通状况信息,所述交通状况信息包括以下至少一项:所述道路的一个路段上的交通参与者密度,所述路段上的所述交通参与者的平均速度,所述路段的车道级别拥堵状况描述,所述路段的起始位置,以及当前交通状况的持续时间。
  5. 根据权利要求1所述的方法,其中所述驾驶相关消息特定于所述环境中的车道,并且向所述交通工具提供所述驾驶相关消息包括:
    广播所述驾驶相关消息,以使处于所述车道上的所述交通工具获得所述驾驶相关消息;或者
    响应于确定所述交通工具处于所述车道上,朝向所述交通工具发送所述驾驶相关消息。
  6. 根据权利要求5所述的方法,其中所述驾驶相关消息包括所述决策规划消息,所述决策规划消息包括以下中的至少一项:所述车道的标识,所述车道的路权信息,针对所述交通工具的控制的起始时间,以及针对所述交通工具的控制的起始位置。
  7. 根据权利要求1所述的方法,其中所述驾驶相关消息特定于所述交通工具。
  8. 根据权利要求1所述的方法,其中提供所述驾驶相关消息包括:
    接收针对所述交通工具的请求接管消息,所述请求接管消息指示请求至少部分接管对所述交通工具的驾驶控制;
    向所述交通工具提供接管响应消息,以指示是否至少部分接管对所述交通工具的所述驾驶控制;以及
    响应于确定至少部分接管对所述交通工具的所述驾驶控制,向所述交通工具提供所述决策规划消息和所述控制消息中的至少一个。
  9. 根据权利要求8所述的方法,其中所述请求接管消息指示以下至少一项:所述交通工具的标识信息、所述交通工具的类型、所述交通工具的位置信息、所述交通工具的行驶计划信息,请求接管的原因以及时间信息,其中所述行驶计划信息包括以下中的至少一项:所述交通工具的行驶方向、行驶目的地、计划行驶路线、所允许的最大速度和所允许的最大加速度;并且其中
    所述接管响应消息指示以下至少一项:是否至少部分接管所述交通工具和接管所述交通工具的起始时间。
  10. 一种驾驶控制的方法,包括:
    从交通工具的外部设备接收驾驶相关消息,所述驾驶相关消息基于与所述交通工具所处的环境相关的感知信息生成并且指示与所述交通工具外部的目标物体相关的描述,并且所述驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个;以及
    基于所述驾驶相关消息,相对于所述目标物体来控制所述交通工具在所述环境中的驾驶。
  11. 根据权利要求10所述的方法,其中所述感知信息包括以下中的至少一项:
    由被布置在所述环境中并且独立于交通工具的传感设备感测到的路侧感知信息;
    由与所述交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及
    由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
  12. 根据权利要求10所述的方法,其中接收所述驾驶相关消息包括:
    接收所述感知消息以至少包括所述目标物体相关的所述描述,包括所述目标物体的以下各项中的至少一项:分类,运动状态,所述运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
  13. 根据权利要求10所述的方法,其中接收所述驾驶相关消息还包括:
    接收所述感知消息以包括所述交通工具所处的道路的交通状况信息,所述交通状况信息包括以下至少一项:所述道路的一个路段上的交通参与者密度,所述路段上的所述交通参与者的平均速度,所述路段的车道级别拥堵状况描述,所述路段的起始位置,以及当前交通状况的持续时间。
  14. 根据权利要求10所述的方法,其中所述驾驶相关消息特定于所述交通工具所处的车道,并且所述驾驶相关消息包括所述决策规划消息,所述决策规划消息包括以下中的至少一项:所述车道的标识,所述车道的路权信息,针对所述交通工具的控制的起始时间,以及针对所述交通工具的控制的起始位置。
  15. 根据权利要求10所述的方法,其中所述驾驶相关消息特定于所述交通工具。
  16. 根据权利要求10至15中任一项所述的方法,还包括:
    向至少另一个交通工具发送请求脱困消息,所述请求脱困消息包括以下至少一项:所述交通工具的标识信息,所述交通工具的类型,所述交通工具的位置信息,行驶方向信息,行驶路线信息,行驶速度信息,以及影响所述交通工具的行驶的周围交通工具的位置信息;以及
    从所述另一交通工具接收脱困响应消息,所述脱困响应消息包括以下至少一项:所述另一交通工具的标识信息,所述另一交通工具是否避让所述交通工具的指示,所述另一交通工具的规划路径,以及所述另一交通工具的规划速度。
  17. 根据权利要求10至15中任一项所述的方法,还包括:
    向与另一交通工具相关联的车载子系统提供另一感知消息,所述另一感知消息包括所述环境中的另一目标物体的描述,并且包括所述另一目标物体的以下各项中的至少一项:分类,运动状态,所述运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
  18. 根据权利要求10至15中任一项所述的方法,其中接收所述驾驶相关消息包括:
    向所述外部设备提供针对所述交通工具的请求接管消息,所述请求接管消息指示请求至少部分接管对所述交通工具的驾驶控制;以及
    响应于所述请求接管消息,接收所述决策规划消息和所述控制消息中的至少一个,并且
    其中所述请求接管消息指示以下至少一项:所述交通工具的标识信息、所述交通工具的类型、所述交通工具的位置信息、所述交通工具的行驶计划信息,请求接管的原因以及时间信息,其中所述行驶计划信息包括以下中的至少一项:所述交通工具的行驶方向、行驶目的地、计划行驶路线、所允许的最大速度和所允许的最大加速度。
  19. 一种协助驾驶控制的装置,包括:
    感知获取模块,被配置为由交通工具的外部设备获取与所述交通工具所处的环境相关的感知信息;
    目标物确定模块,被配置为基于所述感知信息确定与所述环境中存在的目标物体相关的描述,所述目标物体位于所述交通工具外部;
    消息生成模块,被配置为至少基于与所述目标物体相关的所述描述来生成驾驶相关消息,所述驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个;以及
    消息供应模块,被配置为向所述交通工具提供所述驾驶相关消息,以用于相对于所述目标物体来控制所述交通工具的行驶。
  20. 根据权利要求19所述的装置,其中所述感知获取模块被配置为获取以下至少一项:
    由被布置在所述环境中并且独立于交通工具的传感设备感测到的路侧感知信息;
    由与所述交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及
    由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
  21. 根据权利要求19所述的装置,其中所述消息生成模块包括:
    第一感知消息生成模块,被配置为生成所述感知消息以至少包括所述目标物体相关的所述描述,包括所述目标物体的以下各项中的至少一项:分类,运动状态,所述运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
  22. 根据权利要求19所述的装置,其中所述消息生成模块包括:
    第二感知消息生成模块,被配置为生成所述感知消息以包括所述交通工具所处的道路的交通状况信息,所述交通状况信息包括以下至少一项:所述道路的一个路段上的交通参与者密度,所述路段上的所述交通参与者的平均速度,所述路段的车道级别拥堵状况描述,所述路段的起始位置,以及当前交通状况的持续时间。
  23. 根据权利要求19所述的装置,其中所述驾驶相关消息特定于所述环境中的车道,并且所述消息供应模块包括:
    消息广播模块,被配置为广播所述驾驶相关消息,以使处于所述车道上的所述交通工具获得所述驾驶相关消息;或者
    消息发送模块,被配置为响应于确定所述交通工具处于所述车道上,朝向所述交通工具发送所述驾驶相关消息。
  24. 根据权利要求23所述的装置,其中所述驾驶相关消息包括所述决策规划消息,所述决策规划消息包括以下中的至少一项:所述车道的标识,所述车道的路权信息,针对所述交通工具的控制的起始时间,以及针对所述交通工具的控制的起始位置。
  25. 根据权利要求19所述的装置,其中所述驾驶相关消息特定于所述交通工具。
  26. 根据权利要求19所述的装置,其中所述消息供应模块包括:
    请求接收模块,被配置为接收针对所述交通工具的请求接管消息,所述请求接管消息指示请求至少部分接管对所述交通工具的驾驶控制;
    响应供应模块,被配置为向所述交通工具提供接管响应消息,以指示是否至少部分接管对所述交通工具的所述驾驶控制;以及
    决策规划和/或控制消息供应模块,被配置为响应于确定至少部分接管对所述交通工具的所述驾驶控制,向所述交通工具提供所述决策规划消息和所述控制消息中的至少一个。
  27. 根据权利要求26所述的装置,其中所述请求接管消息指示以下至少一项:所述交通工具的标识信息、所述交通工具的类型、所述交通工具的位置信息、所述交通工具的行驶计划信息,请求接管的原因以及时间信息,其中所述行驶计划信息包括以下中的至少一项:所述交通工具的行驶方向、行驶目的地、计划行驶路线、所允许的最大速度和所允许的最大加速度;并且其中
    所述接管响应消息指示以下至少一项:是否至少部分接管所述交通工具和接管所述交通工具的起始时间。
  28. 一种驾驶控制的装置,包括:
    消息接收模块,被配置为从交通工具的外部设备接收驾驶相关消息,所述驾驶相关消息基于与所述交通工具所处的环境相关的感知信息生成并且指示与所述交通工具外部的目标物体相关的描述,并且所述驾驶相关消息包括感知消息、决策规划消息和控制消息中的至少一个;以及
    驾驶控制模块,被配置为基于所述驾驶相关消息,相对于所述目标物体来控制所述交通工具在所述环境中的驾驶。
  29. 根据权利要求28所述的装置,其中所述感知信息包括以下中的至少一项:
    由被布置在所述环境中并且独立于交通工具的传感设备感测到的路侧感知信息;
    由与所述交通工具相关联地布置的传感设备感测到的交通工具感知信息;以及
    由与另一交通工具相关联地布置的传感设备感测到的交通工具感知信息。
  30. 根据权利要求28所述的装置,其中所述消息接收模块包括:
    第一感知消息接收模块,被配置为接收所述感知消息以至少包括所述目标物体相关的所述描述,包括所述目标物体的以下各项中的至少一项:分类,运动状态,所述运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
  31. 根据权利要求28所述的装置,其中所述消息接收模块包括:
    第二感知消息接收模块,被配置为接收所述感知消息以包括所述交通工具所处的道路的交通状况信息,所述交通状况信息包括以下至少一项:所述道路的一个路段上的交通参与者密度,所述路段上的所述交通参与者的平均速度,所述路段的车道级别拥堵状况描述,所述路段的起始位置,以及当前交通状况的持续时间。
  32. 根据权利要求28所述的装置,其中所述驾驶相关消息特定于所述交通工具所处的车道,并且所述驾驶相关消息包括所述决策规划消息,所述决策规划消息包括以下中的至少一项:所述车道的标识,所述车道的路权信息,针对所述交通工具的控制的起始时间,以及针对所述交通工具的控制的起始位置。
  33. 根据权利要求28所述的装置,其中所述驾驶相关消息特定于所述交通工具。
  34. 根据权利要求28至33中任一项所述的装置,还包括:
    请求脱困模块,被配置为向至少另一个交通工具发送请求脱困消息,所述请求脱困消息包括以下至少一项:所述交通工具的标识信息,所述交通工具的类型,所述交通工具的位置信息,行驶方向信息,行驶路线信息,行驶速度信息,以及影响所述交通工具的行驶的周围交通工具的位置信息;以及
    响应接收模块,被配置为从所述另一交通工具接收脱困响应消息,所述脱困响应消息包括以下至少一项:所述另一交通工具的标识信息,所述另一交通工具是否避让所述交通工具的指示,所述另一交通工具的规划路径,以及所述另一交通工具的规划速度。
  35. 根据权利要求28至33中任一项所述的装置,还包括:
    感知消息供应模块,被配置为向与另一交通工具相关联的车载子系统提供另一感知消息,所述另一感知消息包括所述环境中的另一目标物体的描述,并且包括所述另一目标物体的以下各项中的至少一项:分类,运动状态,所述运动状态的保持时间,物理外观的描述信息,类型,位置信息,移动速度,移动方向,加速度,加速度方向,历史运动轨迹信息,以及预测运动轨迹信息。
  36. 根据权利要求28至33中任一项所述的装置,其中所述消息接收模块包括:
    请求接管模块,被配置为向所述外部设备提供针对所述交通工具的请求接管消息,所述请求接管消息指示请求至少部分接管对所述交通工具的驾驶控制;以及
    决策规划消息和/或控制消息接收模块,被配置为响应于所述请求接管消息,接收所述决策规划消息和所述控制消息中的至少一个,并且
    其中所述请求接管消息指示以下至少一项:所述交通工具的标识信息、所述交通工具的类型、所述交通工具的位置信息、所述交通工具的行驶计划信息,请求接管的原因以及时间信息,其中所述行驶计划信息包括以下中的至少一项:所述交通工具的行驶方向、行驶目的地、计划行驶路线、所允许的最大速度和所允许的最大加速度。
  37. 一种电子设备,所述设备包括:
    一个或多个处理器;以及
    存储装置,用于存储一个或多个程序,当所述一个或多个程序被所述一个或多个处理器执行,使得所述一个或多个处理器实现如权利要求1至9中任一项所述的方法。
  38. 一种电子设备,所述设备包括:
    至少一个处理器;以及
    与所述至少一个处理器通信连接的存储器;其中,
    所述存储器存储有可被所述至少一个处理器执行的指令,所述指令被所述至少一个处理器执行,以使所述至少一个处理器能够执行权利要求10至18中任一项所述的方法。
  39. 一种存储有计算机指令的非瞬时计算机可读存储介质,所述计算机指令用于使所述计算机执行权利要求1至9中任一项所述的方法。
  40. 一种存储有计算机指令的非瞬时计算机可读存储介质,所述计算机指令用于使所述计算机执行权利要求10至18中任一项所述的方法。
  41. 一种用于协同驾驶控制的系统,包括:
    路侧子系统,包括根据权利要求19至27中任一项所述的装置;以及
    车载子系统,包括根据权利要求28至36中任一项所述的装置。
PCT/CN2019/103876 2019-02-13 2019-08-30 用于驾驶控制的方法、装置、设备、介质和系统 WO2020164238A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP19914783.6A EP3916696A4 (en) 2019-02-13 2019-08-30 METHOD, APPARATUS AND DRIVING CONTROL DEVICE, AND MEDIUM AND SYSTEM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/074989 WO2020164021A1 (zh) 2019-02-13 2019-02-13 用于驾驶控制的方法、装置、设备、介质和系统
CNPCT/CN2019/074989 2019-02-13

Publications (1)

Publication Number Publication Date
WO2020164238A1 true WO2020164238A1 (zh) 2020-08-20





