WO2022105395A1 - Data processing method, apparatus, and system, computer device, and non-transitory storage medium - Google Patents
- Publication number
- WO2022105395A1 (PCT/CN2021/118215)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- information
- path planning
- scenario data
- simulation
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
- B60W2050/0083—Setting, resetting, calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2119/00—Details relating to the type or aim of the analysis or the optimisation
- G06F2119/02—Reliability analysis or reliability optimisation; Failure analysis, e.g. worst case scenario performance, failure mode and effects analysis [FMEA]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
- G06F30/15—Vehicle, aircraft or watercraft design
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
Definitions
- This application relates to the field of computer technologies, and in particular, to a data processing method, apparatus, and system, a computer device, and a non-transitory storage medium.
- A self-driving vehicle, also referred to as an unmanned vehicle or a wheeled mobile robot, is an intelligent vehicle that is controlled by a computer device to implement unmanned driving.
- the cost and time required to debug an unmanned vehicle control system by driving in a real environment are high. Therefore, a control system or self-driving algorithm of an unmanned vehicle is usually debugged in advance based on a simulation scenario (that is, a scenario generated by a simulation system), so as to meet the rapid iteration requirements of debugging the self-driving algorithm.
- vehicle obstacles are manually designed and parameters such as speeds and attitudes are set, to simulate the road conditions of a real environment.
- the self-driving vehicle senses and responds to the manually designed vehicle obstacles, so as to intelligently plan a subsequent driving speed curve.
- noise interference is generally ignored for the manually designed vehicle obstacles.
- the simulation scenario cannot duplicate real road conditions, which negatively affects the accuracy of the self-driving algorithm and the intelligence of the self-driving vehicle.
- Embodiments of this application provide a data processing method, apparatus, and system, a computer device, and a non-transitory storage medium, to improve the accuracy of a self-driving algorithm and improve the intelligence of a self-driving vehicle.
- the technical solutions are as follows:
- a computer-implemented data processing method including:
- the performing path planning on the vehicle, to output simulated travel information of the vehicle includes:
- the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
- the scenario data includes the initial positions of the obstacles
- the performing path planning on the vehicle based on the simulation travel position includes:
- the scenario data includes the lane line information
- the performing path planning on the vehicle based on the simulation travel position includes:
- the scenario data includes the traffic light information
- the performing path planning on the vehicle based on the simulation travel position includes:
- the determining information about a traffic light having a shortest distance to the simulation travel position based on the simulation travel position of the vehicle comprises:
- the determining at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process includes:
- parsing the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp;
- the obtaining at least one time sequence corresponding to the at least one type of initial scenario data includes:
- a data processing apparatus including a processor configured to implement:
- a determining module configured to determine at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process
- a path planning module configured to input the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and perform path planning on the vehicle, to output simulated travel information of the vehicle;
- an adjustment module configured to perform parameter adjustment on the path planning model based on the simulated travel information.
- the path planning module includes:
- a determining unit configured to determine a simulation travel position of the vehicle based on the at least one type of scenario data
- a path planning unit configured to perform path planning on the vehicle based on the simulation travel position to obtain the simulated travel information of the vehicle.
- the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
- the scenario data includes the initial positions of the obstacles
- the path planning unit is configured to:
- the scenario data includes the lane line information
- the path planning unit is configured to:
- the scenario data includes the traffic light information
- the path planning unit includes:
- a determining subunit configured to determine, based on the simulation travel position of the vehicle, information about a traffic light having a shortest distance to the simulation travel position;
- a planning subunit configured to perform path planning on the vehicle based on the information about the traffic light.
- the determining subunit is configured to:
- the determining module includes:
- a parsing unit configured to parse the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp;
- an obtaining and determining unit configured to obtain at least one time sequence corresponding to the at least one type of initial scenario data, and determine the at least one time sequence as the at least one type of scenario data.
- the obtaining and determining unit is configured to:
- a data processing system including a vehicle and a computer device, where
- the vehicle is configured to acquire road test data in a travel process and send the road test data to the computer device;
- the computer device is configured to determine at least one type of scenario data based on the road test data acquired by the vehicle in the travel process, the scenario data being used for simulating a path planning model; input the at least one type of scenario data into the path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and perform path planning on the vehicle, to output simulated travel information of the vehicle; and perform parameter adjustment on the path planning model based on the simulated travel information.
- a computer device including one or more processors and a non-transitory computer readable storage medium storing machine readable instructions which are executable by the one or more processors to implement the data processing method according to any one of the foregoing possible implementations.
- a non-transitory storage medium storing machine readable instructions which are executable by a processor to implement the data processing method according to any one of the foregoing possible implementations.
- a computer program product or a computer program including machine readable instructions, the machine readable instructions being stored in a computer-readable storage medium.
- One or more processors of a computer device can read the machine readable instructions from the computer-readable storage medium, and the one or more processors execute the machine readable instructions to enable the computer device to perform the data processing method according to any one of the foregoing possible implementations.
- scenario data used for self-driving simulation is generated according to road test data acquired by a vehicle in an actual travel process
- the scenario data may more accurately reproduce actual road conditions and noise interference.
- the path planning model may access a more realistic simulation during debugging, which helps to produce a path planning model with higher accuracy, thereby improving the intelligence of the self-driving vehicle.
- FIG. 1 is a schematic diagram of an implementation environment of a data processing method according to an embodiment of this application;
- FIG. 2 is a flowchart of a data processing method according to an embodiment of this application.
- FIG. 3 is a flowchart of a data processing method according to an embodiment of this application.
- FIG. 4 is a schematic diagram of a logical structure of a path planning model according to an embodiment of this application.
- FIG. 5 is a schematic principle diagram of path planning according to an embodiment of this application.
- FIG. 6 is a schematic principle diagram of path planning according to an embodiment of this application.
- FIG. 7 is an effect diagram of comparison between an artificial scenario and a simulation scenario according to an embodiment of this application.
- FIG. 8 is a principle flowchart of a data processing method according to an embodiment of this application.
- FIG. 9 is a schematic structural diagram of a data processing apparatus according to an embodiment of this application.
- FIG. 10 is a structural block diagram of an on-board terminal 1000 according to an exemplary embodiment of this application.
- FIG. 11 is a schematic structural diagram of a computer device according to an embodiment of this application.
- The terms “first”, “second”, and the like in this application are used for distinguishing between same or similar items whose effects and functions are basically the same. It should be understood that “first”, “second”, and “nth” do not have a dependency relationship in logic or time sequence, and a quantity and an execution order thereof are not limited.
- a plurality of first positions means two or more first positions.
- Unmanned vehicle: also referred to as an unmanned driving vehicle or a wheeled mobile robot, an unmanned vehicle is an intelligent vehicle that is controlled by a computer device to implement unmanned driving.
- the computer device may act as an intelligent pilot of the vehicle.
- the unmanned vehicle may sense a road environment through an on-board sensing system, automatically plan a travel route, and control the vehicle to reach a predetermined destination.
- the unmanned vehicle may sense an environment around the vehicle through an on-board sensor, and control steering and a speed of the vehicle according to sensed information such as information about a road, a vehicle position, and obstacles. In this way the vehicle may be controlled to travel on the road safely and reliably.
- the unmanned vehicle may integrate a plurality of advanced technologies such as automatic control, architecture, artificial intelligence, and vision computing.
- Internet of vehicles (IoV): a protocol using network connections to connect the unmanned vehicle with objects such as another vehicle, a person, a road, and/or a service platform, to exchange information relating to automated driving of vehicles or the environment of the vehicles.
- This new-generation information communication technology may improve the intelligent driving level of the vehicle, provide a safe, comfortable, intelligent, and efficient driving experience for a user, and/or improve traffic running efficiency, thereby improving the intelligence levels of social traffic services.
- an on-board device on a vehicle effectively utilizes dynamic information of all vehicles in an information network platform by using a wireless communication technology, to provide different functional services during running of the vehicle.
- the IoV may include any of the following features: the IoV may ensure a distance between vehicles, to reduce a probability that a vehicle has a collision accident; and the IoV may help an owner of a vehicle to implement real-time navigation and improve the efficiency of traffic running through communication with other vehicles and network systems.
- Self-driving simulation: a computer-generated simulation of driving of the vehicle in a real environment.
- Self-driving simulation technology is the application of computer simulation technology to the automobile field; it may be more complex than a conventional advanced driver assistance system (ADAS) and may have high requirements on the decoupling and architecture of a system.
- a self-driving simulation system may digitally reproduce and generalize a real world in a mathematical modeling manner. Establishment of a relatively accurate, reliable, and effective simulation model is a key for ensuring high credibility of a simulation result.
- the simulation technology may transform a real controller into an algorithm in a simulation scenario, to test and verify a self-driving algorithm in combination with technologies such as sensor simulation.
- vehicle obstacles may be manually set on some positions of a map of the simulation scenario and are endowed with information such as speeds and attitudes to generate fake vehicle obstacle sensing signals; or position points of lane lines in a real environment are automatically sampled, and fake lane line sensing signals are generated at corresponding position points in the map of the simulation scenario, to simulate a real road condition scenario.
- a simulation scenario that is close to a real environment may be alternatively created based on a graphics processing unit (GPU) .
- the simulation scenario is similar to an animation in the real environment, and sensing information is calculated again based on an algorithm.
- an embodiment of this application provides a data processing method, to generate a simulation scenario that conforms more closely to real road conditions and reproduces the real road conditions of an actual travel process, which helps to obtain a more accurate path planning model (that is, a self-driving algorithm) through debugging.
- path planning model that is, a self-driving algorithm
- in some cases, the algorithm parameters of an original path planning model are not well tuned.
- the unmanned vehicle does not have a good and smooth planned speed curve in some scenarios, and consequently excessive throttling or braking is planned.
- in this case, a situation in which excessive throttling or braking is used is reproduced in the simulation scenario by acquiring the foregoing road test data from the actual travel process, and the algorithm parameters of the path planning model are debugged in a simulation system so that the path planning model outputs a good and smooth speed curve and avoids excessive throttling or braking in similar scenarios. In this way, the road test data is used to debug the path planning model more effectively, driving the update and iteration of the path planning model.
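The debugging loop described above can be sketched as follows. This is a toy illustration only: the smoothness metric (maximum finite-difference acceleration), the proportional-gain planner, and the gain-halving schedule are all assumptions, not the patent's actual algorithm.

```python
# Toy illustration: detect excessive throttle/brake in a planned speed curve
# and reduce one planner parameter until the curve is smooth. All names and
# the tuning rule are assumptions for illustration only.

def accelerations(speeds, dt=1.0):
    """Finite-difference accelerations of a planned speed curve (m/s^2)."""
    return [(b - a) / dt for a, b in zip(speeds, speeds[1:])]

def is_smooth(speeds, limit=2.0, dt=1.0):
    """True if no planned acceleration exceeds the comfort limit."""
    return all(abs(a) <= limit for a in accelerations(speeds, dt))

def plan_speeds(target, steps, gain):
    """Toy planner: approach the target speed with a proportional gain."""
    v, curve = 0.0, [0.0]
    for _ in range(steps):
        v += gain * (target - v)  # a large gain plans abrupt throttle
        curve.append(v)
    return curve

def tune_gain(target=15.0, steps=10, gain=1.0, limit=2.0):
    """Halve the gain until the planned curve stays within the comfort limit."""
    while not is_smooth(plan_speeds(target, steps, gain), limit):
        gain *= 0.5
    return gain
```

Here the original gain plans a harsh acceleration, and the tuning loop settles on a smaller gain whose planned curve stays within the comfort limit, mirroring how replayed road test data can drive parameter adjustment.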
- FIG. 1 is a schematic diagram of an implementation environment of a data processing method according to an embodiment of this application. Referring to FIG. 1, in the implementation environment, a vehicle 101 and a computer device 102 are included.
- the vehicle 101 is configured to acquire road test data in an actual travel process.
- functional modules such as an on-board sensor, a positioning component, a camera component, a controller, a data processor, and a self-driving system are installed on the vehicle 101.
- the foregoing functional modules may implement exchange and sharing between objects participating in traffic with the assistance of modern mobile communication and network technologies such as the IoV, 5th generation mobile networks (5G), and vehicle to everything (V2X), thereby having functions such as sensing and perception, decision planning, and control and execution in a complex environment.
- the vehicle 101 may be, for example, a conventional automobile or truck, an intelligent automobile, an unmanned vehicle, an electric vehicle, a bicycle, or a motorcycle.
- the vehicle 101 may be driven and operated manually by a driver, or may be driven by a self-driving system to implement unmanned driving.
- the on-board sensor includes data acquisition units such as a lidar, a millimeter-wave radar sensor, an acceleration sensor, a gyroscope sensor, a proximity sensor, and a pressure sensor.
- the road test data is a rosbag packet returned by a robot operating system (ROS) when the vehicle 101 performs a road test.
- Information acquired by the vehicle 101 based on functional modules such as the camera component and the on-board sensor during the road test is stored in the rosbag packet, and is used for sensing and tracking positions and motion attitudes of obstacles and lane lines.
- the rosbag packet further stores positioning data acquired by the positioning component based on a global positioning system (GPS) .
- GPS global positioning system
- the rosbag packet further stores vehicle attitude estimation of an inertial measurement unit (IMU, also referred to as an inertial sensor) on the vehicle 101.
- IMU inertial measurement unit
- the rosbag packet further stores timestamps of the foregoing various types of information.
- the vehicle 101 and the computer device 102 may be directly or indirectly connected to each other in a wired or wireless communication manner.
- the vehicle 101 and the computer device 102 are connected wirelessly through the IoV, which is not limited in the embodiments of this application.
- the computer device 102 is configured to debug parameters of the path planning model, to iterate and update the path planning model.
- the computer device 102 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center.
- the computer device 102 takes on primary computing work, and the vehicle 101 takes on secondary computing work; alternatively, the computer device 102 takes on secondary computing work, and the vehicle 101 takes on primary computing work; alternatively, collaborative computing is performed by using a distributed computing architecture between the vehicle 101 and the computer device 102.
- the vehicle 101 generally refers to one of a plurality of vehicles, and a terminal device configured to perform communication connection with the computer device 102 is installed on the vehicle 101.
- a type of the terminal device includes, but is not limited to at least one of an on-board terminal, a smartphone, a tablet computer, a smart watch, a smart speaker, an e-book reader, a Moving Picture Experts Group Audio Layer III (MP3) player, a Moving Picture Experts Group Audio Layer IV (MP4) player, a laptop portable computer, and a desktop computer.
- the terminal device is provided with a self-driving system, and the self-driving system may plan travel parameters of the vehicle 101 based on the path planning model debugged by the computer device 102.
- a person skilled in the art may learn that there may be more or fewer vehicles 101. For example, there may be only one vehicle 101, or there may be dozens of or hundreds of vehicles 101 or more.
- the quantity and the device type of the vehicle 101 are not limited in the embodiments of this application.
- FIG. 2 is a flowchart of a data processing method according to an embodiment of this application. Referring to FIG. 2, this embodiment is applicable to a computer device, and is described in detail below:
- the computer device determines at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process, the scenario data being used for simulating a path planning model.
- the computer device inputs the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and performs path planning on the vehicle, to output simulated travel information of the vehicle.
- the computer device performs parameter adjustment on the path planning model based on the simulated travel information.
- Because the scenario data used for self-driving simulation is generated according to road test data acquired by a vehicle in an actual travel process, the scenario data may truly reproduce actual road conditions and noise interference, giving the path planning model a more realistic simulation effect during debugging. This helps to obtain, through debugging, a path planning model (that is, a self-driving algorithm) with higher accuracy, thereby improving the intelligence of a self-driving vehicle.
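The three steps above (determine scenario data, simulate the path planning model, adjust its parameters) can be sketched end to end. All function and field names, the toy one-parameter "model", and the adjustment rule are assumptions for illustration only:

```python
# Minimal sketch of the three-step loop: build time-ordered scenario data from
# road test records, replay it through a model, and nudge one parameter.

def determine_scenario_data(road_test_data):
    """Sort each signal type's (timestamp, value) records into a time sequence."""
    return {kind: sorted(records) for kind, records in road_test_data.items()}

def run_simulation(model_params, scenario):
    """Toy 'path planning model': planned speed = recorded speed * one parameter."""
    return [(t, model_params["speed_scale"] * v)
            for t, v in scenario["vehicle_speed"]]

def adjust_parameters(model_params, simulated, recorded):
    """Nudge the parameter toward reproducing the recorded speeds."""
    err = sum(sv - rv for (_, sv), (_, rv) in zip(simulated, recorded))
    model_params["speed_scale"] -= 0.01 * err / len(simulated)
    return model_params
```

For example, feeding recorded speeds through the loop once moves `speed_scale` toward the value that reproduces the road test data.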
- the performing path planning on the vehicle, to output simulated travel information of the vehicle includes:
- the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
- the scenario data includes the initial positions of the obstacles
- the performing path planning on the vehicle based on the simulation travel position includes:
- the performing path planning on the vehicle based on the simulation travel position includes:
- the performing path planning on the vehicle based on the simulation travel position includes:
- the determining information about a traffic light having a shortest distance to the simulation travel position based on the simulation travel position of the vehicle includes:
- the determining at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process includes:
- parsing the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp;
- the obtaining at least one time sequence corresponding to the at least one type of initial scenario data includes:
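As one illustration of the traffic-light step mentioned above, the light nearest to the simulated travel position can be selected by Euclidean distance. The field names (`position`, `state`) are assumptions, not the patent's data layout:

```python
# Illustrative sketch: pick the traffic light with the shortest Euclidean
# distance to the vehicle's simulated travel position.
import math

def nearest_traffic_light(position, traffic_lights):
    """position: (x, y); traffic_lights: dicts with 'position' and 'state' keys."""
    return min(traffic_lights, key=lambda tl: math.dist(position, tl["position"]))
```

Path planning would then consult the returned light's state (e.g. stop on red) for the upcoming segment.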
- FIG. 3 is a flowchart of a data processing method according to an embodiment of this application. Referring to FIG. 3, this embodiment is applicable to a computer device, and is described in detail below:
- the computer device obtains road test data acquired by a vehicle in a travel process.
- the vehicle is configured to acquire the road test data in an actual travel process.
- the vehicle includes transportation tools such as a conventional automobile, an intelligent automobile, an unmanned vehicle, an electric vehicle, a bicycle, and a motorcycle.
- the vehicle may be driven and operated manually by a driver, or may be driven by a self-driving system to implement unmanned driving.
- the road test data is used for representing various types of electrical signals acquired by the vehicle in an actual travel process.
- the road test data is a rosbag packet uploaded by an ROS of the vehicle to the computer device.
- the rosbag packet includes sensing signals, video signals, and positioning signals. Because the vehicle and vehicle obstacles have different sensing signals, video signals, and positioning signals at different moments, these sensing signals, video signals, and positioning signals have respective timestamps.
- the sensing signals are acquired by an on-board sensor, the video signals are acquired by a camera component, and the positioning signals are acquired by a positioning component, where the sensing signals and the video signals are all used for sensing and tracking positions and motion attitudes of the obstacles and lane lines. For example, sensing signals recorded by an IMU sensor are used for estimating a vehicle attitude of the vehicle.
- the vehicle acquires at least one of sensing signals, video signals, or positioning signals in the actual travel process based on at least one of the on-board sensor, the camera component, or the positioning component, and stores at least one of the sensing signals, the video signals, or the positioning signals and respective timestamps correspondingly, to obtain the road test data.
- the vehicle encapsulates the road test data into a rosbag packet based on the ROS, and sends the rosbag packet to the computer device.
- the computer device receives the rosbag packet and parses the rosbag packet to obtain the road test data.
- the computer device encapsulates the road test data based on a Transmission Control Protocol (TCP), a User Datagram Protocol (UDP), or an Internet Protocol (IP), and the encapsulation protocol of the road test data is not specifically limited in the embodiments of this application.
- TCP Transmission Control Protocol
- UDP User Datagram Protocol
- IP Internet Protocol
- the computer device parses a header field of the packet, to obtain a type identifier of the packet. If the type identifier indicates that the packet is a rosbag packet, the computer device parses data fields of the packet, to obtain the road test data.
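The header-then-data-fields parsing described above might look like the following sketch. The 4-byte big-endian header layout and the type-identifier value are assumptions, not the actual rosbag or packet wire format:

```python
# Hypothetical sketch: read a type identifier from a fixed-size header, and
# unpack the data fields only when the identifier marks a rosbag payload.
import struct

ROSBAG_TYPE = 1  # assumed identifier value

def parse_packet(packet: bytes):
    type_id, payload_len = struct.unpack_from("!HH", packet, 0)
    if type_id != ROSBAG_TYPE:
        return None                      # not a rosbag packet; skip it
    return packet[4:4 + payload_len]     # data fields holding the road test data

packet = struct.pack("!HH", ROSBAG_TYPE, 5) + b"hello"
```

Checking the type identifier first lets the computer device discard unrelated packets without touching their data fields.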
- the computer device stores the road test data, for example, stores at least one of the sensing signals, the video signals, or the positioning signals in the road test data and the respective timestamps correspondingly.
- the signals and the timestamps are correspondingly stored based on key values, or the signals and the timestamps are correspondingly stored based on storage pages.
- the road test data is stored in a predefined storage container structure.
- the storage container structure includes at least one of a class or a structure.
- map configuration information, a simulation duration, scenario data, and the like of self-driving simulation are stored, where the map configuration information is used for indicating whether to use an electronic map or lane lines.
- a timestamp sequence, traffic light information, lane line information, and vehicle obstacle information are stored, where the vehicle obstacle information includes at least initial positions and motion information of obstacles.
- the traffic light information, the lane line information, the vehicle obstacle information, and the timestamp sequence are stored correspondingly.
- vehicle obstacle identifiers, whether the vehicle obstacles include the vehicle, geometric information and position and motion state estimation information of the vehicle obstacles, and vehicle obstacle media are stored.
- types of the vehicle obstacles, trajectory tracking, motion planning, a parameter model, and information about the vehicle are stored. According to a nested relationship provided in the storage container structure, logical relationships between complex information in the road test data can be clearly represented, which helps to transform the road test data into scenario data.
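The nested storage container described above could be sketched with Python dataclasses. All field names here are assumptions chosen to mirror the description, not the patent's actual structure:

```python
# Sketch of a nested storage container: simulation configuration holds scenario
# data, which in turn holds per-obstacle records.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ObstacleInfo:
    obstacle_id: str
    initial_position: Tuple[float, float]
    motion_info: List[Tuple[float, float]] = field(default_factory=list)

@dataclass
class ScenarioData:
    timestamps: List[float] = field(default_factory=list)
    traffic_light_info: List[dict] = field(default_factory=list)
    lane_line_info: List[dict] = field(default_factory=list)
    obstacles: List[ObstacleInfo] = field(default_factory=list)

@dataclass
class SimulationConfig:
    map_config: dict        # e.g. whether to use an electronic map or lane lines
    duration_s: float
    scenario: ScenarioData = field(default_factory=ScenarioData)
```

The nesting makes the logical relationships between the complex pieces of road test data explicit, as the description notes.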
- the computer device parses the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp.
- the computer device determines at least one of the sensing signals, the video signals, or the positioning signals in the road test data as the at least one type of initial scenario data. Because at least one of the sensing signals, the video signals, or the positioning signals and timestamps are stored correspondingly, each piece of initial scenario data includes at least one timestamp.
- the computer device obtains at least one time sequence corresponding to the at least one type of initial scenario data, and determines the at least one time sequence as at least one type of scenario data.
- the scenario data is used for simulating a path planning model.
- the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
- the computer device sorts at least one element in the initial scenario data in ascending order of timestamps, to obtain a time sequence.
- the time sequence, also referred to as a dynamic series, is a series formed by arranging values (that is, the initial scenario data) with the same statistical indicator in chronological order.
- a main objective of time sequence analysis is to predict the future according to existing historical data, and the time sequence in the embodiments of this application is used for performing self-driving simulation.
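Constructing a time sequence then amounts to sorting by timestamp; a minimal sketch, assuming elements of one type of initial scenario data are given as (timestamp, value) pairs:

```python
def to_time_sequence(initial_scenario_data):
    """Sort scenario elements in ascending order of their timestamps,
    yielding one time sequence as described above. The (timestamp, value)
    pair layout is an illustrative assumption."""
    return sorted(initial_scenario_data, key=lambda element: element[0])

raw = [(1040, "b"), (1000, "a"), (1080, "c")]
sequence = to_time_sequence(raw)
assert sequence == [(1000, "a"), (1040, "b"), (1080, "c")]
```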
- the computer device sorts various pieces of initial scenario data according to timestamps and stores the various pieces of initial scenario data in the storage container structure.
- the computer device stores the traffic light information, the lane line information, the vehicle obstacle information, and positioning information of the vehicle correspondingly according to timestamps.
- taking the vehicle obstacle information as an example, the vehicle obstacle IDs, the geometric information (sizes such as the length, the width, and the height) of the vehicle obstacles, and the motion attitude estimation information of the vehicle obstacles are sequentially stored in a next layer of the vehicle obstacle information in the storage container structure, and are stored as media of a trajectory tracking type according to timestamps. The relative distance between the vehicle and each vehicle obstacle when the vehicle senses that vehicle obstacle for the first time also needs to be recorded.
- a weight, the positioning information, and motion attitude estimation information of the vehicle are also stored in the storage container structure.
- the computer device parses the road surface information in the road test data, and stores position coordinates (x, y, z) of the vehicle in an electronic map and timestamps correspondingly based on the positioning information and the motion attitude estimation information of the vehicle.
- the computer device deletes information about vehicle obstacles of which a relative distance to the vehicle is greater than a first threshold, or the computer device deletes information about vehicle obstacles of which a relative distance to a lane line is greater than a second threshold, or the computer device deletes information about vehicle obstacles that do not interfere with a motion trajectory of the vehicle, thereby preventing some useless vehicle obstacles from interfering with a simulation operation and improving the data processing efficiency of self-driving simulation.
- a first threshold is any value greater than or equal to 0.
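The first pruning rule above might be sketched as follows; the obstacle and position layouts are assumptions for illustration:

```python
import math

def prune_obstacles(obstacles, ego_positions, first_threshold):
    """Drop obstacles whose minimum relative distance to the vehicle along
    its trajectory exceeds the first threshold, so that useless obstacles
    do not interfere with the simulation operation."""
    kept = []
    for obs in obstacles:
        d_min = min(math.dist(obs["position"], p) for p in ego_positions)
        if d_min <= first_threshold:
            kept.append(obs)
    return kept

ego_trajectory = [(0.0, 0.0), (5.0, 0.0)]
obstacles = [{"id": 1, "position": (6.0, 0.0)},
             {"id": 2, "position": (500.0, 0.0)}]
assert [o["id"] for o in prune_obstacles(obstacles, ego_trajectory, 50.0)] == [1]
```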
- a possible implementation that the computer device determines at least one type of scenario data based on the road test data acquired by the vehicle in a travel process is provided.
- a time sequence can be automatically obtained when the road test data is stored in the storage container structure, that is, a storage step and a parsing step of the road test data are coupled together without performing additional redundant and complex parsing operations, thereby simplifying an obtaining procedure of scenario data.
- the computer device may alternatively parse the road test data by predefining a parsing algorithm, so that road test data are stored in the storage container structure according to timestamps based on the parsing algorithm, to obtain the scenario data.
- an obtaining manner of the scenario data is not specifically limited.
- the computer device inputs the at least one type of scenario data into a path planning model, and determines a simulation travel position of the vehicle based on the at least one type of scenario data.
- the computer device reads the at least one type of scenario data from the storage container structure, and inputs the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle. Since the position coordinates of the vehicle and timestamps are correspondingly stored when the road surface information is parsed, the position coordinates of the vehicle at any moment may be determined from the storage container structure, and the position point indicated by those position coordinates in the electronic map is determined as the simulation travel position of the vehicle at that moment.
- the path planning model includes at least one of the following functional modules: an obstacle simulation unit, an obstacle prediction unit, a path planning unit, a control unit, and a vehicle simulation unit.
- the obstacle simulation unit includes an obstacle subunit and a lane line/traffic light subunit, where the obstacle subunit is configured to simulate sensing signals of other vehicle obstacles in a simulation scenario, and the lane line/traffic light subunit is configured to simulate sensing signals of a lane line or a traffic light in the simulation scenario.
- FIG. 4 is a schematic diagram of a logical structure of a path planning model according to an embodiment of this application.
- a path planning model 400 includes an obstacle simulation unit 401, an obstacle prediction unit 402, a path planning unit 403, a control unit 404, and a vehicle simulation unit 405, where the obstacle simulation unit 401 includes an obstacle subunit 4011 and a lane line/traffic light subunit 4012.
- the vehicle obstacle information (including the initial positions and motion information of obstacles) is inputted into the obstacle subunit 4011.
- the lane line information or the traffic light information is inputted into the lane line/traffic light subunit 4012.
- the path planning unit 403 plans the simulated travel information (including a speed curve and a travel trajectory of the vehicle) of the vehicle, and inputs the travel trajectory of the vehicle into the control unit 404.
- the control unit 404 generates, according to the travel trajectory of the vehicle, a control command (CMD) for controlling the vehicle to move, and inputs the CMD of the vehicle into the vehicle simulation unit 405.
- the vehicle simulation unit 405 controls motion of the vehicle in the simulation scenario based on the CMD of the vehicle, and inputs motion data of the vehicle into each of the path planning unit 403 and the obstacle simulation unit 401, to iteratively control motion of the vehicle at a next moment.
- geometric information of the vehicle and the road surface information in the scenario data also need to be inputted into the vehicle simulation unit 405, to simulate a vehicle that conforms more closely to the actual size and real road conditions in the simulation scenario.
- the computer device performs path planning on the vehicle based on the simulation travel position to output simulated travel information of the vehicle.
- scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
- the computer device obtains, in response to a distance between the simulation travel position of the vehicle and an initial position of an obstacle being less than a distance threshold, motion information of the obstacle; and performs path planning on the vehicle based on the motion information of the obstacle.
- the distance threshold is a relative distance between the vehicle and an obstacle acquired in the road test data when the vehicle senses the obstacle for the first time.
- whether to trigger an obstacle to start to move in the simulation scenario is decided based on the relative distance between the vehicle and the obstacle, so that the motion of the obstacle in the simulation scenario is simulated to a higher degree: the relative configurations between the vehicle and the vehicle obstacles in the rosbag packet are kept, rather than the recorded motion being mechanically replayed according to the timestamps in the rosbag packet. In this way, even when the speed of the vehicle changes due to changes of the algorithm, the relative trigger condition between the vehicle and the obstacle is still kept.
- triggering an obstacle to move in the simulation scenario refers to playing motion information of the obstacle according to a timestamp sequence from a trigger moment, where the trigger moment refers to a moment at which the distance between the vehicle and the obstacle is exactly less than the distance threshold.
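The trigger rule might be sketched as follows; the obstacle record layout is an illustrative assumption:

```python
import math

def maybe_trigger(ego_position, obstacle):
    """Trigger an obstacle to start replaying its recorded motion once the
    vehicle comes within the obstacle's distance threshold (the relative
    distance at first sensing in the road test), as described above."""
    if not obstacle["triggered"]:
        if math.dist(ego_position, obstacle["initial_position"]) < obstacle["threshold"]:
            obstacle["triggered"] = True  # motion info is played from this trigger moment
    return obstacle["triggered"]

obs = {"initial_position": (20.0, 0.0), "threshold": 15.0, "triggered": False}
assert maybe_trigger((0.0, 0.0), obs) is False   # too far: stays at initial position
assert maybe_trigger((10.0, 0.0), obs) is True   # within the threshold: starts to move
```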
- FIG. 5 is a schematic principle diagram of path planning according to an embodiment of this application.
- a distance threshold of an obstacle 501 is s1, and a distance threshold of an obstacle 502 is s2.
- when the vehicle 500 travels to a certain position, the obstacle 501 (already triggered) has traveled to another position, and the obstacle 502 is still located at its initial position; if the distance between the position of the vehicle 500 and the initial position of the obstacle 502 is less than s2, the obstacle 502 is triggered to start to move in the simulation scenario.
- the computer device determines lane line information of a road section in which the simulation travel position is located based on the simulation travel position of the vehicle; and performs path planning on the vehicle based on the lane line information of the road section.
- when querying lane line information of a current road section, the computer device searches for a calibration position having the shortest distance to the simulation travel position.
- each calibration position may be stored together with corresponding information about lane lines within a target range of that calibration position.
- the lane line information stored corresponding to the found calibration position may then be treated as the lane line information of the road section in which the simulation travel position is located.
- the computer device when determining the lane line information according to the simulation travel position, adopts a K-dimensional tree (KD tree) search method, where a KD tree is a data search structure that may quickly search for a nearest neighbor and an approximate nearest neighbor in a high-dimensional space.
- the computer device uses position information of the vehicle in each road section of the road test data as an index (that is, a calibration position), and stores the lane line information of each road section as a data member in the KD tree. During simulation, the computer device extracts from the KD tree the index having the shortest distance to the simulation travel position, uses the data member corresponding to that nearest index as the lane line information of the current road section of the vehicle, and sends the lane line information to a path planning unit to perform path planning based on the self-driving algorithm.
- Such a KD tree-based search method may greatly improve the search efficiency of the lane line information.
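The index/data-member lookup might be sketched as below; a linear scan stands in for the KD tree here (in practice a KD-tree structure such as scipy.spatial.cKDTree would replace it for efficiency), and the calibration positions and lane-line payloads are illustrative assumptions:

```python
import math

# Calibration positions act as indexes; lane line info as data members.
calibration_positions = [(0.0, 0.0), (50.0, 0.0), (100.0, 0.0)]
lane_line_info = ["lanes@section0", "lanes@section1", "lanes@section2"]

def lane_lines_for(sim_position):
    """Return the lane line data member stored under the calibration
    position nearest to the simulation travel position."""
    nearest = min(range(len(calibration_positions)),
                  key=lambda i: math.dist(calibration_positions[i], sim_position))
    return lane_line_info[nearest]

assert lane_lines_for((48.0, 2.0)) == "lanes@section1"
```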
- FIG. 6 is a schematic principle diagram of path planning according to an embodiment of this application.
- when the vehicle travels to one position, the on-board sensor senses information about a lane line within an R1 range; and when the vehicle travels to a position (x1, y1), the on-board sensor senses information about a lane line within an R2 range.
- the computer device determines information about a traffic light having a shortest distance to the simulation travel position and performs path planning on the vehicle based on the information about the traffic light.
- when querying information about a nearest traffic light, the computer device searches for a calibration position having the shortest distance to the simulation travel position. Corresponding traffic light information may be stored for each calibration position; the computer device may then determine the information about the traffic light closest to that calibration position as the information about the traffic light having the shortest distance to the simulation travel position.
- a method for searching for the traffic light information based on the KD tree is similar to the method for searching for the lane line information based on the KD tree, so that details are not described herein again.
- Such a KD tree-based search method may greatly improve the search efficiency of the traffic light information.
- steps 304 and 305 a possible implementation that the computer device invokes a path planning model to perform path planning on the vehicle to output simulated travel information of the vehicle is provided. After planning a path of the vehicle, the computer device simulates the motion of the vehicle in the simulation scenario, to debug parameters of the path planning model based on motion feedback.
- the computer device performs parameter adjustment on the path planning model based on the simulated travel information.
- the path planning model outputs a speed curve and a travel trajectory of the vehicle, to control the vehicle to perform simulation motion in the simulation scenario in the travel trajectory according to the speed curve.
- the computer device inputs the road surface information, the speed curve, and the travel trajectory into a vehicle simulation unit, and (x, y, z) coordinates in the road surface information form a road surface.
- the computer device may calculate a displacement per frame of the vehicle in the travel trajectory according to the speed curve based on a kinematic model, and reposition the displacement per frame on the road surface, to simulate the motion of the vehicle.
- a current simulation travel position of the vehicle, the speed curve, the travel trajectory, and a slope of the road surface may be taken into consideration.
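The per-frame displacement computation described above might be sketched as follows, under a simple kinematic model in which each frame's displacement is speed times the frame interval; slope and road-surface effects are omitted in this sketch:

```python
def displacement_per_frame(speed_curve, dt):
    """Integrate the speed curve over fixed frames to get the displacement
    per frame along the travel trajectory. The per-frame sampling of the
    speed curve is an illustrative assumption."""
    return [speed * dt for speed in speed_curve]

speeds = [10.0, 10.0, 12.0]  # m/s sampled once per frame
assert displacement_per_frame(speeds, 0.5) == [5.0, 5.0, 6.0]
```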
- FIG. 7 is an effect diagram of comparison between an artificial scenario and a simulation scenario according to an embodiment of this application.
- 701 corresponds to an artificial scenario
- 702 corresponds to a simulation scenario.
- each curve in the first row represents how the distance between the vehicle and a front car changes over time, and
- each curve in the second row represents a speed change curve of the front car as sensed by the vehicle.
- interference from noise is apparently reproduced in the distance curves and the speed curves, and causes of the noise include, but are not limited to: the driving manner of the driver of the front car, the influence of the traffic flow status, sensing errors of the self-driving algorithm, sensor noise, and the like. Therefore, the simulation scenario has a high reproduction degree and simulation fidelity.
- FIG. 8 is a principle flowchart of a data processing method according to an embodiment of this application.
- a computer device acquires a rosbag packet of road test data; in step 2, the computer device generates scenario data based on the rosbag packet; in step 3, the computer device inputs the scenario data into a path planning model to perform simulation; in step 4, the path planning model generates a simulation report; in step 5, the computer device performs quality evaluation on the path planning model based on the simulation report; and in step 6, the computer device performs parameter adjustment on the path planning model (that is, a self-driving algorithm) , to cause an adjusted path planning model to have more accurate path planning performance.
- a distance between the simulation travel position of the vehicle and an actual travel position thereof is not greater than n (n ≥ 0) meters; a depth percentage of a brake pedal is not greater than m% (m ≥ 0); and a distance between the simulation travel position of the vehicle and a center position of a lane is not greater than k (k ≥ 0) meters, where m, n, and k are set by a technician, so as to measure the performance of the path planning model quantitatively.
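Assuming the three indicators all bound deviations from above, the quantitative check might be sketched as below; the report field names are illustrative assumptions:

```python
def passes_evaluation(report, n, m, k):
    """Check the quantitative indicators: deviation from the actual travel
    position (<= n meters), brake-pedal depth percentage (<= m%), and
    offset from the lane center (<= k meters)."""
    return (report["position_deviation_m"] <= n
            and report["brake_depth_pct"] <= m
            and report["lane_center_offset_m"] <= k)

report = {"position_deviation_m": 0.4,
          "brake_depth_pct": 30.0,
          "lane_center_offset_m": 0.2}
assert passes_evaluation(report, n=0.5, m=40.0, k=0.3) is True
assert passes_evaluation(report, n=0.3, m=40.0, k=0.3) is False
```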
- scenario data used for self-driving simulation is generated according to road test data acquired by a vehicle in an actual travel process, the scenario data may truly reproduce actual road conditions and noise interference, to cause a path planning model to have a more realistic simulation effect during debugging, which helps to obtain a path planning model with higher accuracy through debugging, that is, to obtain a self-driving algorithm with higher accuracy, thereby improving the intelligence of a self-driving vehicle.
- FIG. 9 is a schematic structural diagram of a data processing apparatus according to an embodiment of this application.
- the apparatus includes:
- a determining module 901 configured to determine at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process, the scenario data being used for simulating a path planning model;
- a path planning module 902 configured to input the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and perform path planning on the vehicle, to output simulated travel information of the vehicle;
- an adjustment module 903 configured to perform parameter adjustment on the path planning model based on the simulated travel information.
- scenario data used for self-driving simulation is generated according to road test data acquired by a vehicle in an actual travel process, the scenario data may truly reproduce actual road conditions and noise interference, to cause a path planning model to have a more realistic simulation effect during debugging, which helps to obtain a path planning model with higher accuracy through debugging, that is, to obtain a self-driving algorithm with higher accuracy, thereby improving the intelligence of a self-driving vehicle.
- the path planning module 902 includes:
- a determining unit configured to determine a simulation travel position of the vehicle based on the at least one type of scenario data
- a path planning unit configured to perform path planning on the vehicle based on the simulation travel position to obtain the simulated travel information of the vehicle.
- the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
- the path planning unit is configured to:
- the path planning unit is configured to:
- the path planning unit includes:
- a determining subunit configured to determine information about a traffic light having a shortest distance to the simulation travel position based on the simulation travel position of the vehicle
- a planning subunit configured to perform path planning on the vehicle based on the information about the traffic light.
- the determining subunit is configured to:
- the determining module 901 includes:
- a parsing unit configured to parse the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp;
- an obtaining and determining unit configured to obtain at least one time sequence corresponding to the at least one type of initial scenario data, and determine the at least one time sequence as the at least one type of scenario data.
- the obtaining and determining unit is configured to:
- when the data processing apparatus provided in the foregoing embodiment processes data, the division of the foregoing functional modules is merely used as an example for description.
- in actual application, the function distribution may be implemented by different functional modules according to requirements, that is, an internal structure of the computer device is divided into different functional modules, to implement all or some of the functions described above.
- the data processing apparatus provided in the foregoing embodiment and the embodiments of the data processing method belong to the same concept. For a specific implementation process, reference may be made to the embodiments of the data processing method. Details are not described herein again.
- FIG. 10 is a structural block diagram of an on-board terminal 1000 according to an exemplary embodiment of this application.
- a device type of the on-board terminal 1000 may include: a smartphone, a tablet computer, a Moving Picture Experts Group Audio Layer III (MP3) player, a Moving Picture Experts Group Audio Layer IV (MP4) player, a notebook computer, or a desktop computer.
- the on-board terminal 1000 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or another name.
- the on-board terminal 1000 includes a processor 1001 and a memory 1002.
- the processor 1001 includes one or more processing cores, for example, a 4-core processor or an 8-core processor.
- the processor 1001 may be implemented in at least one hardware form of a digital signal processor (DSP) , a field-programmable gate array (FPGA) , and a programmable logic array (PLA) .
- the processor 1001 includes a main processor and a coprocessor.
- the main processor is a processor configured to process data in an awake state, and is also referred to as a central processing unit (CPU) .
- the coprocessor is a low power consumption processor configured to process data in a standby state.
- the processor 1001 may be integrated with a graphics processing unit (GPU) .
- the GPU is configured to render and draw content that needs to be displayed on a display screen.
- the processor 1001 further includes an artificial intelligence (AI) processor.
- the AI processor is configured to process computing operations related to machine learning.
- the memory 1002 includes one or more computer-readable storage media.
- the computer-readable storage medium is non-transient.
- the memory 1002 further includes a high-speed random access memory and a non-volatile memory, such as one or more magnetic disk storage devices or flash storage devices.
- a non-transient computer-readable storage medium in the memory 1002 is configured to store machine readable instructions, and the machine readable instructions are configured to be executable by the processor 1001 to implement the data processing method provided in the embodiments of this application.
- the on-board terminal 1000 may optionally include a peripheral device interface 1003 and at least one peripheral device.
- the processor 1001, the memory 1002, and the peripheral device interface 1003 may be connected through a bus or a signal cable.
- each peripheral device may be connected to the peripheral device interface 1003 through a bus, a signal line, or a circuit board.
- the peripheral device includes: at least one of a radio frequency (RF) circuit 1004, a touch display screen 1005, a camera component 1006, an audio circuit 1007, a positioning component 1008, and a power supply 1009.
- the peripheral device interface 1003 may be configured to connect at least one input/output (I/O) -related peripheral device to the processor 1001 and the memory 1002.
- the processor 1001, the memory 1002, and the peripheral device interface 1003 are integrated on the same chip or circuit board.
- any one or two of the processor 1001, the memory 1002, and the peripheral device interface 1003 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
- the RF circuit 1004 is configured to receive and transmit an RF signal, which is also referred to as an electromagnetic signal.
- the RF circuit 1004 communicates with a communication network and other communication devices through the electromagnetic signal.
- the RF circuit 1004 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal.
- the RF circuit 1004 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chip set, a subscriber identity module card, and the like.
- the RF circuit 1004 may communicate with other terminals through at least one wireless communication protocol.
- the wireless communication protocol includes, but is not limited to: a metropolitan area network, generations of mobile communication networks (2G, 3G, 4G, and 5G) , a wireless local area network and/or a wireless fidelity (Wi-Fi) network.
- the RF circuit 1004 may further include a circuit related to near field communication (NFC) , which is not limited in this application.
- the display screen 1005 is configured to display a user interface (UI) .
- the UI may include a graph, a text, an icon, a video, and any combination thereof.
- the display screen 1005 is a touch display screen, the display screen 1005 is further capable of acquiring touch signals on or above a surface of the display screen 1005.
- the touch signal may be inputted to the processor 1001 for processing as a control signal.
- the display screen 1005 is further configured to provide a virtual button and/or a virtual keyboard, which is also referred to as a soft button and/or a soft keyboard.
- the display screen 1005 is a flexible display screen disposed on a curved surface or a folded surface of the on-board terminal 1000.
- the display screen 1005 is even set in a non-rectangular irregular pattern, namely, a special-shaped screen.
- the display screen 1005 is manufactured by using a material such as a liquid crystal display (LCD) or an organic light-emitting diode (OLED) .
- the camera component 1006 is configured to acquire an image or a video.
- the camera component 1006 includes a front-facing camera and a rear-facing camera.
- the front-facing camera is disposed on the front panel of the terminal
- the rear-facing camera is disposed on a back surface of the terminal.
- there are at least two rear-facing cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, to implement a background blurring function through fusion of the main camera and the depth-of-field camera, panoramic and virtual reality (VR) photographing through fusion of the main camera and the wide-angle camera, or other fusion photographing functions.
- the camera component 1006 further includes a flash.
- the flash is a single color temperature flash or a double color temperature flash.
- the double color temperature flash is a combination of a warm light flash and a cold light flash, and is used for light compensation under different color temperatures.
- the audio circuit 1007 includes a microphone and a speaker.
- the microphone is configured to acquire sound waves of a user and an environment, and convert the sound waves into an electrical signal to input to the processor 1001 for processing, or input to the RF circuit 1004 for implementing voice communication.
- the microphone is an array microphone or an omni-directional acquisition microphone.
- the speaker is configured to convert electrical signals from the processor 1001 or the RF circuit 1004 into sound waves.
- the speaker is a conventional thin-film speaker or a piezoelectric ceramic speaker.
- the audio circuit 1007 further includes an earphone jack.
- the positioning component 1008 is configured to determine a current geographic location of the on-board terminal 1000, to implement navigation or a location based service (LBS) .
- the positioning component 1008 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou System of China, the GLONASS System of Russia, or the GALILEO System of the European Union.
- the power supply 1009 is configured to supply power to components in the on-board terminal 1000.
- the power supply 1009 is an alternating current, a direct current, a disposable battery, or a rechargeable battery.
- the rechargeable battery supports wired charging or wireless charging.
- the rechargeable battery is further configured to support a fast charge technology.
- the on-board terminal 1000 further includes one or more sensors 1010.
- the one or more sensors 1010 include, but are not limited to: an acceleration sensor 1011, a gyroscope sensor 1012, a pressure sensor 1013, a fingerprint sensor 1014, an optical sensor 1015, and a proximity sensor 1016.
- the acceleration sensor 1011 detects accelerations on three coordinate axes of a coordinate system established by the on-board terminal 1000.
- the acceleration sensor 1011 is configured to detect components of the gravity acceleration on the three coordinate axes.
- the processor 1001 may control, according to a gravity acceleration signal acquired by the acceleration sensor 1011, the touch display screen 1005 to display the UI in a landscape view or a portrait view.
- the acceleration sensor 1011 is further configured to acquire motion data of a game or a user.
- the gyroscope sensor 1012 detects a body direction and a rotation angle of the on-board terminal 1000.
- the gyroscope sensor 1012 acquires a 3D action of the user on the on-board terminal 1000 together with the acceleration sensor 1011.
- the processor 1001 implements the following functions according to the data acquired by the gyroscope sensor 1012: motion sensing (for example, changing the UI according to a tilt operation of the user) , image stabilization during shooting, game control, and inertial navigation.
- the pressure sensor 1013 is disposed on a side frame of the on-board terminal 1000 and/or a lower layer of the touch display screen 1005.
- when the pressure sensor 1013 is disposed on the side frame, a holding signal of the user on the on-board terminal 1000 may be detected, and the processor 1001 performs left/right hand identification or a quick operation according to the holding signal acquired by the pressure sensor 1013.
- the processor 1001 controls an operable control on the UI interface according to a pressure operation performed by the user on the touch display screen 1005.
- the operable control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
- the fingerprint sensor 1014 is configured to acquire a user's fingerprint, and the processor 1001 identifies a user's identity according to the fingerprint acquired by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies a user's identity according to the acquired fingerprint. When identifying that the user's identity is a trusted identity, the processor 1001 authorizes the user to perform related sensitive operations.
- the sensitive operations include: unlocking a screen, viewing encrypted information, downloading software, paying, changing a setting, and the like.
- the fingerprint sensor 1014 is disposed on a front face, a back face, or a side face of the on-board terminal 1000. When a physical button or a vendor logo is disposed on the on-board terminal 1000, the fingerprint sensor 1014 may be integrated together with the physical button or the vendor logo.
- the optical sensor 1015 is configured to acquire ambient light intensity.
- the processor 1001 controls display brightness of the touch display screen 1005 according to the ambient light intensity acquired by the optical sensor 1015. Specifically, when the ambient light intensity is relatively high, the display brightness of the touch display screen 1005 is increased; and when the ambient light intensity is relatively low, the display brightness of the touch display screen 1005 is reduced.
- the processor 1001 further dynamically adjusts a camera parameter of the camera component 1006 according to the ambient light intensity acquired by the optical sensor 1015.
- the proximity sensor 1016 is also referred to as a distance sensor and is generally disposed at the front panel of the on-board terminal 1000.
- the proximity sensor 1016 is configured to acquire a distance between the user and the front face of the on-board terminal 1000.
- when the proximity sensor 1016 detects that the distance between the user and the front face of the on-board terminal 1000 gradually decreases, the processor 1001 controls the touch display screen 1005 to be switched from a screen-on state to a screen-off state; and when the proximity sensor 1016 detects that the distance between the user and the front face of the on-board terminal 1000 gradually increases, the processor 1001 controls the touch display screen 1005 to be switched from the screen-off state to the screen-on state.
- a person skilled in the art may understand that the structure shown in FIG. 10 does not constitute any limitation on the on-board terminal 1000 and that the on-board terminal may include more or fewer components than those shown in the figure, a combination of some components, or different component arrangements.
- FIG. 11 is a schematic structural diagram of a computer device according to an embodiment of this application.
- the computer device 1100 may vary greatly due to different configurations or performance, and the computer device 1100 includes one or more central processing units (CPUs) 1101 and one or more memories 1102.
- the memory 1102 stores machine readable instructions, and the machine readable instructions are loaded and executed by the processor 1101 to implement the data processing method provided in the foregoing embodiments.
- the computer device 1100 further includes components such as a wired or wireless network interface, a keyboard, and an input/output (I/O) interface for ease of input and output, and the computer device 1100 further includes other components for implementing functions of the device. Details are not described herein again.
- a non-transitory computer-readable storage medium, for example, a memory including machine readable instructions, is further provided.
- the machine readable instructions may be executed by a processor in a terminal to implement the data processing method in the foregoing embodiments.
- the computer-readable storage medium includes a read-only memory (ROM) , a random access memory (RAM) , a compact disc read-only memory (CD-ROM) , a magnetic tape, a floppy disk, an optical data storage device, or the like.
- a computer program product or a computer program is further provided, including one or more pieces of program code, the one or more pieces of program code being stored in a non-transitory computer-readable storage medium.
- One or more processors of a computer device can read the one or more pieces of program code from the non-transitory computer-readable storage medium, and the one or more processors execute the one or more pieces of program code to enable the computer device to perform the data processing method in the foregoing embodiments.
- the program is stored in a non-transitory computer-readable storage medium.
- the non-transitory storage medium mentioned above is a ROM, a magnetic disk, an optical disc, or the like.
Abstract
A data processing method, apparatus, and system, a computer device, and a non-transitory storage medium, which belong to the field of computer technologies. In the data processing method, scenario data used for self-driving simulation is generated according to road test data acquired by a vehicle (101, 500) in an actual travel process. The scenario data can faithfully reproduce actual road conditions and noise interference, so that a path planning model has a more realistic simulation environment during debugging. This helps to obtain a path planning model with higher accuracy through debugging, that is, a self-driving algorithm with higher accuracy, thereby improving the intelligence of a self-driving vehicle.
Description
CROSS REFERENCE TO RELATED APPLICATION
This disclosure claims the benefits of priority to Chinese application number 202011288734.6, filed on 17 November 2020 and entitled “DATA PROCESSING METHOD, APPARATUS, AND SYSTEM, COMPUTER DEVICE, AND NON-TRANSITORY STORAGE MEDIUM” , which is hereby incorporated by reference in its entirety.
This application relates to the field of computer technologies, and in particular, to a data processing method, apparatus, and system, a computer device, and a non-transitory storage medium.
Background Art
With the development of computer technologies and mobile communication technologies, self-driving vehicles have started to attract wide attention. A self-driving vehicle, also referred to as an unmanned vehicle or a wheeled mobile robot, is an intelligent vehicle that is controlled by a computer device to implement unmanned driving. The cost and time cycle of debugging an unmanned vehicle control system based on driving in a real environment are high. Therefore, a control system or self-driving algorithm of an unmanned vehicle is usually debugged in advance based on a simulation scenario (that is, a scenario generated by a simulation system), so as to meet the rapid iteration requirements of debugging the self-driving algorithm.
In a debugging process based on the simulation scenario, vehicle obstacles are manually designed and parameters such as speeds and attitudes are set, to simulate the road conditions of a real environment. The self-driving vehicle senses and responds to the manually designed vehicle obstacles, so as to intelligently plan a subsequent driving speed curve. However, noise interference is generally ignored for the manually designed vehicle obstacles. As a result, the simulation scenario cannot duplicate real road conditions, which negatively affects the accuracy of the self-driving algorithm and the intelligence of the self-driving vehicle.
Summary of the Invention
Embodiments of this application provide a data processing method, apparatus, and system, a computer device, and a non-transitory storage medium, to improve the accuracy of a self-driving algorithm and improve the intelligence of a self-driving vehicle. The technical solutions are as follows:
According to one aspect, a computer-implemented data processing method is provided, including:
determining at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process;
inputting the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and performing path planning on the vehicle, to output simulated travel information of the vehicle; and
performing parameter adjustment on the path planning model based on the simulated travel information.
In a possible implementation, the performing path planning on the vehicle, to output simulated travel information of the vehicle includes:
determining a simulation travel position of the vehicle based on the at least one type of scenario data; and
performing path planning on the vehicle based on the simulation travel position to obtain the simulated travel information of the vehicle.
In a possible implementation, the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
In a possible implementation, the scenario data includes the initial positions of the obstacles, and the performing path planning on the vehicle based on the simulation travel position includes:
obtaining, in response to determining that a distance between the simulation travel position of the vehicle and an initial position of an obstacle is less than a distance threshold, motion information of said obstacle; and
performing path planning on the vehicle based on the motion information of the obstacle.
In a possible implementation, the scenario data includes the lane line information, and the performing path planning on the vehicle based on the simulation travel position includes:
determining, based on the simulation travel position of the vehicle, lane line information of a road section in which the simulation travel position is located; and
performing path planning on the vehicle based on the lane line information of the road section.
In a possible implementation, the scenario data includes the traffic light information, and the performing path planning on the vehicle based on the simulation travel position includes:
determining, based on the simulation travel position of the vehicle, information about a traffic light having a shortest distance to the simulation travel position; and
performing path planning on the vehicle based on the information about the traffic light.
In a possible implementation, the determining information about a traffic light having a shortest distance to the simulation travel position based on the simulation travel position of the vehicle comprises:
searching for a calibration position which has a shortest distance to the simulation travel position; determining information of a traffic light closest to the calibration position; and
treating the traffic light closest to the calibration position as the traffic light having the shortest distance to the simulation travel position.
In a possible implementation, the determining at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process includes:
parsing the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp; and
obtaining at least one time sequence corresponding to the at least one type of initial scenario data, and determining the at least one time sequence as the at least one type of scenario data.
In a possible implementation, the obtaining at least one time sequence corresponding to the at least one type of initial scenario data includes:
for any type of initial scenario data, sorting at least one element in the initial scenario data in ascending order of timestamps, to obtain a time sequence.
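As a brief illustrative sketch (the dict-of-pairs data layout and all names here are assumptions for illustration, not part of the disclosure), the ascending-timestamp sorting described above might look like:

```python
def to_time_sequences(initial_scenario_data):
    """Turn each type of initial scenario data -- a list of
    (timestamp, element) pairs parsed from the road test data --
    into a time sequence sorted in ascending order of timestamps."""
    return {
        data_type: [element for _, element in sorted(pairs, key=lambda pair: pair[0])]
        for data_type, pairs in initial_scenario_data.items()
    }
```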
According to one aspect, a data processing apparatus is provided, including a processor configured to execute:
a determining module, configured to determine at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process;
a path planning module, configured to input the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and perform path planning on the vehicle, to output simulated travel information of the vehicle; and
an adjustment module, configured to perform parameter adjustment on the path planning model based on the simulated travel information.
In a possible implementation, the path planning module includes:
a determining unit, configured to determine a simulation travel position of the vehicle based on the at least one type of scenario data; and
a path planning unit, configured to perform path planning on the vehicle based on the simulation travel position to obtain the simulated travel information of the vehicle.
In a possible implementation, the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
In a possible implementation, the scenario data includes the initial positions of the obstacles, and the path planning unit is configured to:
obtain, in response to determining that a distance between the simulation travel position of the vehicle and an initial position of an obstacle is less than a distance threshold, motion information of the obstacle; and
perform path planning on the vehicle based on the motion information of the obstacle.
In a possible implementation, the scenario data includes the lane line information, and the path planning unit is configured to:
determine, based on the simulation travel position of the vehicle, lane line information of a road section in which the simulation travel position is located; and
perform path planning on the vehicle based on the lane line information of the road section.
In a possible implementation, the scenario data includes the traffic light information, and the path planning unit includes:
a determining subunit, configured to determine, based on the simulation travel position of the vehicle, information about a traffic light having a shortest distance to the simulation travel position; and
a planning subunit, configured to perform path planning on the vehicle based on the information about the traffic light.
In a possible implementation, the determining subunit is configured to:
search for a calibration position which has a shortest distance to the simulation travel position; determine information of a traffic light closest to the calibration position; and
treat the traffic light closest to the calibration position as the traffic light having the shortest distance to the simulation travel position.
In a possible implementation, the determining module includes:
a parsing unit, configured to parse the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp; and
an obtaining and determining unit, configured to obtain at least one time sequence corresponding to the at least one type of initial scenario data, and determine the at least one time sequence as the at least one type of scenario data.
In a possible implementation, the obtaining and determining unit is configured to:
for any type of initial scenario data, sort at least one element in the initial scenario data in ascending order of timestamps, to obtain a time sequence.
According to one aspect, a data processing system is provided, including a vehicle and a computer device, where
the vehicle is configured to acquire road test data in a travel process and send the road test data to the computer device; and
the computer device is configured to determine at least one type of scenario data based on the road test data acquired by the vehicle in the travel process, the scenario data being used for simulating a path planning model; input the at least one type of scenario data into the path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and perform path planning on the vehicle, to output simulated travel information of the vehicle; and perform parameter adjustment on the path planning model based on the simulated travel information.
According to one aspect, a computer device is provided, including one or more processors and a non-transitory computer readable storage medium storing machine readable instructions which are executable by the one or more processors to implement the data processing method according to any one of the foregoing possible implementations.
According to one aspect, a non-transitory storage medium is provided, storing machine readable instructions which are executable by a processor to implement the data processing method according to any one of the foregoing possible implementations.
According to one aspect, a computer program product or a computer program is provided, including machine readable instructions, the machine readable instructions being stored in a computer-readable storage medium. One or more processors of a computer device can read the machine readable instructions from the computer-readable storage medium, and the one or more processors execute the machine readable instructions to enable the computer device to perform the data processing method according to any one of the foregoing possible implementations.
The technical solutions provided in the embodiments of this application may achieve one or more of the following beneficial effects:
As scenario data used for self-driving simulation is generated according to road test data acquired by a vehicle in an actual travel process, the scenario data may more accurately reproduce actual road conditions and noise interference. As a result the path planning model may access a more realistic simulation during debugging, which helps to produce a path planning model with higher accuracy, thereby improving the intelligence of the self-driving vehicle.
To describe the technical solutions in the embodiments of this application more clearly, the following briefly describes the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show only some embodiments of this application, and a person of ordinary skill in the art may still derive other accompanying drawings according to the accompanying drawings without creative efforts.
FIG. 1 is a schematic diagram of an implementation environment of a data processing method according to an embodiment of this application;
FIG. 2 is a flowchart of a data processing method according to an embodiment of this application;
FIG. 3 is a flowchart of a data processing method according to an embodiment of this application;
FIG. 4 is a schematic diagram of a logical structure of a path planning model according to an embodiment of this application;
FIG. 5 is a schematic principle diagram of path planning according to an embodiment of this application;
FIG. 6 is a schematic principle diagram of path planning according to an embodiment of this application;
FIG. 7 is an effect diagram of comparison between an artificial scenario and a simulation scenario according to an embodiment of this application;
FIG. 8 is a principle flowchart of a data processing method according to an embodiment of this application;
FIG. 9 is a schematic structural diagram of a data processing apparatus according to an embodiment of this application;
FIG. 10 is a structural block diagram of an on-board terminal 1000 according to an exemplary embodiment of this application; and
FIG. 11 is a schematic structural diagram of a computer device according to an embodiment of this application.
Detailed Description of Embodiments
In order to make objectives, technical solutions, and advantages of this application clearer, implementations of this application are further described in detail below with reference to the accompanying drawings.
The terms "first" , "second" , and the like in this application are used for distinguishing between same items or similar items of which effects and functions are basically the same. It should be understood that "first" , "second" , and "n-th" do not have a dependency relationship in logic or time sequence, and a quantity and an execution order thereof are not limited.
The term "at least one" in this application means one or more and "a plurality of" means two or more. For example, a plurality of first positions means two or more first positions.
Terms used in the embodiments of this application are explained below:
Unmanned vehicle: also referred to as an unmanned driving vehicle or a wheeled mobile robot, an intelligent vehicle that is controlled by a computer device to implement unmanned driving. The computer device may act as an intelligent pilot of the vehicle. The unmanned vehicle may sense a road environment through an on-board sensing system, automatically plan a travel route, and control the vehicle to reach a predetermined destination. The unmanned vehicle may sense an environment around the vehicle through an on-board sensor, and control steering and a speed of the vehicle according to sensed information such as information about a road, a vehicle position, and obstacles. In this way, the vehicle may be controlled to travel on the road safely and reliably. The unmanned vehicle may integrate a plurality of advanced technologies such as automatic control, architecture, artificial intelligence, and visual computing.
Internet of vehicles (IoV) : a protocol using network connections to connect the unmanned vehicle with objects such as another vehicle, a person, a road, and/or a service platform to exchange information relating to automated driving of vehicles or the environment of the vehicles. This new-generation information communication technology may improve the intelligent driving level of the vehicle and provide a safe, comfortable, intelligent, and efficient driving experience for a user and/or improve the traffic running efficiency, thereby improving intelligence levels of social traffic services. Optionally, an on-board device on a vehicle effectively utilizes dynamic information of all vehicles in an information network platform by using a wireless communication technology, to provide different functional services during running of the vehicle. The IoV may include any of the following features: the IoV may ensure a distance between vehicles, to reduce a probability that a vehicle has a collision accident; and the IoV may help an owner of a vehicle to implement real-time navigation and improve the efficiency of traffic running through communication with other vehicles and network systems.
Self-driving simulation: computer-generated simulation of driving of the vehicle in a real environment. Self-driving simulation technology is the application of computer simulation technology to the automobile field; it may be more complex than a conventional advanced driver assistance system (ADAS) and may have high requirements on decoupling and the architecture of a system. A self-driving simulation system may digitally reproduce and generalize the real world in a mathematical modeling manner. Establishment of a relatively accurate, reliable, and effective simulation model is key to ensuring high credibility of a simulation result. The simulation technology may transform a real controller into an algorithm in a simulation scenario, to test and verify a self-driving algorithm in combination with technologies such as sensor simulation.
Generally, in a conventional process of automatically generating a road test simulation scenario, vehicle obstacles may be manually set on some positions of a map of the simulation scenario and are endowed with information such as speeds and attitudes to generate fake vehicle obstacle sensing signals; or position points of lane lines in a real environment are automatically sampled, and fake lane line sensing signals are generated at corresponding position points in the map of the simulation scenario, to simulate a real road condition scenario. Optionally, a simulation scenario that is close to a real environment may be alternatively created based on a graphics processing unit (GPU) . The simulation scenario is similar to an animation in the real environment, and sensing information is calculated again based on an algorithm.
In a conventional self-driving simulation, sensing information such as motion states of vehicle obstacles and lane lines during an actual road test cannot be truly reflected, and noise interference is generally ignored in information such as manually designed vehicle obstacles or lane lines. Consequently, the simulation scenario cannot reproduce real road conditions well. As a result, the simulation effect of a conventional self-driving simulation system is relatively poor, and the self-driving algorithm adopted by the path planning model cannot be iterated and updated quickly and accurately, which negatively affects the accuracy of the self-driving algorithm and the intelligence of the self-driving vehicle.
In view of this, an embodiment of this application provides a data processing method, to build a simulation scenario that conforms more closely to real road conditions and reproduces the real road conditions of an actual travel process, which helps to obtain a more accurate path planning model (that is, a self-driving algorithm) through debugging. For example, during a self-driving road test, the algorithm parameters of an original path planning model may be poorly tuned. As a result, the unmanned vehicle fails to plan a good and smooth speed curve in some scenarios, and consequently excessive throttle or braking is planned. Such a situation is reproduced in the simulation scenario by acquiring the foregoing road test data from the actual travel process, and the algorithm parameters of the path planning model are debugged in a simulation system, to cause the path planning model to output a good and smooth speed curve and avoid excessive throttle or braking in a similar scenario, thereby debugging the path planning model by using the road test data more effectively and driving update and iteration of the path planning model.
FIG. 1 is a schematic diagram of an implementation environment of a data processing method according to an embodiment of this application. Referring to FIG. 1, in the implementation environment, a vehicle 101 and a computer device 102 are included.
The vehicle 101 is configured to acquire road test data in an actual travel process. Optionally, functional modules such as an on-board sensor, a positioning component, a camera component, a controller, a data processor, and a self-driving system are installed on the vehicle 101. The foregoing functional modules may implement exchange and sharing between objects participating in traffic with the assistance of modern mobile communication and network technologies such as the IoV, 5th generation mobile networks (5G) , and vehicle to everything (V2X) , thereby having functions such as sensing and perception, decision planning, and control and execution in a complex environment.
The vehicle 101 may, for example, be a conventional automobile or truck, an intelligent automobile, an unmanned vehicle, an electric vehicle, a bicycle, or a motorcycle. The vehicle 101 may be driven and operated manually by a driver, or may be driven by a self-driving system to implement unmanned driving.
Optionally, the on-board sensor includes data acquisition units such as a lidar, a millimeter-wave radar sensor, an acceleration sensor, a gyroscope sensor, a proximity sensor, and a pressure sensor.
In some embodiments, the road test data is a rosbag packet returned by a robot operating system (ROS) when the vehicle 101 performs a road test. Information acquired by the vehicle 101 based on functional modules such as the camera component and the on-board sensor during the road test is stored in the rosbag packet, and is used for sensing and tracking positions and motion attitudes of obstacles and lane lines. Optionally, the rosbag packet further stores positioning data acquired by the positioning component based on a global positioning system (GPS) . Optionally, the rosbag packet further stores vehicle attitude estimation of an inertial measurement unit (IMU, also referred to as an inertial sensor) on the vehicle 101. Optionally, the rosbag packet further stores timestamps of the foregoing various types of information.
The vehicle 101 and the computer device 102 may be directly or indirectly connected to each other in a wired or wireless communication manner. For example, the vehicle 101 and the computer device 102 are connected wirelessly through the IoV, which is not limited in the embodiments of this application.
The computer device 102 is configured to debug parameters of the path planning model, to iterate and update the path planning model. Optionally, the computer device 102 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. Optionally, the computer device 102 takes on primary computing work, and the vehicle 101 takes on secondary computing work; alternatively, the computer device 102 takes on secondary computing work, and the vehicle 101 takes on primary computing work; alternatively, collaborative computing is performed by using a distributed computing architecture between the vehicle 101 and the computer device 102.
Optionally, the vehicle 101 generally refers to one of a plurality of vehicles, and a terminal device configured to perform communication connection with the computer device 102 is installed on the vehicle 101. A type of the terminal device includes, but is not limited to, at least one of an on-board terminal, a smartphone, a tablet computer, a smart watch, a smart speaker, an e-book reader, a Moving Picture Experts Group Audio Layer III (MP3) player, a Moving Picture Experts Group Audio Layer IV (MP4) player, a laptop portable computer, and a desktop computer. The terminal device is provided with a self-driving system, and the self-driving system may plan travel parameters of the vehicle 101 based on the path planning model debugged by the computer device 102.
A person skilled in the art may learn that there may be more or fewer vehicles 101. For example, there may be only one vehicle 101, or there may be dozens of or hundreds of vehicles 101 or more. The quantity and the device type of the vehicle 101 are not limited in the embodiments of this application.
FIG. 2 is a flowchart of a data processing method according to an embodiment of this application. Referring to FIG. 2, this embodiment is applicable to a computer device, and is described in detail below:
201: The computer device determines at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process, the scenario data being used for simulating a path planning model.
202: The computer device inputs the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and performs path planning on the vehicle, to output simulated travel information of the vehicle.
203: The computer device performs parameter adjustment on the path planning model based on the simulated travel information.
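Steps 201 to 203 can be sketched as a simple parse-simulate-adjust loop. In this minimal sketch, all callables and names are illustrative placeholders, not part of the disclosed system:

```python
def simulate_and_adjust(parse, run_model, adjust, road_test_data, params, iterations=3):
    """Sketch of steps 201-203 (hypothetical helper names).

    parse      -- derives scenario data from road test data (step 201)
    run_model  -- runs the path planning model on the scenario data and
                  returns simulated travel information (step 202)
    adjust     -- returns updated model parameters based on the
                  simulated travel information (step 203)
    """
    scenario_data = parse(road_test_data)
    for _ in range(iterations):
        travel_info = run_model(params, scenario_data)
        params = adjust(params, travel_info)
    return params
```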
According to the method provided in this embodiment of this application, scenario data used for self-driving simulation is generated according to road test data acquired by a vehicle in an actual travel process, the scenario data may truly reproduce actual road conditions and noise interference, to cause a path planning model to have a more realistic simulation effect during debugging, which helps to obtain a path planning model with higher accuracy through debugging, that is, to obtain a self-driving algorithm with higher accuracy, thereby improving the intelligence of a self-driving vehicle.
In a possible implementation, the performing path planning on the vehicle, to output simulated travel information of the vehicle includes:
determining a simulation travel position of the vehicle based on the at least one type of scenario data; and
performing path planning on the vehicle based on the simulation travel position to obtain the simulated travel information of the vehicle.
In a possible implementation, the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
In a possible implementation, the scenario data includes the initial positions of the obstacles, and the performing path planning on the vehicle based on the simulation travel position includes:
obtaining, in response to that a distance between the simulation travel position of the vehicle and an initial position of an obstacle is less than a distance threshold, motion information of the obstacle; and
performing path planning on the vehicle based on the motion information of the obstacle.
In a possible implementation, if the scenario data includes the lane line information, the performing path planning on the vehicle based on the simulation travel position includes:
determining, based on the simulation travel position of the vehicle, lane line information of a road section in which the simulation travel position is located; and
performing path planning on the vehicle based on the lane line information of the road section.
In a possible implementation, if the scenario data includes the traffic light information, the performing path planning on the vehicle based on the simulation travel position includes:
determining information about a traffic light having a shortest distance to the simulation travel position based on the simulation travel position of the vehicle; and
performing path planning on the vehicle based on the information about the traffic light.
In a possible implementation, the determining information about a traffic light having a shortest distance to the simulation travel position based on the simulation travel position of the vehicle includes:
searching for a calibration position which has a shortest distance to the simulation travel position; determining information of a traffic light closest to the calibration position; and
treating the traffic light closest to the calibration position as the traffic light having the shortest distance to the simulation travel position.
In a possible implementation, the determining at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process includes:
parsing the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp; and
obtaining at least one time sequence corresponding to the at least one type of initial scenario data, and determining the at least one time sequence as the at least one type of scenario data.
In a possible implementation, the obtaining at least one time sequence corresponding to the at least one type of initial scenario data includes:
for any type of initial scenario data, sorting at least one element in the initial scenario data in ascending order of timestamps, to obtain a time sequence.
Any combination of all the foregoing optional technical solutions may be used to form an optional embodiment of the present disclosure, and details are not described herein again.
FIG. 3 is a flowchart of a data processing method according to an embodiment of this application. Referring to FIG. 3, this embodiment is applicable to a computer device, and is described in detail below:
301: The computer device obtains road test data acquired by a vehicle in a travel process.
The vehicle is configured to acquire the road test data in an actual travel process. Optionally, the vehicle includes transportation tools such as a conventional automobile, an intelligent automobile, an unmanned vehicle, an electric vehicle, a bicycle, and a motorcycle. The vehicle may be driven and operated manually by a driver, or may be driven by a self-driving system to implement unmanned driving.
The road test data is used for representing various types of electrical signals acquired by the vehicle in an actual travel process.
In an exemplary scenario, the road test data is a rosbag packet uploaded by a Robot Operating System (ROS) of the vehicle to the computer device. Optionally, the rosbag packet includes sensing signals, video signals, and positioning signals. Because the vehicle and vehicle obstacles have different sensing signals, video signals, and positioning signals at different moments, these sensing signals, video signals, and positioning signals have respective timestamps. The sensing signals are acquired by an on-board sensor, the video signals are acquired by a camera component, and the positioning signals are acquired by a positioning component, where the sensing signals and the video signals are both used for sensing and tracking positions and motion attitudes of the obstacles and lane lines. For example, sensing signals recorded by an IMU sensor are used for estimating a vehicle attitude of the vehicle.
In some embodiments, in an actual travel process, the vehicle acquires at least one of sensing signals, video signals, or positioning signals in the actual travel process based on at least one of the on-board sensor, the camera component, or the positioning component, and stores at least one of the sensing signals, the video signals, or the positioning signals and respective timestamps correspondingly, to obtain the road test data. The vehicle encapsulates the road test data into a rosbag packet based on the ROS, and sends the rosbag packet to the computer device. The computer device receives the rosbag packet and parses the rosbag packet to obtain the road test data.
Optionally, the computer device encapsulates the road test data based on a Transmission Control Protocol (TCP), a User Datagram Protocol (UDP), or an Internet Protocol (IP), and an encapsulation protocol of the road test data is not specifically limited in the embodiments of this application.
In some embodiments, after receiving any packet, the computer device parses a header field of the packet, to obtain a type identifier of the packet. If the type identifier indicates that the packet is a rosbag packet, the computer device parses data fields of the packet, to obtain the road test data.
In some embodiments, after obtaining the road test data through parsing, the computer device stores the road test data, for example, stores at least one of the sensing signals, the video signals, or the positioning signals in the road test data and the respective timestamps correspondingly. Optionally, the signals and the timestamps are correspondingly stored based on key values, or the signals and the timestamps are correspondingly stored based on storage pages.
In an exemplary scenario, the road test data is stored in a predefined storage container structure. Optionally, the storage container structure includes at least one of a class or a structure. For example, in a first layer of the storage container structure, map configuration information, a simulation duration, scenario data, and the like of self-driving simulation are stored, where the map configuration information is used for indicating whether to use an electronic map or lane lines. In a next layer of the scenario data, a timestamp sequence, traffic light information, lane line information, and vehicle obstacle information are stored, where the vehicle obstacle information includes at least initial positions and motion information of obstacles. Optionally, the traffic light information, the lane line information, the vehicle obstacle information, and the timestamp sequence are stored correspondingly. In a next layer of the vehicle obstacle information, vehicle obstacle identifiers (IDs) , whether the vehicle obstacles include the vehicle, geometric information and position motion state estimation information of the vehicle obstacles, and vehicle obstacle media are stored. In a next layer of the vehicle obstacle media, types of the vehicle obstacles, trajectory tracking, motion planning, a parameter model, and information about the vehicle are stored. According to a nested relationship provided in the storage container structure, logical relationships between complex information in the road test data can be clearly represented, which helps to transform the road test data into scenario data.
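As an illustrative sketch, the nested layers of the storage container structure described above may be modeled as follows; all class and field names are hypothetical, chosen only to mirror the layer structure in this embodiment, and are not taken from the actual implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ObstacleMedia:
    obstacle_type: str         # type of the vehicle obstacle
    trajectory_tracking: list  # tracked trajectory points
    motion_planning: dict      # motion planning data
    parameter_model: dict      # parameter model of the obstacle

@dataclass
class ObstacleInfo:
    obstacle_id: int           # vehicle obstacle identifier (ID)
    is_ego_vehicle: bool       # whether this obstacle is the vehicle itself
    geometry: tuple            # geometric information (length, width, height)
    motion_state: dict         # position/motion state estimation information
    media: Optional[ObstacleMedia] = None

@dataclass
class ScenarioData:
    timestamps: List[float] = field(default_factory=list)  # timestamp sequence
    traffic_lights: list = field(default_factory=list)     # traffic light information
    lane_lines: list = field(default_factory=list)         # lane line information
    obstacles: List[ObstacleInfo] = field(default_factory=list)

@dataclass
class StorageContainer:
    map_config: dict           # whether to use an electronic map or lane lines
    simulation_duration: float # simulation duration of the self-driving simulation
    scenario: ScenarioData = field(default_factory=ScenarioData)
```

Such a nested definition makes the logical relationships between layers explicit, which is the property the storage container structure relies on.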
302: The computer device parses the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp.
In some embodiments, after obtaining the road test data through parsing, the computer device determines at least one of the sensing signals, the video signals, or the positioning signals in the road test data as the at least one type of initial scenario data. Because at least one of the sensing signals, the video signals, or the positioning signals and timestamps are stored correspondingly, each piece of initial scenario data includes at least one timestamp.
303: The computer device obtains at least one time sequence corresponding to the at least one type of initial scenario data, and determines the at least one time sequence as at least one type of scenario data.
The scenario data is used for simulating a path planning model. Optionally, the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
Optionally, for any type of initial scenario data, the computer device sorts at least one element in the initial scenario data in ascending order of timestamps, to obtain a time sequence. The time sequence is also referred to as a dynamic series and refers to a series formed by arranging values (that is, the initial scenario data) with the same statistical indicator in chronological order. A main objective of time sequence analysis is to predict the future according to existing historical data, and the time sequence in the embodiments of this application is used for performing self-driving simulation.
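The sorting operation described above may be sketched as follows; the element layout (a `timestamp` key per element) is an assumption made for illustration only.

```python
def to_time_sequence(initial_scenario_data):
    """Sort elements of one type of initial scenario data in ascending
    order of timestamps, yielding a time sequence."""
    return sorted(initial_scenario_data, key=lambda e: e["timestamp"])

# Example: unordered sensing elements become a chronological series.
elements = [
    {"timestamp": 3.0, "signal": "s3"},
    {"timestamp": 1.0, "signal": "s1"},
    {"timestamp": 2.0, "signal": "s2"},
]
sequence = to_time_sequence(elements)
```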
In some embodiments, the computer device sorts various pieces of initial scenario data according to timestamps and stores them in the storage container structure. For example, the computer device correspondingly stores the traffic light information, the lane line information, the vehicle obstacle information, and positioning information of the vehicle according to timestamps. Further, taking the vehicle obstacle information as an example, the vehicle obstacle IDs, geometric information (sizes such as the length, the width, and the height) of the vehicle obstacles, and motion attitude estimation information of the vehicle obstacles are sequentially stored in a next layer of the vehicle obstacle information in the storage container structure and are stored as media of a trajectory tracking type according to timestamps; the relative distance between the vehicle and each vehicle obstacle when the vehicle senses that vehicle obstacle for the first time also needs to be recorded. In addition, a weight, the positioning information, and motion attitude estimation information of the vehicle are also stored in the storage container structure.
In some embodiments, the computer device parses the road surface information in the road test data, and stores position coordinates (x, y, z) of the vehicle in an electronic map and timestamps correspondingly based on the positioning information and the motion attitude estimation information of the vehicle.
In some embodiments, the computer device deletes information about vehicle obstacles of which a relative distance to the vehicle is greater than a first threshold, or the computer device deletes information about vehicle obstacles of which a relative distance to a lane line is greater than a second threshold, or the computer device deletes information about vehicle obstacles that do not interfere with a motion trajectory of the vehicle, thereby preventing some useless vehicle obstacles from interfering with a simulation operation and improving the data processing efficiency of self-driving simulation. Each of the first threshold and the second threshold is any value greater than or equal to 0.
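The deletion of distant vehicle obstacles may be sketched as follows, assuming obstacle records carry a position field; only the first-threshold case (relative distance to the vehicle) is shown.

```python
import math

def filter_obstacles(obstacles, ego_position, first_threshold):
    """Drop obstacles whose relative distance to the vehicle exceeds the
    first threshold, keeping only those that may affect the simulation."""
    kept = []
    for obs in obstacles:
        distance = math.dist(obs["position"], ego_position)
        if distance <= first_threshold:
            kept.append(obs)
    return kept

obstacles = [
    {"id": 1, "position": (5.0, 0.0)},
    {"id": 2, "position": (200.0, 0.0)},  # too far away, deleted
]
kept = filter_obstacles(obstacles, ego_position=(0.0, 0.0), first_threshold=100.0)
```

Filtering before simulation reduces the number of obstacles the simulation loop must process, which is the efficiency gain the embodiment describes.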
In steps 302 and 303, a possible implementation in which the computer device determines at least one type of scenario data based on the road test data acquired by the vehicle in a travel process is provided. According to a predefined storage container structure, a time sequence can be automatically obtained when the road test data is stored in the storage container structure, that is, a storage step and a parsing step of the road test data are coupled together without performing additional redundant and complex parsing operations, thereby simplifying an obtaining procedure of scenario data. In some embodiments, the computer device may alternatively parse the road test data by predefining a parsing algorithm, so that the road test data is stored in the storage container structure according to timestamps based on the parsing algorithm, to obtain the scenario data. In the embodiments of this application, an obtaining manner of the scenario data is not specifically limited.
304: The computer device inputs the at least one type of scenario data into a path planning model, and determines a simulation travel position of the vehicle based on the at least one type of scenario data.
In some embodiments, the computer device reads the at least one type of scenario data from the storage container structure, and inputs the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle. Since position coordinates of the vehicle and timestamps are correspondingly stored when the road surface information is parsed, position coordinates of the vehicle at any moment may be determined from the storage container structure, and a position point indicated by the position coordinates in the electronic map is determined as a simulation travel position of the vehicle at that moment.
In some embodiments, the path planning model includes at least one of the following functional modules: an obstacle simulation unit, an obstacle prediction unit, a path planning unit, a control unit, and a vehicle simulation unit. Optionally, the obstacle simulation unit includes an obstacle subunit and a lane line/traffic light subunit, where the obstacle subunit is configured to simulate sensing signals of other vehicle obstacles in a simulation scenario, and the lane line/traffic light subunit is configured to simulate sensing signals of a lane line or a traffic light in the simulation scenario.
FIG. 4 is a schematic diagram of a logical structure of a path planning model according to an embodiment of this application. Referring to FIG. 4, a path planning model 400 includes an obstacle simulation unit 401, an obstacle prediction unit 402, a path planning unit 403, a control unit 404, and a vehicle simulation unit 405, where the obstacle simulation unit 401 includes an obstacle subunit 4011 and a lane line/traffic light subunit 4012. Based on the above, when the at least one type of scenario data is inputted into the path planning model, the vehicle obstacle information (including the initial positions and motion information of obstacles) is inputted into the obstacle subunit 4011, and the lane line information or the traffic light information is inputted into the lane line/traffic light subunit 4012. Then, data stored in the obstacle simulation unit 401 is inputted into the obstacle prediction unit 402, and the obstacle prediction unit 402 predicts simulated travel information of the vehicle obstacles. In addition, a virtual lane line or a virtual traffic light in the simulation scenario is predicted based on data stored in the lane line/traffic light subunit 4012, and the virtual lane line or the virtual traffic light that is predicted and the simulated travel information of the vehicle obstacles that is predicted by the obstacle prediction unit 402 are both inputted into the path planning unit 403. The path planning unit 403 plans the simulated travel information (including a speed curve and a travel trajectory of the vehicle) of the vehicle, and inputs the travel trajectory of the vehicle into the control unit 404. The control unit 404 generates, according to the travel trajectory of the vehicle, a control command (CMD) for controlling the vehicle to move, and inputs the CMD of the vehicle into the vehicle simulation unit 405.
The vehicle simulation unit 405 controls motion of the vehicle in the simulation scenario based on the CMD of the vehicle, and inputs motion data of the vehicle into each of the path planning unit 403 and the obstacle simulation unit 401, to iteratively control motion of the vehicle at a next moment. It should be noted that geometric information of the vehicle and the road surface information in the scenario data also need to be inputted into the vehicle simulation unit 405, to simulate a vehicle that better conforms to the actual size and real road conditions in the simulation scenario.
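The unit-to-unit data flow of FIG. 4 may be sketched as a single iteration step; the obstacle prediction, path planning, control, and vehicle simulation logic below are toy stand-ins for the corresponding units, not the actual algorithms.

```python
def run_simulation_step(scenario, ego_state):
    """One iteration of the FIG. 4 loop with toy stand-ins for each unit."""
    # Obstacle prediction unit: advance each obstacle (x, y, vx, vy) one step.
    predicted = [(x + vx, y + vy) for (x, y, vx, vy) in scenario["obstacles"]]
    # Path planning unit: move straight ahead unless a predicted obstacle
    # sits directly in front of the vehicle.
    ex, ey = ego_state
    blocked = any(abs(px - ex) < 1.0 and 0 < py - ey < 5.0
                  for px, py in predicted)
    step = (0.0, 0.0) if blocked else (0.0, 1.0)
    # Control unit + vehicle simulation unit: apply the command to the
    # vehicle state, which feeds back into the next iteration.
    return (ex + step[0], ey + step[1])

state = (0.0, 0.0)
scenario = {"obstacles": [(10.0, 10.0, 0.0, 0.0)]}  # far away, not blocking
state = run_simulation_step(scenario, state)
```

The return value becomes the `ego_state` of the next call, mirroring how the vehicle simulation unit feeds motion data back into the path planning and obstacle simulation units.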
305: The computer device performs path planning on the vehicle based on the simulation travel position to output simulated travel information of the vehicle.
Optionally, since the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle, how to fully use the scenario data to perform path planning will be discussed in the following three cases in the embodiments of this application.
1. Vehicle obstacle information-based path planning
In some embodiments, if the scenario data includes the initial positions of the obstacles, the computer device obtains, in response to that a distance between the simulation travel position of the vehicle and an initial position of an obstacle is less than a distance threshold, motion information of the obstacle; and performs path planning on the vehicle based on the motion information of the obstacle. Optionally, the distance threshold is a relative distance between the vehicle and an obstacle acquired in the road test data when the vehicle senses the obstacle for the first time.
In the foregoing process, whether to trigger an obstacle to start to move in the simulation scenario is decided based on the relative distance between the vehicle and the obstacle, so that the motion of the obstacle is simulated to a higher degree: the relative scenarios between the vehicle and the vehicle obstacles in the rosbag packet are kept, rather than the rosbag packet being mechanically replayed based on its timestamps. In this way, even when the speed of the vehicle changes due to changes of the algorithm, the relative trigger condition between the vehicle and the obstacle is still kept. It should be noted that triggering an obstacle to move in the simulation scenario refers to playing motion information of the obstacle according to a timestamp sequence from a trigger moment, where the trigger moment refers to the moment at which the distance between the vehicle and the obstacle first becomes less than the distance threshold.
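The trigger condition may be sketched as follows, assuming each obstacle record carries its first-sensing distance threshold from the rosbag packet; the record layout is hypothetical.

```python
import math

def update_triggers(ego_position, obstacles, sim_time):
    """Trigger an obstacle to start playing its motion information once the
    vehicle first comes within that obstacle's distance threshold."""
    for obs in obstacles:
        if obs["trigger_time"] is None:  # not yet triggered
            distance = math.dist(ego_position, obs["initial_position"])
            if distance < obs["distance_threshold"]:
                obs["trigger_time"] = sim_time  # motion replays from here

obstacles = [
    {"initial_position": (8.0, 0.0), "distance_threshold": 10.0, "trigger_time": None},
    {"initial_position": (50.0, 0.0), "distance_threshold": 10.0, "trigger_time": None},
]
update_triggers(ego_position=(0.0, 0.0), obstacles=obstacles, sim_time=0.0)
```

Because the trigger depends on relative distance rather than rosbag timestamps, it survives changes in the vehicle's speed caused by algorithm changes.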
In an exemplary scenario, FIG. 5 is a schematic principle diagram of path planning according to an embodiment of this application. Referring to FIG. 5, it is assumed that a distance threshold of an obstacle 501 is s1, and a distance threshold of an obstacle 502 is s2. At a simulation moment t = t0, a vehicle 500 travels to a position, and if a distance between an initial position of the obstacle 501 and the position of the vehicle 500 is less than s1, the obstacle 501 is triggered to start to move in the simulation scenario. At a simulation moment t = t1, the vehicle 500 travels to a new position, the obstacle 501 has traveled to another position, and the obstacle 502 is located at an initial position; if a distance between the position of the vehicle 500 and the initial position of the obstacle 502 is less than s2, the obstacle 502 is triggered to start to move in the simulation scenario.
2. Lane line information-based path planning
In some embodiments, if the scenario data includes the lane line information, the computer device determines lane line information of a road section in which the simulation travel position is located based on the simulation travel position of the vehicle; and performs path planning on the vehicle based on the lane line information of the road section.
In some embodiments, when querying lane line information of a current road section, the computer device searches for a calibration position having the shortest distance to the simulation travel position. Each calibration position is stored together with corresponding information about a lane line within a target range of that calibration position. The lane line information stored corresponding to the found calibration position is then determined as the lane line information of the road section in which the simulation travel position is located.
Optionally, when determining the lane line information according to the simulation travel position, the computer device adopts a K-dimensional tree (KD tree) search method, where a KD tree is a data search structure that may quickly search for a nearest neighbor and an approximate nearest neighbor in a high-dimensional space. In an example, the computer device uses position information of the vehicle in each road section in the road test data as an index (that is, the calibration position), stores lane line information of each road section as a data member into the KD tree, extracts from the KD tree the index nearest to the simulation travel position during simulation, uses the data member corresponding to the nearest index as the lane line information of the current road section of the vehicle, and sends the lane line information to a path planning unit to perform path planning based on the self-driving algorithm. Such a KD tree-based search method may greatly improve the search efficiency of the lane line information.
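In practice a KD tree (for example, scipy.spatial.cKDTree) would back this search; the linear scan below illustrates the same index-to-data-member lookup using only the standard library, with hypothetical calibration data.

```python
import math

def nearest_lane_line(sim_position, calibration_index):
    """Find the calibration position nearest to the simulation travel
    position and return the lane line information stored under it."""
    nearest = min(calibration_index, key=lambda pos: math.dist(pos, sim_position))
    return calibration_index[nearest]

# Calibration positions (indexes) mapped to lane line data members.
calibration_index = {
    (0.0, 0.0): "lane lines within R1 range",
    (100.0, 0.0): "lane lines within R2 range",
}
info = nearest_lane_line((3.0, 1.0), calibration_index)
```

A KD tree replaces the O(n) scan here with an O(log n) query, which is where the claimed search-efficiency gain comes from.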
In an exemplary scenario, FIG. 6 is a schematic principle diagram of path planning according to an embodiment of this application. As shown in 600, in the rosbag packet, when the vehicle travels to a position (x0, y0), the on-board sensor senses information about a lane line within an R1 range; and when the vehicle travels to a position (x1, y1), the on-board sensor senses information about a lane line within an R2 range. Therefore, during simulation, if an index of the nearest neighbor obtained through search based on the simulation travel position is (x0, y0), a data member (that is, the information about the lane line) within the R1 range is outputted from the KD tree; and if the index of the nearest neighbor obtained through search is (x1, y1), a data member (that is, the information about the lane line) within the R2 range is outputted from the KD tree.
3. Traffic light information-based path planning
In some embodiments, if the scenario data includes the traffic light information, the computer device determines information about a traffic light having a shortest distance to the simulation travel position and performs path planning on the vehicle based on the information about the traffic light.
In some embodiments, when querying information about a nearest traffic light, the computer device searches for a calibration position having the shortest distance to the simulation travel position, and obtains information about a traffic light within a target range of the calibration position; for example, corresponding traffic light information may be stored for each calibration position. The computer device treats the traffic light closest to the calibration position as the traffic light having the shortest distance to the simulation travel position; that is, the computer device determines information about the traffic light closest to the calibration position as the information about the traffic light having the shortest distance to the simulation travel position.
A method for searching for the traffic light information based on the KD tree is similar to the method for searching for the lane line information based on the KD tree, so that details are not described herein again. Such a KD tree-based search method may greatly improve the search efficiency of the traffic light information.
In steps 304 and 305, a possible implementation in which the computer device invokes a path planning model to perform path planning on the vehicle to output simulated travel information of the vehicle is provided. After planning a path of the vehicle, the computer device simulates the motion of the vehicle in the simulation scenario, to debug parameters of the path planning model based on motion feedback.
306: The computer device performs parameter adjustment on the path planning model based on the simulated travel information.
In some embodiments, the path planning model outputs a speed curve and a travel trajectory of the vehicle, to control the vehicle to perform simulation motion in the simulation scenario along the travel trajectory according to the speed curve. Optionally, the computer device inputs the road surface information, the speed curve, and the travel trajectory into a vehicle simulation unit, and the (x, y, z) coordinates in the road surface information form a road surface. The computer device may calculate a displacement per frame of the vehicle along the travel trajectory according to the speed curve based on a kinematic model, and reposition the vehicle on the road surface by the displacement per frame, to simulate the motion of the vehicle. Optionally, when the kinematic model performs kinematic calculation, a current simulation travel position of the vehicle, the speed curve, the travel trajectory, and a slope of the road surface may be taken into consideration.
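The per-frame displacement calculation may be sketched as follows, assuming a piecewise-constant speed curve sampled at the frame interval and ignoring the slope of the road surface (which the embodiment notes may also be considered).

```python
def displacement_per_frame(speed_curve, frame_dt):
    """Convert a speed curve (a list of (time, speed) samples) into the
    displacement covered in each simulation frame, d = v * dt."""
    return [speed * frame_dt for _, speed in speed_curve]

# Speed samples (t in s, v in m/s) at a 0.1 s frame interval.
speed_curve = [(0.0, 10.0), (0.1, 12.0), (0.2, 8.0)]
displacements = displacement_per_frame(speed_curve, frame_dt=0.1)
```

Each displacement is then applied along the travel trajectory and repositioned on the road surface to advance the simulated vehicle one frame.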
FIG. 7 is an effect diagram of comparison between an artificial scenario and a simulation scenario according to an embodiment of this application. As shown in 701 and 702, 701 corresponds to an artificial scenario, and 702 corresponds to a simulation scenario. Each curve in the first row represents how the distance between the vehicle and a front car changes over time, and each curve in the second row represents the speed change of the front car as sensed by the vehicle. As can be seen, in the simulation scenario, interference from noise is clearly reproduced in the distance curves and the speed curves, and formation causes of the noise include, but are not limited to: a driving manner of a driver of the front car, influence of a traffic flow status, a sensing error of the self-driving algorithm, sensing noise, and the like. Therefore, the simulation scenario has a better reproduction degree and simulation degree.
FIG. 8 is a principle flowchart of a data processing method according to an embodiment of this application. As shown in 800, in step 1, a computer device acquires a rosbag packet of road test data; in step 2, the computer device generates scenario data based on the rosbag packet; in step 3, the computer device inputs the scenario data into a path planning model to perform simulation; in step 4, the path planning model generates a simulation report; in step 5, the computer device performs quality evaluation on the path planning model based on the simulation report; and in step 6, the computer device performs parameter adjustment on the path planning model (that is, a self-driving algorithm) , to cause an adjusted path planning model to have more accurate path planning performance.
In some embodiments, when quality evaluation is performed, at least one of the following conditions is adopted: a distance between the simulation travel position of the vehicle and an actual travel position thereof is not greater than n (n ≥ 0) meters; a depth percentage of a brake pedal is not greater than m% (m ≥ 0); and a distance between the simulation travel position of the vehicle and a center position of a lane is not greater than k (k ≥ 0) meters, where m, n, and k are set by a technician, so as to measure the performance of the path planning model quantitatively.
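The quantitative evaluation may be sketched as follows; the threshold values, the report field names, and the reading of all three conditions as upper bounds are illustrative assumptions, since m, n, and k are set by a technician.

```python
def evaluate_quality(report, n=1.0, m=30.0, k=0.5):
    """Check the quantitative conditions: deviation from the actual travel
    position, brake pedal depth percentage, and deviation from the lane
    center, against technician-set thresholds n, m%, and k (assumed to be
    upper bounds here)."""
    return (report["position_deviation"] <= n
            and report["brake_pedal_depth_pct"] <= m
            and report["lane_center_deviation"] <= k)

report = {"position_deviation": 0.4,
          "brake_pedal_depth_pct": 12.0,
          "lane_center_deviation": 0.2}
passed = evaluate_quality(report)
```

A failed check would then feed into the parameter adjustment of step 6 in FIG. 8.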
Any combination of all the foregoing optional technical solutions may be used to form an optional embodiment of the present disclosure, and details are not described herein again.
According to the method provided in this embodiment of this application, scenario data used for self-driving simulation is generated according to road test data acquired by a vehicle in an actual travel process, the scenario data may truly reproduce actual road conditions and noise interference, to cause a path planning model to have a more realistic simulation effect during debugging, which helps to obtain a path planning model with higher accuracy through debugging, that is, to obtain a self-driving algorithm with higher accuracy, thereby improving the intelligence of a self-driving vehicle.
FIG. 9 is a schematic structural diagram of a data processing apparatus according to an embodiment of this application. Referring to FIG. 9, the apparatus includes:
a determining module 901, configured to determine at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process, the scenario data being used for simulating a path planning model;
a path planning module 902, configured to input the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and perform path planning on the vehicle, to output simulated travel information of the vehicle; and
an adjustment module 903, configured to perform parameter adjustment on the path planning model based on the simulated travel information.
According to the apparatus provided in this embodiment of this application, scenario data used for self-driving simulation is generated according to road test data acquired by a vehicle in an actual travel process, the scenario data may truly reproduce actual road conditions and noise interference, to cause a path planning model to have a more realistic simulation effect during debugging, which helps to obtain a path planning model with higher accuracy through debugging, that is, to obtain a self-driving algorithm with higher accuracy, thereby improving the intelligence of a self-driving vehicle.
In a possible implementation, based on the apparatus composition of FIG. 9, the path planning module 902 includes:
a determining unit, configured to determine a simulation travel position of the vehicle based on the at least one type of scenario data; and
a path planning unit, configured to perform path planning on the vehicle based on the simulation travel position to obtain the simulated travel information of the vehicle.
In a possible implementation, the scenario data includes at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
In a possible implementation, if the scenario data includes the initial positions of the obstacles, the path planning unit is configured to:
obtain, in response to determining that a distance between the simulation travel position of the vehicle and an initial position of an obstacle is less than a distance threshold, motion information of the obstacle; and
perform path planning on the vehicle based on the motion information of the obstacle.
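A minimal sketch of the distance-threshold rule above: motion information for an obstacle is looked up only once the vehicle's simulation travel position comes within a threshold of the obstacle's initial position. The data layout (dicts with `initial_position` and `motion_info` keys) and the 50 m default threshold are assumptions for illustration.

```python
import math

def obstacles_within_threshold(sim_position, obstacles, threshold=50.0):
    """Return motion information of obstacles whose initial position lies
    within `threshold` of the vehicle's simulation travel position."""
    triggered = []
    for obstacle in obstacles:
        dx = obstacle["initial_position"][0] - sim_position[0]
        dy = obstacle["initial_position"][1] - sim_position[1]
        if math.hypot(dx, dy) < threshold:  # distance less than the threshold
            triggered.append(obstacle["motion_info"])
    return triggered
```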
In a possible implementation, if the scenario data includes the lane line information, the path planning unit is configured to:
determine lane line information of a road section in which the simulation travel position is located based on the simulation travel position of the vehicle; and
perform path planning on the vehicle based on the lane line information of the road section.
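The lane-line lookup above amounts to finding the road section that contains the simulation travel position. A sketch, assuming the route is a one-dimensional chain of sections each carrying its lane line information (the tuple layout is illustrative):

```python
def lane_lines_at(sim_position, road_sections):
    """road_sections: list of (start, end, lane_line_info) tuples covering
    the route; return the lane line information of the section in which
    the simulation travel position is located, or None if uncovered."""
    for start, end, lane_line_info in road_sections:
        if start <= sim_position < end:
            return lane_line_info
    return None
```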
In a possible implementation, if the scenario data includes the traffic light information, based on the apparatus composition of FIG. 9, the path planning unit includes:
a determining subunit, configured to determine information about a traffic light having a shortest distance to the simulation travel position based on the simulation travel position of the vehicle; and
a planning subunit, configured to perform path planning on the vehicle based on the information about the traffic light.
In a possible implementation, the determining subunit is configured to:
search for a calibration position which has a shortest distance to the simulation travel position; determine information of a traffic light closest to the calibration position; and
treat the traffic light closest to the calibration position as the traffic light having the shortest distance to the simulation travel position.
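The two-step lookup above (nearest calibration position first, then the light bound to it) can be sketched as a small nearest-neighbour search; the mapping from calibration positions to traffic light information is an assumed data structure, not one defined in this application.

```python
import math

def nearest_traffic_light(sim_position, calibration_to_light):
    """calibration_to_light maps each calibration position (x, y) to the
    information of the traffic light closest to that position. Find the
    calibration position with the shortest distance to the simulation
    travel position and return its associated traffic light information."""
    nearest = min(
        calibration_to_light,
        key=lambda p: math.hypot(p[0] - sim_position[0], p[1] - sim_position[1]),
    )
    return calibration_to_light[nearest]
```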
In a possible implementation, based on the apparatus composition of FIG. 9, the determining module 901 includes:
a parsing unit, configured to parse the road test data to obtain at least one type of initial scenario data, one type of initial scenario data including at least one timestamp; and
an obtaining and determining unit, configured to obtain at least one time sequence corresponding to the at least one type of initial scenario data, and determine the at least one time sequence as the at least one type of scenario data.
In a possible implementation, the obtaining and determining unit is configured to:
for any type of initial scenario data, sort at least one element in the initial scenario data in ascending order of timestamps, to obtain a time sequence.
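The time-sequence construction above is a sort by timestamp in ascending order. A sketch, assuming each element of the initial scenario data is a dict carrying a `timestamp` key:

```python
def to_time_sequence(initial_scenario_data):
    """Sort the elements of one type of initial scenario data in ascending
    order of timestamps, yielding the time sequence used as scenario data."""
    return sorted(initial_scenario_data, key=lambda element: element["timestamp"])
```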
Any combination of all the foregoing optional technical solutions may be used to form an optional embodiment of the present disclosure, and details are not described herein again.
It should be noted that, when the data processing apparatus provided in the foregoing embodiment processes data, the division into the foregoing functional modules is merely used as an example for description. During practical application, the functions may be allocated to different functional modules according to requirements; that is, an internal structure of the computer device is divided into different functional modules to implement all or some of the functions described above. In addition, the data processing apparatus provided in the foregoing embodiment and the embodiments of the data processing method belong to the same concept. For a specific implementation process, reference may be made to the embodiments of the data processing method. Details are not described herein again.
FIG. 10 is a structural block diagram of an on-board terminal 1000 according to an exemplary embodiment of this application. Optionally, a device type of the on-board terminal 1000 may include: a smartphone, a tablet computer, a Moving Picture Experts Group Audio Layer III (MP3) player, a Moving Picture Experts Group Audio Layer IV (MP4) player, a notebook computer, or a desktop computer. The on-board terminal 1000 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or another name.
Generally, the on-board terminal 1000 includes a processor 1001 and a memory 1002.
Optionally, the processor 1001 includes one or more processing cores, for example, a 4-core processor or an 8-core processor. Optionally, the processor 1001 may be implemented in at least one hardware form of a digital signal processor (DSP), a field-programmable gate array (FPGA), and a programmable logic array (PLA). In some embodiments, the processor 1001 includes a main processor and a coprocessor. The main processor is a processor configured to process data in an awake state, and is also referred to as a central processing unit (CPU). The coprocessor is a low-power-consumption processor configured to process data in a standby state. In some embodiments, the processor 1001 may be integrated with a graphics processing unit (GPU). The GPU is configured to render and draw content that needs to be displayed on a display screen. In some embodiments, the processor 1001 further includes an artificial intelligence (AI) processor. The AI processor is configured to process computing operations related to machine learning.
In some embodiments, the memory 1002 includes one or more computer-readable storage media. Optionally, the computer-readable storage medium is non-transitory. Optionally, the memory 1002 further includes a high-speed random access memory and a non-volatile memory, such as one or more magnetic disk storage devices or flash storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1002 is configured to store machine readable instructions, and the machine readable instructions are configured to be executed by the processor 1001 to implement the data processing method provided in the embodiments of this application.
In some embodiments, the on-board terminal 1000 may optionally include a peripheral device interface 1003 and at least one peripheral device. The processor 1001, the memory 1002, and the peripheral device interface 1003 may be connected through a bus or a signal cable. Each peripheral device may be connected to the peripheral device interface 1003 through a bus, a signal cable, or a circuit board. Specifically, the peripheral device includes: at least one of a radio frequency (RF) circuit 1004, a touch display screen 1005, a camera component 1006, an audio circuit 1007, a positioning component 1008, and a power supply 1009.
The peripheral device interface 1003 may be configured to connect at least one input/output (I/O)-related peripheral device to the processor 1001 and the memory 1002. In some embodiments, the processor 1001, the memory 1002, and the peripheral device interface 1003 are integrated on the same chip or circuit board. In some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral device interface 1003 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The RF circuit 1004 is configured to receive and transmit an RF signal, which is also referred to as an electromagnetic signal. The RF circuit 1004 communicates with a communication network and other communication devices through the electromagnetic signal. The RF circuit 1004 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the RF circuit 1004 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like. Optionally, the RF circuit 1004 may communicate with other terminals through at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: a metropolitan area network, generations of mobile communication networks (2G, 3G, 4G, and 5G), a wireless local area network, and/or a wireless fidelity (Wi-Fi) network. In some embodiments, the RF circuit 1004 may further include a circuit related to near field communication (NFC), which is not limited in this application.
The display screen 1005 is configured to display a user interface (UI). Optionally, the UI may include a graph, a text, an icon, a video, and any combination thereof. When the display screen 1005 is a touch display screen, the display screen 1005 is further capable of acquiring touch signals on or above a surface of the display screen 1005. The touch signal may be inputted to the processor 1001 for processing as a control signal. Optionally, the display screen 1005 is further configured to provide a virtual button and/or a virtual keyboard, which is also referred to as a soft button and/or a soft keyboard. In some embodiments, there is one display screen 1005 disposed on a front panel of the on-board terminal 1000. In some other embodiments, there are at least two display screens 1005 respectively disposed on different surfaces of the on-board terminal 1000 or designed in a foldable shape. In still some other embodiments, the display screen 1005 is a flexible display screen disposed on a curved surface or a folded surface of the on-board terminal 1000. Optionally, the display screen 1005 is even set in a non-rectangular irregular pattern, namely, a special-shaped screen. Optionally, the display screen 1005 is manufactured by using a material such as a liquid crystal display (LCD) or an organic light-emitting diode (OLED).
The camera component 1006 is configured to acquire an image or a video. Optionally, the camera component 1006 includes a front-facing camera and a rear-facing camera. Generally, the front-facing camera is disposed on the front panel of the terminal, and the rear-facing camera is disposed on a back surface of the terminal. In some embodiments, there are at least two rear-facing cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, to achieve a background blurring function through fusion of the main camera and the depth-of-field camera, panoramic photographing and virtual reality (VR) photographing through fusion of the main camera and the wide-angle camera, or other fusion photographing functions. In some embodiments, the camera component 1006 further includes a flash. Optionally, the flash is a single color temperature flash or a double color temperature flash. The double color temperature flash is a combination of a warm light flash and a cold light flash, and is used for light compensation under different color temperatures.
In some embodiments, the audio circuit 1007 includes a microphone and a speaker. The microphone is configured to acquire sound waves of a user and an environment, and convert the sound waves into an electrical signal to input to the processor 1001 for processing, or input to the RF circuit 1004 for implementing voice communication. For the purpose of stereo acquisition or noise reduction, there are a plurality of microphones, disposed at different parts of the on-board terminal 1000 respectively. Optionally, the microphone is an array microphone or an omni-directional acquisition microphone. The speaker is configured to convert electrical signals from the processor 1001 or the RF circuit 1004 into sound waves. Optionally, the speaker is a conventional thin-film speaker or a piezoelectric ceramic speaker. When the speaker is the piezoelectric ceramic speaker, electrical signals not only can be converted into sound waves that can be heard by humans, but also can be converted into sound waves that cannot be heard by humans for ranging or other uses. In some embodiments, the audio circuit 1007 further includes an earphone jack.
The positioning component 1008 is configured to determine a current geographic location of the on-board terminal 1000, to implement navigation or a location-based service (LBS). Optionally, the positioning component 1008 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou System of China, the GLONASS System of Russia, or the GALILEO System of the European Union.
The power supply 1009 is configured to supply power to components in the on-board terminal 1000. Optionally, the power supply 1009 uses an alternating current, a direct current, a disposable battery, or a rechargeable battery. When the power supply 1009 includes a rechargeable battery, the rechargeable battery supports wired charging or wireless charging. The rechargeable battery is further configured to support a fast charge technology.
In some embodiments, the on-board terminal 1000 further includes one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: an acceleration sensor 1011, a gyroscope sensor 1012, a pressure sensor 1013, a fingerprint sensor 1014, an optical sensor 1015, and a proximity sensor 1016.
In some embodiments, the acceleration sensor 1011 detects accelerations on three coordinate axes of a coordinate system established by the on-board terminal 1000. For example, the acceleration sensor 1011 is configured to detect components of the gravity acceleration on the three coordinate axes. Optionally, the processor 1001 may control, according to a gravity acceleration signal acquired by the acceleration sensor 1011, the touch display screen 1005 to display the UI in a landscape view or a portrait view. The acceleration sensor 1011 is further configured to acquire motion data of a game or a user.
In some embodiments, the gyroscope sensor 1012 detects a body direction and a rotation angle of the on-board terminal 1000. The gyroscope sensor 1012 acquires a 3D action of the user on the on-board terminal 1000 together with the acceleration sensor 1011. The processor 1001 implements the following functions according to the data acquired by the gyroscope sensor 1012: motion sensing (for example, changing the UI according to a tilt operation of the user) , image stabilization during shooting, game control, and inertial navigation.
Optionally, the pressure sensor 1013 is disposed on a side frame of the on-board terminal 1000 and/or a lower layer of the touch display screen 1005. When the pressure sensor 1013 is disposed on the side frame of the on-board terminal 1000, a holding signal of the user to the on-board terminal 1000 may be detected, and left/right hand identification or a quick action may be performed by the processor 1001 according to the holding signal acquired by the pressure sensor 1013. When the pressure sensor 1013 is disposed at the lower layer of the touch display screen 1005, the processor 1001 controls an operable control on the UI according to a pressure operation performed by the user on the touch display screen 1005. The operable control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1014 is configured to acquire a user's fingerprint, and the processor 1001 identifies a user's identity according to the fingerprint acquired by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies a user's identity according to the acquired fingerprint. When identifying that the user's identity is a trusted identity, the processor 1001 authorizes the user to perform related sensitive operations. The sensitive operations include: unlocking a screen, viewing encrypted information, downloading software, paying, changing a setting, and the like. Optionally, the fingerprint sensor 1014 is disposed on a front face, a back face, or a side face of the on-board terminal 1000. When a physical button or a vendor logo is disposed on the on-board terminal 1000, the fingerprint sensor 1014 may be integrated together with the physical button or the vendor logo.
The optical sensor 1015 is configured to acquire ambient light intensity. In an embodiment, the processor 1001 controls display brightness of the touch display screen 1005 according to the ambient light intensity acquired by the optical sensor 1015. Specifically, when the ambient light intensity is relatively high, the display brightness of the touch display screen 1005 is increased; and when the ambient light intensity is relatively low, the display brightness of the touch display screen 1005 is reduced. In another embodiment, the processor 1001 further dynamically adjusts a camera parameter of the camera component 1006 according to the ambient light intensity acquired by the optical sensor 1015.
The proximity sensor 1016 is also referred to as a distance sensor and is generally disposed at the front panel of the on-board terminal 1000. The proximity sensor 1016 is configured to acquire a distance between the user and the front face of the on-board terminal 1000. In an embodiment, when the proximity sensor 1016 detects that the distance between the user and the front face of the on-board terminal 1000 is gradually reduced, the processor 1001 controls the touch display screen 1005 to be switched from a screen-on state to a screen-off state; and when the proximity sensor 1016 detects that the distance between the user and the front face of the on-board terminal 1000 is gradually increased, the processor 1001 controls the touch display screen 1005 to be switched from the screen-off state to the screen-on state.
A person skilled in the art may understand that the structure shown in FIG. 10 does not constitute any limitation on the on-board terminal 1000 and that the on-board terminal may include more or fewer components than those shown in the figure, a combination of some components, or different component arrangements.
FIG. 11 is a schematic structural diagram of a computer device according to an embodiment of this application. The computer device 1100 may vary greatly due to different configurations or performance, and the computer device 1100 includes one or more central processing units (CPUs) 1101 and one or more memories 1102. The memory 1102 stores machine readable instructions, and the machine readable instructions are loaded and executed by the processor 1101 to implement the data processing method provided in the foregoing embodiments. Optionally, the computer device 1100 further includes components such as a wired or wireless network interface, a keyboard, and an input/output (I/O) interface, and the computer device 1100 further includes other components for implementing functions of the device. Details are not described herein again.
In an exemplary embodiment, a non-transitory computer-readable storage medium, for example, a memory including machine readable instructions is further provided. The machine readable instructions may be executed by a processor in a terminal to implement the data processing method in the foregoing embodiments. For example, the computer-readable storage medium includes a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product or a computer program is further provided, including one or more pieces of program code, the one or more pieces of program code being stored in a non-transitory computer-readable storage medium. One or more processors of a computer device can read the one or more pieces of program code from the non-transitory computer-readable storage medium, and the one or more processors execute the one or more pieces of program code to enable the computer device to perform the data processing method in the foregoing embodiments.
A person of ordinary skill in the art may understand that all or some of the steps of the foregoing embodiments may be implemented by hardware or may be implemented by a program instructing relevant hardware. Optionally, the program is stored in a non-transitory computer-readable storage medium. Optionally, the non-transitory storage medium mentioned above is a ROM, a magnetic disk, an optical disc, or the like.
The foregoing descriptions are merely optional embodiments of this application, but are not intended to limit this application. Any modification, equivalent replacement, or improvement made within the spirit and principle of this application shall fall within the protection scope of this application.
Claims (21)
- A data processing method, comprising: determining at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process; inputting the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and performing path planning on the vehicle, to output simulated travel information of the vehicle; and performing parameter adjustment on the path planning model based on the simulated travel information.
- The method according to claim 1, wherein the performing path planning on the vehicle, to output simulated travel information of the vehicle comprises: determining a simulation travel position of the vehicle based on the at least one type of scenario data; and performing path planning on the vehicle based on the simulation travel position to obtain the simulated travel information of the vehicle.
- The method according to claim 2, wherein the scenario data comprises at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
- The method according to claim 3, wherein the scenario data comprises the initial positions of the obstacles, and the performing path planning on the vehicle based on the simulation travel position comprises: obtaining, in response to determining that a distance between the simulation travel position of the vehicle and an initial position of an obstacle is less than a distance threshold, motion information of said obstacle; and performing path planning on the vehicle based on the motion information of said obstacle.
- The method according to claim 3, wherein the scenario data comprises the lane line information, and the performing path planning on the vehicle based on the simulation travel position comprises: determining, based on the simulation travel position of the vehicle, lane line information of a road section in which the simulation travel position is located; and performing path planning on the vehicle based on the lane line information of the road section.
- The method according to claim 3, wherein the scenario data comprises traffic light information, and the performing path planning on the vehicle based on the simulation travel position comprises: determining, based on the simulation travel position of the vehicle, information about a traffic light having a shortest distance to the simulation travel position; and performing path planning on the vehicle based on the information about the traffic light.
- The method according to claim 6, wherein determining information about a traffic light having a shortest distance to the simulation travel position comprises: searching for a calibration position which has a shortest distance to the simulation travel position; determining information of a traffic light closest to the calibration position; and treating the traffic light closest to the calibration position as the traffic light having the shortest distance to the simulation travel position.
- The method according to claim 1, wherein the determining at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process comprises: parsing the road test data to obtain at least one type of initial scenario data, one type of initial scenario data comprising at least one timestamp; and obtaining at least one time sequence corresponding to the at least one type of initial scenario data, and determining the at least one time sequence as the at least one type of scenario data.
- The method according to claim 8, wherein the obtaining at least one time sequence corresponding to the at least one type of initial scenario data comprises: for any type of initial scenario data, sorting at least one element in the initial scenario data in ascending order of timestamps, to obtain a time sequence.
- A data processing apparatus, comprising: a determining module, configured to determine at least one type of scenario data based on road test data previously acquired by a vehicle in a travel process; a path planning module, configured to input the at least one type of scenario data into a path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and perform path planning on the vehicle, to output simulated travel information of the vehicle; and an adjustment module, configured to perform parameter adjustment on the path planning model based on the simulated travel information.
- The apparatus according to claim 10, wherein the path planning module comprises: a determining unit, configured to determine a simulation travel position of the vehicle based on the at least one type of scenario data; and a path planning unit, configured to perform path planning on the vehicle based on the simulation travel position to obtain the simulated travel information of the vehicle.
- The apparatus according to claim 11, wherein the scenario data comprises at least one of traffic light information, lane line information, initial positions of obstacles, motion information of the obstacles, road surface information, and travel information of the vehicle.
- The apparatus according to claim 12, wherein the scenario data comprises the initial positions of the obstacles, and the path planning unit is configured to: obtain, in response to determining that a distance between the simulation travel position of the vehicle and an initial position of an obstacle is less than a distance threshold, motion information of said obstacle; and perform path planning on the vehicle based on the motion information of said obstacle.
- The apparatus according to claim 12, wherein the scenario data comprises the lane line information, and the path planning unit is configured to: determine, based on the simulation travel position of the vehicle, lane line information of a road section in which the simulation travel position is located; and perform path planning on the vehicle based on the lane line information of the road section.
- The apparatus according to claim 12, wherein the scenario data comprises the traffic light information, and the path planning unit comprises: a determining subunit, configured to determine, based on the simulation travel position of the vehicle, information about a traffic light having a shortest distance to the simulation travel position; and a planning subunit, configured to perform path planning on the vehicle based on the information about the traffic light.
- The apparatus according to claim 15, wherein the determining subunit is configured to: search for a calibration position which has a shortest distance to the simulation travel position; determine information of a traffic light closest to the calibration position; and treat the traffic light closest to the calibration position as the traffic light having the shortest distance to the simulation travel position.
- The apparatus according to claim 10, wherein the determining module comprises: a parsing unit, configured to parse the road test data to obtain at least one type of initial scenario data, one type of initial scenario data comprising at least one timestamp; and an obtaining and determining unit, configured to obtain at least one time sequence corresponding to the at least one type of initial scenario data, and determine the at least one time sequence as the at least one type of scenario data.
- The apparatus according to claim 17, wherein the obtaining and determining unit is configured to: for any type of initial scenario data, sort at least one element in the initial scenario data in ascending order of timestamps, to obtain a time sequence.
- A data processing system, comprising a vehicle and a computer device, wherein the vehicle is configured to acquire road test data in a travel process and send the road test data to the computer device; and the computer device is configured to determine at least one type of scenario data based on the road test data previously acquired by the vehicle in the travel process, the scenario data being used for simulating a path planning model; input the at least one type of scenario data into the path planning model in order to simulate operation of the path planning model during a travel process of the vehicle, and perform path planning on the vehicle, to output simulated travel information of the vehicle; and perform parameter adjustment on the path planning model based on the simulated travel information.
- A computer device, comprising one or more processors and a non-transitory computer readable storage medium storing machine readable instructions which are executable by the one or more processors to implement the data processing method according to any one of claims 1 to 9.
- A non-transitory storage medium storing machine readable instructions which are executable by a processor to implement the data processing method according to any one of claims 1 to 9.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011288734.6A CN112307642B (en) | 2020-11-17 | 2020-11-17 | Data processing method, device, system, computer equipment and storage medium |
CN202011288734.6 | 2020-11-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022105395A1 true WO2022105395A1 (en) | 2022-05-27 |
Family
ID=74336137
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/118215 WO2022105395A1 (en) | 2020-11-17 | 2021-09-14 | Data processing method, apparatus, and system, computer device, and non-transitory storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112307642B (en) |
WO (1) | WO2022105395A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116298088A (en) * | 2022-12-29 | 2023-06-23 | 华世德电子科技(昆山)有限公司 | Test method and system for nitrogen-oxygen sensor for vehicle |
CN118025235A (en) * | 2024-04-12 | 2024-05-14 | 智道网联科技(北京)有限公司 | Automatic driving scene understanding method, device and system and electronic equipment |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112307642B (en) * | 2020-11-17 | 2022-09-16 | 苏州智加科技有限公司 | Data processing method, device, system, computer equipment and storage medium |
CN113065240B (en) * | 2021-03-19 | 2023-04-07 | 成都安智杰科技有限公司 | Self-adaptive cruise simulation method and device, electronic equipment and storage medium |
CN113093627A (en) * | 2021-04-13 | 2021-07-09 | 上海车右智能科技有限公司 | A motion carrier system for autopilot test |
CN113343425B (en) * | 2021-05-08 | 2022-09-30 | 北京三快在线科技有限公司 | Simulation test method and device |
CN113343457B (en) * | 2021-05-31 | 2023-05-30 | 苏州智加科技有限公司 | Automatic driving simulation test method, device, equipment and storage medium |
KR102652486B1 (en) * | 2021-09-24 | 2024-03-29 | (주)오토노머스에이투지 | Method for predicting traffic light information by using lidar and server using the same |
CN115148028B (en) * | 2022-06-30 | 2023-12-15 | 北京小马智行科技有限公司 | Method and device for constructing vehicle drive test scene according to historical data and vehicle |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108549366A (en) * | 2018-05-04 | 2018-09-18 | 同济大学 | Intelligent automobile road driving mapping experiment method parallel with virtual test |
US20200209874A1 (en) * | 2018-12-31 | 2020-07-02 | Chongqing Jinkang New Energy Vehicle, Ltd. | Combined virtual and real environment for autonomous vehicle planning and control testing |
CN111505965A (en) * | 2020-06-17 | 2020-08-07 | 深圳裹动智驾科技有限公司 | Method and device for simulation test of automatic driving vehicle, computer equipment and storage medium |
CN111611711A (en) * | 2020-05-21 | 2020-09-01 | 北京百度网讯科技有限公司 | Automatic driving data processing method and device and electronic equipment |
CN112307642A (en) * | 2020-11-17 | 2021-02-02 | 苏州智加科技有限公司 | Data processing method, device, system, computer equipment and storage medium |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109597317B (en) * | 2018-12-26 | 2022-03-18 | 广州小鹏汽车科技有限公司 | Self-learning-based vehicle automatic driving method and system and electronic equipment |
CN111142539B (en) * | 2020-01-13 | 2020-10-27 | 中智行科技有限公司 | Unmanned vehicle control method and device and unmanned vehicle |
- 2020
  - 2020-11-17 CN CN202011288734.6A patent/CN112307642B/en active Active
- 2021
  - 2021-09-14 WO PCT/CN2021/118215 patent/WO2022105395A1/en active Application Filing
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116298088A (en) * | 2022-12-29 | 2023-06-23 | 华世德电子科技(昆山)有限公司 | Test method and system for nitrogen-oxygen sensor for vehicle |
CN118025235A (en) * | 2024-04-12 | 2024-05-14 | 智道网联科技(北京)有限公司 | Automatic driving scene understanding method, device and system and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN112307642A (en) | 2021-02-02 |
CN112307642B (en) | 2022-09-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022105395A1 (en) | Data processing method, apparatus, and system, computer device, and non-transitory storage medium | |
WO2021128777A1 (en) | Method, apparatus, device, and storage medium for detecting travelable region | |
CN110967011B (en) | Positioning method, device, equipment and storage medium | |
US10031526B1 (en) | Vision-based driving scenario generator for autonomous driving simulation | |
CN110795523B (en) | Vehicle positioning method and device and intelligent vehicle | |
EP3620959B1 (en) | Image data acquisition logic of an autonomous driving vehicle for capturing image data using cameras | |
CN110386142A (en) | Pitch angle calibration method for automatic driving vehicle | |
CN111192341A (en) | Method and device for generating high-precision map, automatic driving equipment and storage medium | |
CN113343457B (en) | Automatic driving simulation test method, device, equipment and storage medium | |
KR20200143242A (en) | Detecting adversarial samples by a vision based perception system | |
US11971481B2 (en) | Point cloud registration for lidar labeling | |
EP3832605A1 (en) | Method and device for determining potentially visible set, apparatus, and storage medium | |
US20220176977A1 (en) | Vehicle control | |
CN109213144A (en) | Man-machine interface (HMI) framework | |
CN113807470B (en) | Vehicle driving state determination method and related device | |
CN108399778A (en) | Swarm intelligence congestion reminding method, system and computer readable storage medium | |
CN113205515B (en) | Target detection method, device and computer storage medium | |
WO2022142890A1 (en) | Data processing method and related apparatus | |
CN112269939B (en) | Automatic driving scene searching method, device, terminal, server and medium | |
CN110550045B (en) | Speed planning and tracking method, device and storage medium | |
US11908095B2 (en) | 2-D image reconstruction in a 3-D simulation | |
US20210348938A1 (en) | Sensor calibration for space translation | |
CN111664860B (en) | Positioning method and device, intelligent equipment and storage medium | |
CN114970112A (en) | Method and device for automatic driving simulation, electronic equipment and storage medium | |
CN114623836A (en) | Vehicle pose determining method and device and vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21893546; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21893546; Country of ref document: EP; Kind code of ref document: A1 |