
WO2021012342A1 - Systems and methods for traffic prediction - Google Patents


Info

Publication number
WO2021012342A1
Authority: WO (WIPO, PCT)
Prior art keywords: traffic, historical, prediction model, inputs, preliminary
Application number: PCT/CN2019/101786
Other languages: French (fr)
Inventors: Hui QIU, Haibo Li
Original Assignee: Beijing Didi Infinity Technology And Development Co., Ltd.
Application filed by Beijing Didi Infinity Technology And Development Co., Ltd.
Publication of WO2021012342A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0112 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/26 Government or public services
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 Traffic data processing
    • G08G1/0129 Traffic data processing for creating historical data or processing based on historical data
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 Traffic data processing
    • G08G1/0133 Traffic data processing for classifying traffic situation

Definitions

  • the present disclosure generally relates to systems and methods for traffic prediction, and in particular, to systems and methods for predicting a plurality of future parameters associated with traffic condition.
  • a system providing traffic services (e.g., online to offline transportation services, navigation services, map services) may predict future traffic information based on current traffic information and/or historical traffic information according to a linear model or a tree model.
  • a first aspect of the present disclosure relates to a system for predicting a traffic parameter.
  • the system may include at least one storage medium including a set of instructions and at least one processor in communication with the at least one storage medium.
  • when executing the set of instructions, the at least one processor may be directed to obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points; obtain a first trained prediction model; determine a target vector based on the plurality of first inputs by using the first trained prediction model; obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points; obtain a second trained prediction model; and predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
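The operations above reduce to a two-stage pipeline: encode the first inputs into a target vector, then decode the second inputs against that vector. A minimal sketch, in which `first_model` and `second_model` are hypothetical callables standing in for the two trained prediction models:

```python
def predict_future_parameters(first_inputs, second_inputs, first_model, second_model):
    """Two-stage prediction: encode candidate-time inputs, then decode future-time inputs."""
    # Stage 1: determine the target vector from the first inputs
    # (candidate time points) using the first trained prediction model.
    target_vector = first_model(first_inputs)
    # Stage 2: predict one future parameter per future time point from the
    # target vector and the second inputs using the second trained model.
    return second_model(target_vector, second_inputs)
```

Any concrete models with these call shapes can be plugged in; the sketch only fixes the data flow, not the model internals.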
  • the travel condition may be associated with a road section.
  • each of the plurality of first inputs may include a first parameter associated with traffic condition at a corresponding candidate time point, a plurality of first historical parameters associated with traffic condition at a plurality of first historical time points respectively, and/or a first statistical parameter associated with the plurality of first historical parameters.
  • the first parameter associated with traffic condition may include at least one of a traffic congestion level of the travel condition, a traffic speed of the travel condition, and/or a traffic flow of the travel condition.
  • the first statistical parameter may include a traffic congestion statistical parameter, a traffic speed statistical parameter, and/or a traffic flow statistical parameter.
  • the traffic congestion statistical parameter may include at least one of a mode of a plurality of historical traffic congestion levels and/or a congestion probability of the plurality of historical traffic congestion levels.
  • the traffic speed statistical parameter may include at least one of a mean value of a plurality of historical speeds, a median of the plurality of historical speeds, a variance of the plurality of historical speeds, a maximum value of the plurality of historical speeds, and/or a minimum value of the plurality of historical speeds.
  • the traffic flow statistical parameter may include at least one of a mean value of a plurality of historical traffic flows, a median of the plurality of historical traffic flows, a variance of the plurality of historical traffic flows, a maximum value of the plurality of historical traffic flows, and/or a minimum value of the plurality of historical traffic flows.
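The statistical parameters enumerated above are ordinary descriptive statistics over the historical values. A sketch using Python's standard `statistics` module; the function name, argument names, and the congestion threshold (a level of 2 or more counts as congested) are illustrative assumptions, not part of the disclosure:

```python
import statistics

def traffic_stats(historical_speeds, historical_flows, historical_congestion_levels):
    """Compute the statistical parameters described above for one road section."""
    # Mode of the historical congestion levels.
    congestion_mode = statistics.mode(historical_congestion_levels)
    # Congestion probability: fraction of historical time points that were
    # congested (hypothetically, any level >= 2 counts as congested).
    congestion_prob = (
        sum(1 for c in historical_congestion_levels if c >= 2)
        / len(historical_congestion_levels)
    )
    # Mean, median, variance, maximum, and minimum of historical speeds.
    speed_stats = {
        "mean": statistics.mean(historical_speeds),
        "median": statistics.median(historical_speeds),
        "variance": statistics.pvariance(historical_speeds),
        "max": max(historical_speeds),
        "min": min(historical_speeds),
    }
    # The same statistics over historical traffic flows.
    flow_stats = {
        "mean": statistics.mean(historical_flows),
        "median": statistics.median(historical_flows),
        "variance": statistics.pvariance(historical_flows),
        "max": max(historical_flows),
        "min": min(historical_flows),
    }
    return congestion_mode, congestion_prob, speed_stats, flow_stats
```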
  • each of the plurality of second inputs may include a second parameter associated with traffic condition at a previous time point of a corresponding future time point, a plurality of second historical parameters associated with traffic condition at a plurality of second historical time points respectively, and/or a second statistical parameter associated with the plurality of second historical traffic parameters.
  • each of the plurality of second inputs may further include a reference parameter, the reference parameter including weather information at the corresponding future time point.
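Concretely, one second input can be flattened into a single feature vector. A sketch with hypothetical names; the disclosure does not fix a layout, and the weather reference parameter is shown as a simple encoded value:

```python
def build_second_input(prev_parameter, historical_parameters,
                       statistical_parameter, weather_code):
    """Flatten one second input into a feature vector.

    prev_parameter:        parameter at the time point preceding the future time point
    historical_parameters: parameters at the second historical time points
    statistical_parameter: a statistic over those historical parameters
    weather_code:          reference parameter, e.g. an encoded weather condition
    """
    return [prev_parameter, *historical_parameters, statistical_parameter, weather_code]
```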
  • the first trained prediction model may be a first part of a trained prediction model and the second trained prediction model may be a second part of the trained prediction model.
  • the trained prediction model may be a sequence to sequence model.
  • the first part of the trained prediction model may be an encoder and the second part of the trained prediction model may be a decoder.
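In encoder-decoder terms, the encoder folds the first inputs into the target vector (its final hidden state), and the decoder unrolls from that state over the second inputs, emitting one prediction per future time point. A toy sketch with scalar states and fixed tanh-cell weights, illustrating the data flow only and not the actual trained model:

```python
import math

def rnn_cell(x, h, w_x, w_h):
    # Single tanh recurrent cell; scalar weights keep the sketch readable.
    return math.tanh(w_x * x + w_h * h)

def encode(first_inputs, w_x=0.5, w_h=0.8):
    """Encoder: fold the first inputs into a target vector (here a scalar state)."""
    h = 0.0
    for x in first_inputs:
        h = rnn_cell(x, h, w_x, w_h)
    return h

def decode(target_vector, second_inputs, w_x=0.5, w_h=0.8, w_out=1.0):
    """Decoder: start from the target vector, emit one prediction per future time point."""
    h = target_vector
    predictions = []
    for x in second_inputs:
        h = rnn_cell(x, h, w_x, w_h)
        predictions.append(w_out * h)
    return predictions
```

A real sequence-to-sequence model would use vector states and learned weight matrices, but the shape of the computation is the same.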
  • the trained prediction model may be determined based on a training process.
  • the training process may include obtaining a plurality of first sample inputs corresponding to a plurality of first sample time points respectively; obtaining a preliminary prediction model including a preliminary first part and a preliminary second part; determining a preliminary vector based on the plurality of first sample inputs by using the preliminary first part; obtaining a plurality of second sample inputs corresponding to a plurality of second sample time points respectively; predicting a plurality of sample parameters associated with traffic condition corresponding to the plurality of second sample time points respectively based on the preliminary vector and the plurality of second sample inputs by using the preliminary second part; obtaining a plurality of actual parameters associated with traffic condition corresponding to the plurality of second sample time points respectively; determining a value of a loss function of the preliminary prediction model based on the plurality of sample parameters associated with traffic condition and the plurality of actual parameters associated with traffic condition; and designating the preliminary prediction model as the trained prediction model in response to a determination that the value of the loss function is less than a loss threshold.
  • the training process may further include updating the preliminary first part or the preliminary second part in response to a determination that the value of the loss function is larger than or equal to the loss threshold.
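The training procedure above amounts to: run the preliminary first and second parts, compare the sample parameters with the actual parameters via a loss function, stop when the loss falls below the threshold, and otherwise update and repeat. A sketch with mean squared error as an example loss; `encoder_step`, `decoder_step`, and `update` are hypothetical callables standing in for the preliminary first part, the preliminary second part, and the parameter update:

```python
def train(encoder_step, decoder_step, update, batches, loss_threshold, max_epochs=100):
    """Iterate until the loss drops below the threshold (or epochs run out).

    Each batch is (first_sample_inputs, second_sample_inputs, actual_parameters).
    """
    loss = float("inf")
    for _ in range(max_epochs):
        total, count = 0.0, 0
        for first_inputs, second_inputs, actual in batches:
            # Preliminary first part: determine the preliminary vector.
            vector = encoder_step(first_inputs)
            # Preliminary second part: predict the sample parameters.
            predicted = decoder_step(vector, second_inputs)
            # Mean squared error as an example loss function.
            total += sum((p - a) ** 2 for p, a in zip(predicted, actual))
            count += len(actual)
        loss = total / count
        if loss < loss_threshold:
            return loss  # designate the current model as the trained model
        update(loss)  # otherwise update the preliminary first/second part
    return loss
```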
  • a second aspect of the present disclosure relates to a method for predicting a traffic parameter.
  • the method may include obtaining a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points; obtaining a first trained prediction model; determining a target vector based on the plurality of first inputs by using the first trained prediction model; obtaining a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points; obtaining a second trained prediction model; and predicting a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
  • a third aspect of the present disclosure relates to a system for predicting a traffic parameter.
  • the system may include a first obtaining module, a vector determination module, a second obtaining module, and a prediction module.
  • the first obtaining module may be configured to obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points and obtain a first trained prediction model.
  • the vector determination module may be configured to determine a target vector based on the plurality of first inputs by using the first trained prediction model.
  • the second obtaining module may be configured to obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points and obtain a second trained prediction model.
  • the prediction module may be configured to predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
  • a fourth aspect of the present disclosure relates to a non-transitory computer readable medium.
  • the non-transitory computer readable medium may include executable instructions. When the executable instructions are executed by at least one processor, the executable instructions may direct the at least one processor to perform a method.
  • the method may include obtaining a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points; obtaining a first trained prediction model; determining a target vector based on the plurality of first inputs by using the first trained prediction model; obtaining a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points; obtaining a second trained prediction model; and predicting a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
  • FIG. 1 is a schematic diagram illustrating an exemplary traffic prediction system according to some embodiments of the present disclosure
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure
  • FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure
  • FIG. 5 is a flowchart illustrating an exemplary process for predicting future parameters associated with a travel condition according to some embodiments of the present disclosure
  • FIG. 6 is a schematic diagram illustrating an exemplary first input according to some embodiments of the present disclosure.
  • FIG. 7 is a schematic diagram illustrating an exemplary second input according to some embodiments of the present disclosure.
  • FIG. 8 is a flowchart illustrating an exemplary training process for determining a prediction model according to some embodiments of the present disclosure.
  • FIG. 9 is a schematic diagram illustrating an exemplary structure of a prediction model according to some embodiments of the present disclosure.
  • the flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts need not be implemented in order; conversely, the operations may be implemented in inverted order or simultaneously. Moreover, one or more other operations may be added to the flowcharts, and one or more operations may be removed from the flowcharts.
  • although the systems and methods disclosed in the present disclosure are described primarily regarding on-demand transportation services, it should also be understood that this is only one exemplary embodiment.
  • the systems and methods of the present disclosure may be applied to any other kind of on demand service.
  • the systems and methods of the present disclosure may be applied to transportation systems of different environments including land, ocean, aerospace, or the like, or any combination thereof.
  • the vehicle of the transportation systems may include a taxi, a private car, a hitch, a bus, a train, a bullet train, a high-speed rail, a subway, a vessel, an aircraft, a spaceship, a hot-air balloon, a driverless vehicle, or the like, or any combination thereof.
  • the transportation system may also include any transportation system for management and/or distribution, for example, a system for sending and/or receiving an express.
  • the application of the system or method of the present disclosure may include a web page, a plug-in of a browser, a client terminal, a custom system, an internal analysis system, an artificial intelligence robot, or the like, or any combination thereof.
  • the terms “passenger,” “requestor,” “requester,” “service requestor,” “service requester,” and “customer” in the present disclosure are used interchangeably to refer to an individual, an entity, or a tool that may request or order a service.
  • the terms “driver,” “provider,” “service provider,” and “supplier” in the present disclosure are used interchangeably to refer to an individual, an entity, or a tool that may provide a service or facilitate the providing of the service.
  • the term “user” in the present disclosure may refer to an individual, an entity, or a tool that may request a service, order a service, provide a service, or facilitate the providing of the service.
  • the user may be a passenger, a driver, an operator, or the like, or any combination thereof.
  • the terms “passenger” and “passenger terminal” may be used interchangeably, and the terms “driver” and “driver terminal” may be used interchangeably.
  • the term “service request” in the present disclosure refers to a request that may be initiated by a passenger, a requester, a service requester, a customer, a driver, a provider, a service provider, a supplier, or the like, or any combination thereof.
  • the service request may be accepted by any one of a passenger, a requester, a service requester, a customer, a driver, a provider, a service provider, or a supplier.
  • the service request may be chargeable or free.
  • the positioning technology used in the present disclosure may be based on a global positioning system (GPS) , a global navigation satellite system (GLONASS) , a compass navigation system (COMPASS) , a Galileo positioning system, a quasi-zenith satellite system (QZSS) , a wireless fidelity (WiFi) positioning technology, or the like, or any combination thereof.
  • An aspect of the present disclosure relates to systems and methods for predicting a traffic parameter.
  • the systems may obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points and determine a target vector based on the plurality of first inputs by using a first trained prediction model.
  • the systems may also obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points. Further, the systems may predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using a second trained prediction model.
  • according to the systems and methods of the present disclosure, a plurality of future parameters associated with a travel condition corresponding to a plurality of future time points can be predicted. The future parameters are predicted based on a plurality of first inputs corresponding to a plurality of candidate time points and a plurality of second inputs corresponding to the plurality of future time points, which can fuse multi-channel data and improve the efficiency and accuracy of traffic prediction.
  • FIG. 1 is a schematic diagram illustrating an exemplary traffic prediction system according to some embodiments of the present disclosure.
  • the traffic prediction system may predict future parameters associated with traffic condition based on current parameters associated with traffic condition and/or historical parameters associated with traffic condition.
  • the traffic prediction system may be applied in various application scenarios, such as an on-demand transportation service scenario, a navigation service scenario, a map service scenario, etc.
  • the present disclosure takes an on-demand transportation service scenario as an example, accordingly the traffic prediction system 100 may be an online transportation service platform for transportation services such as taxi hailing services, chauffeur services, express car services, carpool services, bus services, etc.
  • the traffic prediction system 100 may include a server 110, a network 120, a requester terminal 130, a provider terminal 140, and a storage 150.
  • the server 110 may be a single server or a server group.
  • the server group may be centralized or distributed (e.g., server 110 may be a distributed system) .
  • the server 110 may be local or remote.
  • the server 110 may access information and/or data stored in the requester terminal 130, the provider terminal 140, and/or the storage 150 via the network 120.
  • the server 110 may connect to the requester terminal 130, the provider terminal 140, and/or the storage 150 to access stored information and/or data.
  • the server 110 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the server 110 may be implemented on a computing device 200 having one or more components illustrated in FIG. 2 in the present disclosure.
  • the server 110 may include a processing engine 112.
  • the processing engine 112 may process information and/or data to perform one or more functions described in the present disclosure.
  • the processing engine 112 may obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points respectively and a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points respectively.
  • the processing engine 112 may further predict a plurality of future parameters associated with travel condition corresponding to the plurality of future time points respectively based on the plurality of first inputs and the plurality of second inputs by using a trained prediction model.
  • the processing engine 112 may include one or more processing engines (e.g., single-core processing engine (s) or multi-core processor (s) ) .
  • the processing engine 112 may include a central processing unit (CPU) , an application-specific integrated circuit (ASIC) , an application-specific instruction-set processor (ASIP) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a digital signal processor (DSP) , a field-programmable gate array (FPGA) , a programmable logic device (PLD) , a controller, a microcontroller unit, a reduced instruction-set computer (RISC) , a microprocessor, or the like, or any combination thereof.
  • the network 120 may facilitate exchange of information and/or data.
  • one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140, and the storage 150) of the traffic prediction system 100 may transmit information and/or data to other component(s) of the traffic prediction system 100 via the network 120.
  • the processing engine 112 may obtain the plurality of first inputs and the plurality of second inputs from the storage 150 via the network 120.
  • the network 120 may be any type of wired or wireless network, or combination thereof.
  • the network 120 may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, an Internet, a local area network (LAN) , a wide area network (WAN) , a wireless local area network (WLAN) , a metropolitan area network (MAN) , a public telephone switched network (PSTN) , a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or the like, or any combination thereof.
  • the network 120 may include one or more network access points.
  • the network 120 may include wired or wireless network access points such as base stations and/or internet exchange points 120-1, 120-2, ..., through which one or more components of the traffic prediction system 100 may be connected to the network 120 to exchange data and/or information.
  • a requester may be a user of the requester terminal 130. In some embodiments, the user of the requester terminal 130 may be someone other than the requester. For example, a user A of the requester terminal 130 may use the requester terminal 130 to transmit a service request for a user B, or receive service and/or information or instructions from the server 110.
  • a provider may be a user of the provider terminal 140. In some embodiments, the user of the provider terminal 140 may be someone other than the provider. For example, a user C of the provider terminal 140 may use the provider terminal 140 to receive a service request for a user D, and/or information or instructions from the server 110.
  • “requester” and “requester terminal” may be used interchangeably, and “provider” and “provider terminal” may be used interchangeably.
  • the requester terminal 130 may include a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, a built-in device in a motor vehicle 130-4, or the like, or any combination thereof.
  • the mobile device 130-1 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
  • the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
  • the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart helmet, a smart watch, a smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof.
  • the smart mobile device may include a smartphone, a personal digital assistance (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a Google Glass™, a RiftCon™, a Fragments™, a Gear VR™, etc.
  • the built-in device in the motor vehicle 130-4 may include an onboard computer, an onboard television, etc.
  • the requester terminal 130 may be a device with positioning technology for locating the position of the requester and/or the requester terminal 130.
  • the provider terminal 140 may be similar to, or the same device as the requester terminal 130. In some embodiments, the provider terminal 140 may be a device with positioning technology for locating the position of the provider and/or the provider terminal 140. In some embodiments, the provider terminal 140 may periodically transmit GPS information to the server 110. In some embodiments, the requester terminal 130 and/or the provider terminal 140 may communicate with another positioning device to determine the position of the requester, the requester terminal 130, the provider, and/or the provider terminal 140. In some embodiments, the requester terminal 130 and/or the provider terminal 140 may transmit positioning information to the server 110.
  • the storage 150 may store data and/or instructions. In some embodiments, the storage 150 may store data obtained from the requester terminal 130 and/or the provider terminal 140. In some embodiments, the storage 150 may store data and/or instructions that the server 110 may execute or use to perform exemplary methods described in the present disclosure. In some embodiments, the storage 150 may include a mass storage, a removable storage, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. Exemplary mass storage may include a magnetic disk, an optical disk, a solid-state drive, etc. Exemplary removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
  • Exemplary volatile read-and-write memory may include a random access memory (RAM) .
  • RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc.
  • Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc.
  • the storage 150 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the storage 150 may be connected to the network 120 to communicate with one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140) of the traffic prediction system 100.
  • One or more components of the traffic prediction system 100 may access the data or instructions stored in the storage 150 via the network 120.
  • the storage 150 may be directly connected to or communicate with one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140) of the traffic prediction system 100.
  • the storage 150 may be part of the server 110.
  • one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140) of the traffic prediction system 100 may access the storage 150.
  • one or more components of the traffic prediction system 100 may read and/or modify information relating to the requester, the provider, and/or the public when one or more conditions are met.
  • the server 110 may read and/or modify one or more users’ information after a service is completed.
  • the provider terminal 140 may access information relating to the requester when receiving a service request from the requester terminal 130, but the provider terminal 140 cannot modify the relevant information of the requester.
  • information exchanging of one or more components of the traffic prediction system 100 may be achieved by way of requesting a service.
  • the object of the service request may be any product.
  • the product may be a tangible product or immaterial product.
  • the tangible product may include food, medicine, commodity, chemical product, electrical appliance, clothing, car, housing, luxury, or the like, or any combination thereof.
  • the immaterial product may include a servicing product, a financial product, a knowledge product, an internet product, or the like, or any combination thereof.
  • the internet product may include an individual host product, a web product, a mobile internet product, a commercial host product, an embedded product, or the like, or any combination thereof.
  • the mobile internet product may be used in a software of a mobile terminal, a program, a system, or the like, or any combination thereof.
  • the mobile terminal may include a tablet computer, a laptop computer, a mobile phone, a personal digital assistance (PDA) , a smart watch, a point of sale (POS) device, an onboard computer, an onboard television, a wearable device, or the like, or any combination thereof.
  • the product may be any software and/or application used on the computer or mobile phone.
  • the software and/or application may relate to socializing, shopping, transporting, entertainment, learning, investment, or the like, or any combination thereof.
  • the software and/or application relating to transporting may include a traveling software and/or application, a vehicle scheduling software and/or application, a mapping software and/or application, etc.
  • the vehicle may include a horse, a carriage, a rickshaw (e.g., a wheelbarrow, a bike, a tricycle) , a car (e.g., a taxi, a bus, a private car) , or the like, or any combination thereof.
  • the element may perform through electrical signals and/or electromagnetic signals.
  • when a requester terminal 130 processes a task, such as making a determination, identifying, or selecting an object, the requester terminal 130 may operate logic circuits in its processor to process such a task.
  • a processor of the service requester terminal 130 may generate electrical signals encoding the service request.
  • the processor of the requester terminal 130 may then send the electrical signals to an output port. If the requester terminal 130 communicates with the server 110 via a wired network, the output port may be physically connected to a cable, which may further transmit the electrical signals to an input port of the server 110.
  • the output port of the requester terminal 130 may be one or more antennas, which may convert the electrical signals to electromagnetic signals.
  • a provider terminal 140 may process a task through operation of logic circuits in its processor, and receive an instruction and/or service request from the server 110 via electrical signals or electromagnetic signals.
  • in an electronic device such as the requester terminal 130, the provider terminal 140, and/or the server 110, when a processor thereof processes an instruction, sends out an instruction, and/or performs an action, the instruction and/or action is conducted via electrical signals.
  • when the processor retrieves or saves data from a storage medium (e.g., the storage 150), it may send out electrical signals to a read/write device of the storage medium, which may read or write structured data in the storage medium.
  • the structured data may be transmitted to the processor in the form of electrical signals via a bus of the electronic device.
  • an electrical signal may refer to one electrical signal, a series of electrical signals, and/or a plurality of discrete electrical signals.
  • the traffic prediction system 100 may be a navigation system.
  • the navigation system may include a user terminal (e.g., the provider terminal 140) and a server (e.g., the server 110) .
  • the navigation system may provide a navigation service for the user, and during the navigation service, the navigation system may periodically obtain GPS information of the vehicle from a GPS device integrated in the user terminal.
  • the navigation system may obtain GPS information associated with a plurality of vehicles and determine traffic information based on the GPS information. Further, the navigation system may predict future traffic information based on current traffic information and/or historical traffic information according to the process and/or method described in this disclosure.
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure.
  • the server 110, the requester terminal 130, and/or the provider terminal 140 may be implemented on the computing device 200.
  • the processing engine 112 may be implemented on the computing device 200 and configured to perform functions of the processing engine 112 disclosed in this disclosure.
  • the computing device 200 may be used to implement any component of the traffic prediction system 100 as described herein.
  • the processing engine 112 may be implemented on the computing device 200, via its hardware, software program, firmware, or a combination thereof. Although only one such computer is shown for convenience, the computer functions described herein may be implemented in a distributed fashion on a number of similar platforms, to distribute the processing load.
  • the computing device 200 may include COM ports 250 connected to and from a network connected thereto to facilitate data communications.
  • the computing device 200 may also include a processor (e.g., the processor 220) , in the form of one or more processors (e.g., logic circuits) , for executing program instructions.
  • the processor may include interface circuits and processing circuits therein.
  • the interface circuits may be configured to receive electronic signals from a bus 210, wherein the electronic signals encode structured data and/or instructions for the processing circuits to process.
  • the processing circuits may conduct logic calculations, and then determine a conclusion, a result, and/or an instruction encoded as electronic signals. Then the interface circuits may send out the electronic signals from the processing circuits via the bus 210.
  • the computing device 200 may also include program storage and data storage of different forms including, for example, a disk 270, a read only memory (ROM) 230, or a random access memory (RAM) 240, for storing various data files to be processed and/or transmitted by the computing device 200.
  • the computing device 200 may also include program instructions stored in the ROM 230, RAM 240, and/or other type of non-transitory storage medium to be executed by the processor 220.
  • the methods and/or processes of the present disclosure may be implemented as the program instructions.
  • the computing device 200 also includes an I/O component 260, supporting input/output between the computer and other components.
  • the computing device 200 may also receive programming and data via network communications.
  • multiple CPUs and/or processors are also contemplated; thus operations and/or method steps performed by one CPU and/or processor as described in the present disclosure may also be jointly or separately performed by the multiple CPUs and/or processors.
  • if the CPU and/or processor of the computing device 200 executes both step A and step B, it should be understood that step A and step B may also be performed by two different CPUs and/or processors jointly or separately in the computing device 200 (e.g., the first processor executes step A and the second processor executes step B, or the first and second processors jointly execute steps A and B).
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure.
  • the requester terminal 130 and/or the provider terminal 140 may be implemented on the mobile device 300.
  • the mobile device 300 may include a communication platform 310, a display 320, a graphic processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and a storage 390.
  • any other suitable component including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300.
  • a mobile operating system 370 (e.g., iOS™, Android™, Windows Phone™) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340.
  • the applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to the traffic prediction system 100. User interactions with the information stream may be achieved via the I/O 350 and provided to one or more components of the traffic prediction system 100 via the network 120.
  • computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein.
  • a computer with user interface elements may be used to implement a personal computer (PC) or any other type of work station or terminal device.
  • a computer may also act as a server if appropriately programmed.
  • FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure.
  • the processing engine 112 may include a first obtaining module 410, a vector determination module 420, a second obtaining module 430, a prediction module 440, and a training module 450.
  • the first obtaining module 410 may be configured to obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points respectively.
  • the travel condition may be associated with a road section where vehicles may pass through.
  • each of the plurality of first inputs may include a first parameter associated with traffic condition at a corresponding candidate time point, a plurality of first historical parameters associated with traffic condition at a plurality of first historical time points respectively, a first statistical parameter associated with the plurality of first historical parameters, etc.
  • the first obtaining module 410 may also be configured to obtain a first trained prediction model.
  • the first trained prediction model may be configured to extract feature information of the plurality of first inputs and fuse the plurality of first inputs based on the feature information.
  • the vector determination module 420 may be configured to determine a target vector (also referred to as a “state vector” ) based on the plurality of first inputs by using the first trained prediction model.
  • the target vector may be an expression indicating a fusion result of the plurality of first inputs, which includes a relationship between any two of the plurality of first inputs.
  • the second obtaining module 430 may be configured to obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points respectively.
  • each of the plurality of second inputs may include a second parameter associated with traffic condition at a previous time point of a corresponding future time point (which may be a predicted future parameter corresponding to the previous time point described in operation 560; for a first future time point, the previous time point refers to the current time point) , a plurality of second historical parameters associated with traffic condition at a plurality of second historical time points respectively, a second statistical parameter associated with the plurality of second historical parameters, etc.
  • the second obtaining module 430 may also be configured to obtain a second trained prediction model, which may be configured to predict future traffic information.
  • the prediction module 440 may be configured to predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points respectively based on the target vector and the plurality of second inputs by using the second trained prediction model.
  • the training module 450 may be configured to determine the first trained prediction model and/or the second trained prediction model. More description of the training process may be found elsewhere in the present disclosure (e.g., FIG. 8 and the description thereof).
  • the modules in the processing engine 112 may be connected to or communicate with each other via a wired connection or a wireless connection.
  • the wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof.
  • the wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth, a ZigBee, a Near Field Communication (NFC) , or the like, or any combination thereof. Any two of the modules may be combined as a single module, and any one of the modules may be divided into two or more units.
  • the first obtaining module 410 and the second obtaining module 430 may be combined as a single module, which may be configured to obtain the plurality of first inputs, the first trained prediction model, the plurality of second inputs, and the second trained prediction model.
  • the training module 450 may be unnecessary and the first trained prediction model and/or the second trained prediction model may be obtained from a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure.
  • the processing engine 112 may include a storage module (not shown) which may be configured to store the plurality of first inputs, the first trained prediction model, the plurality of second inputs, the second trained prediction model, the target vector, the plurality of future parameters, etc.
  • FIG. 5 is a flowchart illustrating an exemplary process for predicting future parameters associated with traffic condition according to some embodiments of the present disclosure.
  • the process 500 may be executed by the traffic prediction system 100.
  • the process 500 may be implemented as a set of instructions stored in the ROM 230 or the RAM 240.
  • the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 500.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 500 are illustrated in FIG. 5 and described below is not intended to be limiting.
  • the processing engine 112 (e.g., the first obtaining module 410) (e.g., the interface circuits of the processor 220) may obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points respectively.
  • the travel condition may be associated with a road section where vehicles may pass through.
  • the plurality of candidate time points include a current time point.
  • a number count of the plurality of candidate time points and/or a time interval between any adjacent two of the plurality of candidate time points may be default settings of the traffic prediction system 100 or may be adjustable under different situations.
  • the plurality of candidate time points may be arranged in chronological order. For example, assuming that the current time point is "10:00 a.m.," the number count of the plurality of candidate time points is 10, and the time interval between any adjacent two of the plurality of candidate time points is 1 minute, the plurality of candidate time points may be expressed as a set below:
  • T_1 = {9:51, 9:52, 9:53, 9:54, 9:55, 9:56, 9:57, 9:58, 9:59, 10:00} (1)
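The construction of the set T_1 above can be sketched with standard-library date arithmetic (the date itself is arbitrary; only the times matter):

```python
from datetime import datetime, timedelta

def candidate_time_points(current, n=10, interval_minutes=1):
    """Return the n candidate time points ending at the current time point,
    spaced interval_minutes apart and arranged in chronological order."""
    return [current - timedelta(minutes=interval_minutes * (n - 1 - i))
            for i in range(n)]

points = candidate_time_points(datetime(2019, 7, 1, 10, 0))
print(points[0].strftime("%H:%M"), points[-1].strftime("%H:%M"))  # -> 09:51 10:00
```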
  • the plurality of first inputs corresponding to the plurality of candidate time points respectively may be expressed as below:
  • F_1 = {f_1, f_2, …, f_n} (2)
  • where F_1 refers to a set including the plurality of first inputs, f_t refers to a tth first input corresponding to a tth candidate time point, and n refers to the number count of the plurality of candidate time points (i.e., a number count of the plurality of first inputs).
  • each of the plurality of first inputs may include a first parameter associated with traffic condition at a corresponding candidate time point, a plurality of first historical parameters associated with traffic condition at a plurality of first historical time points respectively, a first statistical parameter associated with the plurality of first historical parameters, etc.
  • each of the plurality of first historical time points may correspond to the candidate time point.
  • for example, it is assumed that a specific candidate time point is "10:00 a.m. on a working day," the plurality of first historical time points may be a plurality of corresponding historical time points (i.e., 10:00 a.m. on a working day) within a predetermined time period (e.g., last week, last month, last three months).
  • the first parameter associated with traffic condition may include a traffic congestion level of the travel condition, a traffic speed of the travel condition, a traffic flow (which can be represented by a number count of vehicles) of the travel condition, or the like, or any combination thereof.
  • the traffic congestion level of the travel condition may refer to a plurality of traffic congestion levels of a plurality of locations respectively within the road section, an average traffic congestion level of the plurality of traffic congestion levels, a sum of the plurality of traffic congestion levels, etc.
  • the traffic speed of the travel condition may refer to a plurality of traffic speeds of a plurality of locations respectively within the road section, an average traffic speed of the plurality of traffic speeds, a sum of the plurality of traffic speeds, etc.
  • the traffic flow of the travel condition may refer to a plurality of traffic flows of a plurality of locations respectively within the road section, an average traffic flow of the plurality of traffic flows, a sum of the plurality of traffic flows, etc.
  • the traffic congestion level of the travel condition may be expressed as a plurality of levels based on the traffic flow of the travel condition, for example, "heavy congestion," "normal congestion," "mid congestion," and "smooth traffic" as illustrated in Table 1 below.
  • each of the parameters "a," "b," and "c" refers to a traffic flow threshold
  • F refers to a traffic flow of a specific location point within the road section.
  • the traffic flow thresholds may be default settings of the traffic prediction system 100 or may be adjustable under different situations (e.g., the traffic flow thresholds may be different for different cities) .
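As a concrete sketch, the thresholds might be applied as follows. The threshold values and the direction of the mapping (which side of each threshold corresponds to which level) are illustrative assumptions, since the contents of Table 1 are not reproduced here:

```python
def congestion_level(flow, a=20, b=50, c=100):
    """Map a traffic flow F (a vehicle count) at a location to a congestion
    level using thresholds a < b < c. Both the threshold values and the
    level boundaries are hypothetical stand-ins for Table 1."""
    if flow < a:
        return "smooth traffic"
    elif flow < b:
        return "mid congestion"
    elif flow < c:
        return "normal congestion"
    return "heavy congestion"

print(congestion_level(75))  # -> normal congestion
```

Because the thresholds are adjustable (e.g., per city), they are exposed as parameters rather than hard-coded.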
  • each of the plurality of first historical parameters associated with traffic condition may include a historical traffic congestion level of the travel condition, a historical traffic speed of the travel condition, a historical traffic flow of the travel condition, or the like, or any combination thereof.
  • the processing engine 112 may determine a first comprehensive historical parameter based on the plurality of first historical parameters. For example, the processing engine 112 may determine a sum or a weighted sum of the plurality of first historical parameters as the first comprehensive historical parameter, wherein the closer a first historical time point is to the current time point, the larger the weight of a first historical parameter corresponding to the first historical time point may be.
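One weighting scheme consistent with the rule above (the closer a historical time point is to the current time point, the larger its weight) can be sketched as follows; the linearly increasing, normalized weights are an illustrative assumption, not a scheme fixed by the disclosure:

```python
def comprehensive_historical_parameter(params):
    """Weighted sum of historical parameters ordered from oldest to newest.
    Weights increase linearly toward the current time point and sum to 1."""
    n = len(params)
    total = n * (n + 1) / 2
    weights = [(i + 1) / total for i in range(n)]  # oldest gets the smallest weight
    return sum(w * p for w, p in zip(weights, params))

# Recent speeds count more than older ones:
print(round(comprehensive_historical_parameter([30.0, 40.0, 50.0]), 2))  # -> 43.33
```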
  • the first statistical parameter may include a traffic congestion statistical parameter, a traffic speed statistical parameter, a traffic flow statistical parameter, or the like, or any combination thereof.
  • the traffic congestion statistical parameter may include a mode of a plurality of historical traffic congestion levels, a congestion probability of the plurality of historical traffic congestion levels, etc.
  • the traffic speed statistical parameter may include a mean value of a plurality of historical speeds, a median of the plurality of historical speeds, a variance of the plurality of historical speeds, a maximum value of the plurality of historical speeds, a minimum value of the plurality of historical speeds, etc.
  • the traffic flow statistical parameter may include a mean value of a plurality of historical traffic flows, a median of the plurality of historical traffic flows, a variance of the plurality of historical traffic flows, a maximum value of the plurality of historical traffic flows, a minimum value of the plurality of historical traffic flows, etc.
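The speed (or flow) statistics listed above can be computed directly with the standard library. The sample values are hypothetical, and population variance is used here because the disclosure does not specify sample versus population variance:

```python
import statistics

speeds = [42.0, 35.5, 51.2, 35.5, 47.8]  # hypothetical historical traffic speeds

speed_stats = {
    "mean": statistics.mean(speeds),
    "median": statistics.median(speeds),
    "variance": statistics.pvariance(speeds),  # population variance (assumption)
    "max": max(speeds),
    "min": min(speeds),
}
print(speed_stats)
```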
  • the congestion probability of the plurality of historical traffic congestion levels refers to a rate of a specific historical traffic congestion level in the plurality of historical traffic congestion levels. For example, it is assumed that the plurality of historical traffic congestion levels are illustrated in Table 2 below.
  • Table 2 exemplary historical traffic congestion levels
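Since the contents of Table 2 are not reproduced here, the sketch below uses hypothetical level labels; it computes the mode of the historical congestion levels and the congestion probability, i.e., the rate of a specific level among all historical levels:

```python
from collections import Counter

def congestion_statistics(levels):
    """Return the mode of the historical congestion levels and a mapping
    from each level to its congestion probability (its rate among all
    historical levels)."""
    counts = Counter(levels)
    mode = counts.most_common(1)[0][0]
    probabilities = {lvl: c / len(levels) for lvl, c in counts.items()}
    return mode, probabilities

levels = ["smooth", "heavy", "smooth", "normal", "smooth"]  # hypothetical labels
mode, probs = congestion_statistics(levels)
print(mode, probs["heavy"])  # -> smooth 0.2
```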
  • the processing engine 112 (e.g., the first obtaining module 410) (e.g., the processing circuits of the processor 220) may obtain a first trained prediction model.
  • the first trained prediction model may be configured to extract feature information of the plurality of first inputs and fuse the plurality of first inputs based on the feature information.
  • the processing engine 112 may obtain the first trained prediction model from a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure.
  • the first trained prediction model may be a Recurrent Neural Network (RNN) model, a Long Short-Term Memory (LSTM) model, a Gated Recurrent Unit (GRU) model, etc.
  • the first trained prediction model may be a first part of a trained prediction model (e.g., a sequence-to-sequence model).
  • the sequence-to-sequence model may include one or more RNN cells, one or more LSTM cells, one or more GRU cells, etc. More descriptions regarding the trained prediction model may be found elsewhere in the present disclosure (e.g., FIG. 9 and the description thereof).
  • the processing engine 112 (e.g., the vector determination module 420) (e.g., the processing circuits of the processor 220) may determine a target vector based on the plurality of first inputs by using the first trained prediction model.
  • the target vector may be an expression indicating a fusion result of the plurality of first inputs, which includes a relationship between any two of the plurality of first inputs.
  • the plurality of first inputs may be input into the first trained prediction model in chronological order, and an intermediate result corresponding to a previous candidate time point can be used as part (a weight may be assigned to the intermediate result) of an input corresponding to a next adjacent candidate time point, thereby extracting a context dependence among the plurality of first inputs. More descriptions may be found elsewhere in the present disclosure (e.g., FIG. 9 and the description thereof).
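The recurrence described above can be sketched with a minimal vanilla-RNN cell (an LSTM or GRU cell would follow the same pattern); the dimensions and the randomly initialized weights are illustrative stand-ins for a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid = 4, 8                   # input feature size and state size (illustrative)
W = rng.normal(size=(d_hid, d_in))   # input-to-hidden weights (trained in practice)
U = rng.normal(size=(d_hid, d_hid))  # hidden-to-hidden weights: these carry the
                                     # previous intermediate result into the next step
b = np.zeros(d_hid)

def encode(first_inputs):
    """Fuse the first inputs in chronological order: each step mixes the
    previous intermediate result into the current input, and the final
    hidden state serves as the target (state) vector."""
    h = np.zeros(d_hid)
    for x in first_inputs:           # chronological order
        h = np.tanh(W @ x + U @ h + b)
    return h

target_vector = encode(rng.normal(size=(10, d_in)))  # n = 10 candidate time points
print(target_vector.shape)  # -> (8,)
```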
  • the processing engine 112 (e.g., the second obtaining module 430) (e.g., the interface circuits of the processor 220) may obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points respectively.
  • a number count of the plurality of future time points and/or a time interval between any adjacent two of the plurality of future time points may be default settings of the traffic prediction system 100 or may be adjustable under different situations.
  • the plurality of future time points may be arranged in chronological order. For example, assuming that a first future time point is "10:01 a.m.," the number count of the plurality of future time points is 10, and the time interval between any adjacent two of the plurality of future time points is 1 minute, the plurality of future time points may be expressed as a set below:
  • T_2 = {10:01, 10:02, 10:03, 10:04, 10:05, 10:06, 10:07, 10:08, 10:09, 10:10} (3)
  • the plurality of second inputs corresponding to the plurality of future time points respectively may be expressed as below:
  • F_2 = {f_(t+1), f_(t+2), …, f_(t+m)} (4)
  • where F_2 refers to a set including the plurality of second inputs, f_(t+i) refers to an ith second input (which corresponds to a (t+i)th time point), and m refers to the number count of the plurality of future time points (i.e., a number count of the plurality of second inputs).
  • each of the plurality of second inputs may include a second parameter associated with traffic condition at a previous time point of a corresponding future time point (which may be a predicted future parameter corresponding to the previous time point described in operation 560; for a first future time point, the previous time point refers to the current time point) , a plurality of second historical parameters associated with traffic condition at a plurality of second historical time points respectively, a second statistical parameter associated with the plurality of second historical traffic parameters, etc.
  • each of the plurality of second historical time points may correspond to the future time point. For example, it is assumed that a specific future time point is "10:02 a.m. on a working day," the plurality of second historical time points may be a plurality of corresponding historical time points (i.e., 10:02 a.m. on a working day) within a predetermined time period (e.g., last week, last month, last three months).
  • the second parameter associated with traffic condition may include a traffic congestion level of the travel condition, a traffic speed of the travel condition, a traffic flow of the travel condition, or the like, or any combination thereof.
  • each of the plurality of second historical parameters associated with traffic condition may include a historical traffic congestion level of the travel condition, a historical traffic speed of the travel condition, a historical traffic flow of the travel condition, or the like, or any combination thereof.
  • the processing engine 112 may determine a second comprehensive historical parameter based on the plurality of second historical parameters.
  • the processing engine 112 may determine a sum or a weighted sum of the plurality of second historical parameters as the second comprehensive historical parameter, wherein the closer a second historical time point is to the current time point, the larger the weight of a second historical parameter corresponding to the second historical time point may be.
  • the second statistical parameter may include a traffic congestion statistical parameter, a traffic speed statistical parameter, a traffic flow statistical parameter, or the like, or any combination thereof.
  • each of the plurality of second inputs may further include a reference parameter (e.g., weather forecast information) at the corresponding future time point.
  • the processing engine 112 may obtain a second trained prediction model, which may be configured to predict future traffic information.
  • the processing engine 112 may obtain the second trained prediction model from a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure.
  • the second trained prediction model may be a Recurrent Neural Network (RNN) model, a Long Short-Term Memory (LSTM) model, a Gated Recurrent Unit (GRU) model, etc.
  • the second trained prediction model may be a second part of a trained prediction model (e.g., a sequence-to-sequence model).
  • the sequence-to-sequence model may include one or more RNN cells, one or more LSTM cells, one or more GRU cells, etc. More descriptions regarding the trained prediction model may be found elsewhere in the present disclosure (e.g., FIG. 9 and the description thereof).
  • the processing engine 112 (e.g., the prediction module 440) (e.g., the processing circuits of the processor 220) may predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points respectively based on the target vector and the plurality of second inputs by using the second trained prediction model.
  • the plurality of second inputs may be input into the second trained prediction model in chronological order, and an intermediate result corresponding to a previous future time point can be used as part (a weight may be assigned to the intermediate result) of an input corresponding to a next adjacent future time point, thereby extracting a context dependence among the plurality of second inputs. More descriptions may be found elsewhere in the present disclosure (e.g., FIG. 9 and the description thereof).
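A minimal sketch of this autoregressive decoding loop, assuming scalar future parameters and randomly initialized stand-in weights (a trained LSTM or GRU cell would replace the vanilla-RNN update):

```python
import numpy as np

rng = np.random.default_rng(1)
d_hist, d_hid = 3, 8                         # illustrative sizes
W_in = rng.normal(size=(d_hid, d_hist + 1))  # one extra slot for the fed-back prediction
U = rng.normal(size=(d_hid, d_hid))          # carries the previous intermediate result
w_out = rng.normal(size=d_hid)               # readout to a scalar future parameter

def decode(target_vector, second_hist_inputs, current_parameter):
    """Predict one future parameter per future time point. The previous
    prediction is fed back as part of the next step's input; for the first
    future time point the current parameter is used instead."""
    h = target_vector                  # decoder state starts from the target vector
    prev = current_parameter
    predictions = []
    for hist in second_hist_inputs:    # chronological order
        x = np.concatenate([hist, [prev]])
        h = np.tanh(W_in @ x + U @ h)
        prev = float(w_out @ h)        # p_(t+i)
        predictions.append(prev)
    return predictions

preds = decode(np.zeros(d_hid), rng.normal(size=(10, d_hist)), 40.0)
print(len(preds))  # -> 10
```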
  • in connection with operation 540, the plurality of future parameters associated with the travel condition corresponding to the plurality of future time points respectively may be expressed as below:
  • P = {p_(t+1), p_(t+2), …, p_(t+m)} (5)
  • where P refers to a set including the plurality of future parameters and p_(t+i) refers to an ith future parameter (which corresponds to a (t+i)th time point).
  • the processing engine 112 may transmit information (e.g., a travel tip) indicative of the plurality of future parameters to the requester terminal 130 and/or the provider terminal 140. In some embodiments, the processing engine 112 may pre-determine a scheduling strategy based on the predicted future parameters. In some embodiments, the processing engine 112 may use the plurality of future parameters as reference when estimating service information (e.g., a recommended route, an estimated time of arrival (ETA)) associated with service requests.
  • the processing engine 112 may store information and/or data (e.g., the plurality of first inputs, the target vector, the plurality of second inputs, the plurality of predicted future parameters) associated with the traffic prediction system 100 in a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure.
  • FIG. 8 is a flowchart illustrating an exemplary training process for determining a prediction model according to some embodiments of the present disclosure.
  • the process 800 may be executed by the traffic prediction system 100.
  • the process 800 may be implemented as a set of instructions stored in the ROM 230 or the RAM 240.
  • the processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 800.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 800 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 800 are illustrated in FIG. 8 and described below is not intended to be limiting.
  • the processing engine 112 may obtain a plurality of first sample inputs corresponding to a plurality of first sample time points respectively.
  • the processing engine 112 may obtain the plurality of first sample inputs from the storage 150 via the network 120.
  • each of the plurality of first sample inputs may include a first sample parameter associated with traffic condition at a corresponding sample time point, a plurality of first sample historical parameters associated with traffic condition at a plurality of first sample historical time points respectively, a first sample statistical parameter associated with the plurality of first sample historical parameters, etc.
  • the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may obtain a preliminary prediction model including a preliminary first part and a preliminary second part.
  • the preliminary first part and/or the preliminary second part may include one or more RNN cells, one or more LSTM cells, one or more GRU cells, etc.
  • the preliminary prediction model may include one or more preliminary model parameters, for example, a number count of cells in the preliminary first part, a number count of cells in the preliminary second part, a size of each cell, a number count of layers in each cell, a number count of gates in each cell, a weight parameter (which may be used to assign a weight to an intermediate result corresponding to the first of the two adjacent cells) between any two adjacent cells (also referred to as a "weight parameter between any two adjacent time points" ) , etc.
  • the processing engine 112 may determine a preliminary vector based on the plurality of first sample inputs by using the preliminary first part.
  • the preliminary vector may be an expression indicating a fusion result of the plurality of first sample inputs, which includes a relationship between any two of the plurality of first sample inputs.
  • the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may obtain a plurality of second sample inputs corresponding to a plurality of second sample time points respectively.
  • the processing engine 112 may obtain the plurality of second sample inputs from the storage 150 via the network 120.
  • the plurality of second sample time points refer to future time points with respect to the plurality of first sample time points
  • each of the plurality of second sample inputs may include a second sample parameter associated with traffic condition at a previous time point of a corresponding second sample time point, a plurality of second sample historical parameters associated with traffic condition at a plurality of second sample historical time points respectively, a second sample statistical parameter associated with the plurality of second sample historical parameters, etc.
  • the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may predict a plurality of sample parameters associated with traffic condition corresponding to the plurality of second sample time points respectively based on the preliminary vector and the plurality of second sample inputs by using the preliminary second part.
  • the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may obtain a plurality of actual parameters associated with traffic condition corresponding to the plurality of second sample time points respectively.
  • the processing engine 112 may obtain the plurality of actual parameters associated with traffic condition from the storage 150 via the network 120.
  • the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may determine a value of a loss function of the preliminary prediction model based on the plurality of sample parameters and the plurality of actual parameters.
  • the loss function may be a root mean square error (RMSE) .
  • the processing engine 112 may determine whether the value of the loss function of the preliminary prediction model is less than a loss threshold. In response to a determination that the value of the loss function of the preliminary prediction model is less than the loss threshold, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may designate the preliminary prediction model as a trained prediction model in 890, which means that the training process is completed and the trained prediction model may be stored in a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure.
  • the processing engine 112 may execute process 800 to return to operation 820 to update the preliminary prediction model (e.g., update the preliminary first part and/or the preliminary second part) .
  • the processing engine 112 may update the one or more preliminary model parameters.
  • the processing engine 112 may determine whether the value of the loss function of an updated prediction model is less than the loss threshold. In response to a determination that the value of the loss function of an updated prediction model is less than the loss threshold, the processing engine 112 may designate the updated prediction model as a trained prediction model. On the other hand, in response to a determination that the value of the loss function of an updated prediction model is larger than or equal to the loss threshold, the processing engine 112 may still execute the process 800 to return to operation 820 to update the updated prediction model until the value of the loss function of an updated prediction model is less than the loss threshold.
  • For a person of ordinary skill in the art, it is obvious that the operations of a training process are substantially similar to the operations of a practice process; therefore, some details are omitted in process 800, which can be found elsewhere in the present disclosure (e.g., process 500 and the description thereof) .
  • the training module 450 may update the trained prediction model at a certain time interval (e.g., per month, per two months) based on a plurality of newly obtained samples.
  • the processing engine 112 may also use other conditions (e.g., a number count of iterations, an accuracy rate) to determine whether the training process is completed.
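The iterative training described above can be sketched as follows. This is a minimal illustration only: `model` and `update_fn` are hypothetical callables standing in for the preliminary prediction model and its parameter-update step, and are not part of the disclosure.

```python
import numpy as np

def rmse(predicted, actual):
    # Root mean square error between predicted and actual traffic parameters
    predicted, actual = np.asarray(predicted, float), np.asarray(actual, float)
    return float(np.sqrt(np.mean((predicted - actual) ** 2)))

def train(model, update_fn, samples, loss_threshold, max_iters=1000):
    # Repeat: predict, compute the RMSE loss, and either stop (loss below
    # the loss threshold) or update the preliminary model parameters.
    for _ in range(max_iters):
        losses = [rmse(model(first, second), actual)
                  for first, second, actual in samples]
        if np.mean(losses) < loss_threshold:
            return model  # designate as the trained prediction model
        model = update_fn(model, losses)  # update preliminary parameters
    return model
```

In practice the stopping rule may be combined with the other conditions mentioned above (an iteration-count limit, as `max_iters` here, or an accuracy rate).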
  • FIG. 9 is a schematic diagram illustrating an exemplary structure of a prediction model according to some embodiments of the present disclosure.
  • the prediction model may be a sequence to sequence model including an encoder and a decoder.
  • the encoder may include a plurality of GRU cells.
  • the plurality of first inputs (e.g., X 1 , X 2 , ..., X n ) associated with a travel condition corresponding to a plurality of candidate time points (e.g., t 1 , t 2 , ..., t n , wherein t n may refer to a current time point) may be input into the encoder.
  • a target vector may be generated based on the plurality of first inputs.
  • the decoder may include a plurality of GRU cells and the target vector may be input into a first GRU of the decoder.
  • the plurality of second inputs (e.g., Z 1 , Z 2 , ..., Z m ) associated with the travel condition corresponding to a plurality of future time points (e.g., t n+1 , t n+2 , ..., t n+m ) may be input into the decoder.
  • an intermediate result corresponding to a previous future time point can be used as part (a weight may be assigned to the intermediate result) of an input corresponding to a next adjacent future time point.
  • a plurality of future parameters (e.g., Y 1 , Y 2 , ..., Y m ) associated with the travel condition may be predicted.
  • the sequence to sequence model may include only one GRU cell which may be shared by the plurality of inputs (i.e., the plurality of first inputs and the plurality of second inputs) .
  • the plurality of first inputs and the plurality of second inputs may be arranged in chronological order and may be input into the GRU cell in turn; accordingly, the plurality of future parameters associated with the travel condition may be predicted in order.
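As an illustration of the encoder-decoder structure of FIG. 9, the following is a minimal NumPy sketch of a GRU cell, an encoder that fuses the first inputs into a target vector, and a decoder that mixes each weighted intermediate result into the next input. All names, dimensions, and the fixed weight `alpha` are assumptions for illustration; in the disclosed model such weight parameters would be learned during training.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell with an update gate z, a reset gate r, and a
    candidate hidden state; biases are omitted for brevity."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        shape = (hidden_size, input_size + hidden_size)
        self.Wz = 0.1 * rng.standard_normal(shape)
        self.Wr = 0.1 * rng.standard_normal(shape)
        self.Wh = 0.1 * rng.standard_normal(shape)

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)                      # update gate
        r = sigmoid(self.Wr @ xh)                      # reset gate
        h_cand = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        return (1.0 - z) * h + z * h_cand

def encode(cell, first_inputs, hidden_size):
    # Fuse the first inputs X_1..X_n into a single target vector.
    h = np.zeros(hidden_size)
    for x in first_inputs:
        h = cell.step(x, h)
    return h

def decode(cell, out_w, target_vector, second_inputs, alpha=0.5):
    # Predict Y_1..Y_m; the weighted previous intermediate result is
    # appended to each second input, as in FIG. 9.
    h, prev_y, outputs = target_vector, 0.0, []
    for z_in in second_inputs:
        h = cell.step(np.concatenate([z_in, [alpha * prev_y]]), h)
        prev_y = float(out_w @ h)  # predicted future traffic parameter
        outputs.append(prev_y)
    return outputs
```

Here the encoder cell takes inputs of size d while the decoder cell takes size d + 1, the extra slot carrying the weighted previous prediction.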
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented as entirely hardware, entirely software (including firmware, resident software, micro-code, etc.) , or a combination of software and hardware implementations that may all generally be referred to herein as a "block, " "module, " "engine, " "unit, " "component, " or "system. " Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB. NET, Python or the like, conventional procedural programming languages, such as the “C” programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a software as a service (SaaS) .


Abstract

A system (100) and method for predicting traffic parameter, the system (100) may perform the method to obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points (510); obtain a first trained prediction model (520); determine a target vector based on the plurality of first inputs by using the first trained prediction model (530); obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points (540); obtain a second trained prediction model (550); and predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model (560).

Description

SYSTEMS AND METHODS FOR TRAFFIC PREDICTION
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 201910665749.0 filed on July 23, 2019, the contents of which are incorporated herein by reference in their entirety.
TECHNICAL FIELD
The present disclosure generally relates to systems and methods for traffic prediction, and in particular, to systems and methods for predicting a plurality of future parameters associated with traffic condition.
BACKGROUND
With the rapid development of traffic environments, efficient and accurate traffic information prediction becomes increasingly important to people's travels. A system providing traffic services (e.g., online to offline transportation services, navigation services, map services) may predict future traffic information based on current traffic information and/or historical traffic information according to a linear model or a tree model. However, it is difficult to fuse multi-channel data and predict traffic information corresponding to multiple future time points according to the linear model or the tree model, which may result in a relatively large deviation. Therefore, it is desirable to provide systems and methods for fusing multi-channel data and predicting future traffic information efficiently and accurately.
SUMMARY
A first aspect of the present disclosure relates to a system for predicting traffic parameter. The system may include at least one storage medium including a set of instructions and at least one processor in communication with the at least one storage medium. When executing the set of instructions, the at least one processor may be directed to cause the system to obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points; obtain a first trained prediction model; determine a target vector based on the plurality of first inputs by using the first trained prediction model; obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points; obtain a second trained prediction model; and predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
In some embodiments, the travel condition may be associated with a road section.
In some embodiments, each of the plurality of first inputs may include a first parameter associated with traffic condition at a corresponding candidate time point, a plurality of first historical parameters associated with traffic condition at a plurality of first historical time points respectively, and/or a first statistical parameter associated with the plurality of first historical parameters.
In some embodiments, the first parameter associated with traffic condition may include at least one of a traffic congestion level of the travel condition, a traffic speed of the travel condition, and/or a traffic flow of the travel condition.
In some embodiments, the first statistical parameter may include a traffic congestion statistical parameter, a traffic speed statistical parameter, and/or a traffic flow statistical parameter.
In some embodiments, the traffic congestion statistical parameter may include at least one of a mode of a plurality of historical traffic congestion levels and/or a congestion probability of the plurality of historical traffic congestion levels.
In some embodiments, the traffic speed statistical parameter may include at least one of a mean value of a plurality of historical speeds, a median of the plurality of historical speeds, a variance of the plurality of historical speeds, a maximum value of the plurality of historical speeds, and/or a minimum value of the plurality of historical speeds.
In some embodiments, the traffic flow statistical parameter may include at least one of a mean value of a plurality of historical traffic flows, a median of the plurality of historical traffic flows, a variance of the plurality of historical traffic flows, a maximum value of the plurality of historical traffic flows, and/or a minimum value of the plurality of historical traffic flows.
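For illustration, the statistical parameters described above (mode and congestion probability for historical congestion levels; mean, median, variance, maximum, and minimum for historical speeds, and analogously for historical traffic flows) might be computed as follows. The threshold at which a level counts as congested is an assumed convention, not part of the disclosure.

```python
import statistics

def traffic_speed_stats(historical_speeds):
    # Traffic speed statistical parameters over the historical speeds
    return {
        "mean": statistics.mean(historical_speeds),
        "median": statistics.median(historical_speeds),
        "variance": statistics.pvariance(historical_speeds),
        "max": max(historical_speeds),
        "min": min(historical_speeds),
    }

def traffic_congestion_stats(historical_levels, congested_from=2):
    # Mode of the historical congestion levels, plus the fraction of
    # historical time points whose level counts as congested
    # (levels >= congested_from -- an assumed convention).
    mode = statistics.mode(historical_levels)
    prob = (sum(1 for lvl in historical_levels if lvl >= congested_from)
            / len(historical_levels))
    return {"mode": mode, "congestion_probability": prob}
```

The traffic flow statistical parameter would be computed with the same five statistics as the speeds, over the historical traffic flows.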
In some embodiments, each of the plurality of second inputs may include a second parameter associated with traffic condition at a previous time point of a corresponding future time point, a plurality of second historical parameters associated with traffic condition at a plurality of second historical time points respectively, and/or a second statistical parameter associated with the plurality of second historical parameters.
In some embodiments, each of the plurality of second inputs may further include a reference parameter, the reference parameter including weather information at the corresponding future time point.
In some embodiments, the first trained prediction model may be a first part of a trained prediction model and the second trained prediction model may be a second part of the trained prediction model.
In some embodiments, the trained prediction model may be a sequence to sequence model. The first part of the trained prediction model  may be an encoder and the second part of the trained prediction model may be a decoder.
In some embodiments, the trained prediction model may be determined based on a training process. The training process may include obtaining a plurality of first sample inputs corresponding to a plurality of first sample time points respectively; obtaining a preliminary prediction model including a preliminary first part and a preliminary second part; determining a preliminary vector based on the plurality of first sample inputs by using the preliminary first part; obtaining a plurality of second sample inputs corresponding to a plurality of second sample time points respectively; predicting a plurality of sample parameters associated with traffic condition corresponding to the plurality of second sample time points respectively based on the preliminary vector and the plurality of second sample inputs by using the preliminary second part; obtaining a plurality of actual parameters associated with traffic condition corresponding to the plurality of second sample time points respectively; determining a value of a loss function of the preliminary prediction model based on the plurality of sample parameters associated with traffic condition and the plurality of actual parameters associated with traffic condition; and designating the preliminary prediction model as the trained prediction model in response to a determination that the value of the loss function is less than a loss threshold.
In some embodiments, the training process may further include updating the preliminary first part or the preliminary second part in response to a determination that the value of the loss function is larger than or equal to the loss threshold.
A second aspect of the present disclosure relates to a method for predicting traffic parameter. The method may include obtaining a plurality of first inputs associated with a travel condition corresponding to a plurality of  candidate time points; obtaining a first trained prediction model; determining a target vector based on the plurality of first inputs by using the first trained prediction model; obtaining a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points; obtaining a second trained prediction model; and predicting a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
A third aspect of the present disclosure relates to a system for predicting traffic parameter. The system may include a first obtaining module, a vector determination module, a second obtaining module, and a prediction module. The first obtaining module may be configured to obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points and obtain a first trained prediction model. The vector determination module may be configured to determine a target vector based on the plurality of first inputs by using the first trained prediction model. The second obtaining module may be configured to obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points and obtain a second trained prediction model. The prediction module may be configured to predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
A fourth aspect of the present disclosure relates to a non-transitory computer readable medium. The non-transitory computer readable medium may include executable instructions. When the executable instructions are executed by at least one processor, the executable instructions may direct the at least one processor to perform a method. The method may include obtaining a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points; obtaining a first trained prediction model; determining a target vector based on the plurality of first inputs by using the first trained prediction model; obtaining a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points; obtaining a second trained prediction model; and predicting a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
Additional features will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The features of the present disclosure may be realized and attained by practice or use of various aspects of the methodologies, instrumentalities and combinations set forth in the detailed examples discussed below.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure is further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with reference to the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:
FIG. 1 is a schematic diagram illustrating an exemplary traffic prediction system according to some embodiments of the present disclosure;
FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure;
FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure;
FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure;
FIG. 5 is a flowchart illustrating an exemplary process for predicting future parameters associated with a travel condition according to some embodiments of the present disclosure;
FIG. 6 is a schematic diagram illustrating an exemplary first input according to some embodiments of the present disclosure;
FIG. 7 is a schematic diagram illustrating an exemplary second input according to some embodiments of the present disclosure;
FIG. 8 is a flowchart illustrating an exemplary training process for determining a prediction model according to some embodiments of the present disclosure; and
FIG. 9 is a schematic diagram illustrating an exemplary structure of a prediction model according to some embodiments of the present disclosure.
DETAILED DESCRIPTION
The following description is presented to enable any person skilled in the art to make and use the present disclosure and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present  disclosure. Thus, the present disclosure is not limited to the embodiments shown but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms “a, ” “an, ” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprise, ” “comprises, ” and/or “comprising, ” “include, ” “includes, ” and/or “including, ” when used in this disclosure, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
These and other features, and characteristics of the present disclosure, as well as the methods of operations and functions of the related elements of structure and the combination of parts and economies of manufacture, may become more apparent upon consideration of the following description with reference to the accompanying drawings, all of which form part of this disclosure. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended to limit the scope of the present disclosure. It is understood that the drawings are not to scale.
The flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts may not necessarily be implemented in order. Conversely, the operations may be implemented in inverted order or simultaneously. Moreover, one or more other operations may be added to the flowcharts. One or more operations may be removed from the flowcharts.
Moreover, while the systems and methods disclosed in the present disclosure are described primarily regarding on-demand transportation services, it should also be understood that this is only one exemplary embodiment. The systems and methods of the present disclosure may be applied to any other kind of on demand service. For example, the systems and methods of the present disclosure may be applied to transportation systems of different environments including land, ocean, aerospace, or the like, or any combination thereof. The vehicle of the transportation systems may include a taxi, a private car, a hitch, a bus, a train, a bullet train, a high-speed rail, a subway, a vessel, an aircraft, a spaceship, a hot-air balloon, a driverless vehicle, or the like, or any combination thereof. The transportation system may also include any transportation system for management and/or distribution, for example, a system for sending and/or receiving an express. The application of the system or method of the present disclosure may include a web page, a plug-in of a browser, a client terminal, a custom system, an internal analysis system, an artificial intelligence robot, or the like, or any combination thereof.
The terms “passenger, ” “requestor, ” “requester, ” “service requestor, ” “service requester” and “customer” in the present disclosure are used interchangeably to refer to an individual, an entity, or a tool that may request or order a service. Also, the terms “driver, ” “provider, ” “service provider, ” and “supplier” in the present disclosure are used interchangeably to refer to an individual, an entity, or a tool that may provide a service or facilitate the providing of the service. The term “user” in the present disclosure may refer to an individual, an entity, or a tool that may request a service, order a service, provide a service, or facilitate the providing of the service. For example, the user may be a passenger, a driver, an operator, or the like, or any combination thereof. In the present disclosure, terms “passenger” and “passenger  terminal” may be used interchangeably, and terms “driver” and “driver terminal” may be used interchangeably.
The terms “service, ” “request, ” and “service request” in the present disclosure are used interchangeably to refer to a request that may be initiated by a passenger, a requester, a service requester, a customer, a driver, a provider, a service provider, a supplier, or the like, or any combination thereof. The service request may be accepted by any one of a passenger, a requester, a service requester, a customer, a driver, a provider, a service provider, or a supplier. The service request may be chargeable or free.
The positioning technology used in the present disclosure may be based on a global positioning system (GPS) , a global navigation satellite system (GLONASS) , a compass navigation system (COMPASS) , a Galileo positioning system, a quasi-zenith satellite system (QZSS) , a wireless fidelity (WiFi) positioning technology, or the like, or any combination thereof. One or more of the above positioning systems may be used interchangeably in the present disclosure.
An aspect of the present disclosure relates to systems and methods for predicting traffic parameter. The systems may obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points and determine a target vector based on the plurality of first inputs by using a first trained prediction model. The systems may also obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points. Further, the systems may predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using a second trained prediction model. According to the systems and methods of the present disclosure, a plurality of future parameters associated with a travel condition corresponding to the  plurality of future time points can be predicted and the plurality of future parameters associated with the travel condition are predicted based on a plurality of first inputs corresponding to a plurality of candidate time points and a plurality of second inputs corresponding to the plurality of future time points, which can fuse multi-channel data and improve the efficiency and accuracy of traffic prediction.
FIG. 1 is a schematic diagram illustrating an exemplary traffic prediction system according to some embodiments of the present disclosure. The traffic prediction system may predict future parameters associated with traffic condition based on current parameters associated with traffic condition and/or historical parameters associated with traffic condition. The traffic prediction system may be applied in various application scenarios, such as an on-demand transportation service scenario, a navigation service scenario, a map service scenario, etc. For illustration purposes, the present disclosure takes an on-demand transportation service scenario as an example, accordingly the traffic prediction system 100 may be an online transportation service platform for transportation services such as taxi hailing services, chauffeur services, express car services, carpool services, bus services, etc. In some embodiments, the traffic prediction system 100 may include a server 110, a network 120, a requester terminal 130, a provider terminal 140, and a storage 150.
The server 110 may be a single server or a server group. The server group may be centralized or distributed (e.g., server 110 may be a distributed system) . In some embodiments, the server 110 may be local or remote. For example, the server 110 may access information and/or data stored in the requester terminal 130, the provider terminal 140, and/or the storage 150 via the network 120. As another example, the server 110 may connect to the requester terminal 130, the provider terminal 140, and/or the storage 150 to  access stored information and/or data. In some embodiments, the server 110 may be implemented on a cloud platform. Merely by way of example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof. In some embodiments, the server 110 may be implemented on a computing device 200 having one or more components illustrated in FIG. 2 in the present disclosure.
In some embodiments, the server 110 may include a processing engine 112. The processing engine 112 may process information and/or data to perform one or more functions described in the present disclosure. For example, the processing engine 112 may obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points respectively and a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points respectively. The processing engine 112 may further predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points respectively based on the plurality of first inputs and the plurality of second inputs by using a trained prediction model. In some embodiments, the processing engine 112 may include one or more processing engines (e.g., single-core processing engine (s) or multi-core processor (s) ) . Merely by way of example, the processing engine 112 may include a central processing unit (CPU) , an application-specific integrated circuit (ASIC) , an application-specific instruction-set processor (ASIP) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a digital signal processor (DSP) , a field-programmable gate array (FPGA) , a programmable logic device (PLD) , a controller, a microcontroller unit, a reduced instruction-set computer (RISC) , a microprocessor, or the like, or any combination thereof.
The network 120 may facilitate exchange of information and/or data. In some embodiments, one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140, and the storage 150) of the traffic prediction system 100 may transmit information and/or data to other component (s) of the traffic prediction system 100 via the network 120. For example, the processing engine 112 may obtain the plurality of first inputs and the plurality of second inputs from the storage 150 via the network 120. In some embodiments, the network 120 may be any type of wired or wireless network, or combination thereof. Merely by way of example, the network 120 may include a cable network, a wireline network, an optical fiber network, a telecommunications network, an intranet, an Internet, a local area network (LAN) , a wide area network (WAN) , a wireless local area network (WLAN) , a metropolitan area network (MAN) , a public switched telephone network (PSTN) , a Bluetooth network, a ZigBee network, a near field communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network 120 may include one or more network access points. For example, the network 120 may include wired or wireless network access points such as base stations and/or internet exchange points 120-1, 120-2, …, through which one or more components of the traffic prediction system 100 may be connected to the network 120 to exchange data and/or information.
In some embodiments, a requester may be a user of the requester terminal 130. In some embodiments, the user of the requester terminal 130 may be someone other than the requester. For example, a user A of the requester terminal 130 may use the requester terminal 130 to transmit a service request for a user B, or receive service and/or information or instructions from the server 110. In some embodiments, a provider may be a user of the provider terminal 140. In some embodiments, the user of the  provider terminal 140 may be someone other than the provider. For example, a user C of the provider terminal 140 may use the provider terminal 140 to receive a service request for a user D, and/or information or instructions from the server 110. In some embodiments, “requester” and “requester terminal” may be used interchangeably, and “provider” and “provider terminal” may be used interchangeably.
In some embodiments, the requester terminal 130 may include a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, a built-in device in a motor vehicle 130-4, or the like, or any combination thereof. In some embodiments, the mobile device 130-1 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. In some embodiments, the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart helmet, a smart watch, a smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a personal digital assistance (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof. In some embodiments, the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof. For example, the virtual reality device and/or the augmented reality device may include a Google Glass TM, a RiftCon TM, a Fragments TM, a Gear VR TM, etc. In some embodiments, the built-in device in  the motor vehicle 130-4 may include an onboard computer, an onboard television, etc. In some embodiments, the requester terminal 130 may be a device with positioning technology for locating the position of the requester and/or the requester terminal 130.
In some embodiments, the provider terminal 140 may be similar to, or the same device as the requester terminal 130. In some embodiments, the provider terminal 140 may be a device with positioning technology for locating the position of the provider and/or the provider terminal 140. In some embodiments, the provider terminal 140 may periodically transmit GPS information to the server 110. In some embodiments, the requester terminal 130 and/or the provider terminal 140 may communicate with another positioning device to determine the position of the requester, the requester terminal 130, the provider, and/or the provider terminal 140. In some embodiments, the requester terminal 130 and/or the provider terminal 140 may transmit positioning information to the server 110.
The storage 150 may store data and/or instructions. In some embodiments, the storage 150 may store data obtained from the requester terminal 130 and/or the provider terminal 140. In some embodiments, the storage 150 may store data and/or instructions that the server 110 may execute or use to perform exemplary methods described in the present disclosure. In some embodiments, the storage 150 may include a mass storage, a removable storage, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. Exemplary mass storage may include a magnetic disk, an optical disk, a solid-state drive, etc. Exemplary removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc. Exemplary volatile read-and-write memory may include a random access memory (RAM) . Exemplary RAM may include a dynamic RAM (DRAM) , a double data rate synchronous dynamic RAM (DDR SDRAM) , a static RAM (SRAM) , a thyristor RAM (T-RAM) , and a zero-capacitor RAM (Z-RAM) , etc. Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc. In some embodiments, the storage 150 may be implemented on a cloud platform. Merely by way of example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the storage 150 may be connected to the network 120 to communicate with one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140) of the traffic prediction system 100. One or more components of the traffic prediction system 100 may access the data or instructions stored in the storage 150 via the network 120. In some embodiments, the storage 150 may be directly connected to or communicate with one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140) of the traffic prediction system 100. In some embodiments, the storage 150 may be part of the server 110.
In some embodiments, one or more components (e.g., the server 110, the requester terminal 130, the provider terminal 140) of the traffic prediction system 100 may access the storage 150. In some embodiments, one or more components of the traffic prediction system 100 may read and/or modify information relating to the requester, the provider, and/or the public when one or more conditions are met. For example, the server 110 may read and/or modify one or more users’ information after a service is completed. As another example, the provider terminal 140 may access information relating to the requester when receiving a service request from the requester terminal 130, but the provider terminal 140 cannot modify the relevant information of the requester.
In some embodiments, information exchanging of one or more components of the traffic prediction system 100 may be achieved by way of requesting a service. The object of the service request may be any product. In some embodiments, the product may be a tangible product or immaterial product. The tangible product may include food, medicine, commodity, chemical product, electrical appliance, clothing, car, housing, luxury, or the like, or any combination thereof. The immaterial product may include a servicing product, a financial product, a knowledge product, an internet product, or the like, or any combination thereof. The internet product may include an individual host product, a web product, a mobile internet product, a commercial host product, an embedded product, or the like, or any combination thereof. The mobile internet product may be used in a software of a mobile terminal, a program, a system, or the like, or any combination thereof. The mobile terminal may include a tablet computer, a laptop computer, a mobile phone, a personal digital assistance (PDA) , a smart watch, a point of sale (POS) device, an onboard computer, an onboard television, a wearable device, or the like, or any combination thereof. For example, the product may be any software and/or application used on the computer or mobile phone. The software and/or application may relate to socializing, shopping, transporting, entertainment, learning, investment, or the like, or any combination thereof. In some embodiments, the software and/or application relating to transporting may include a traveling software and/or application, a vehicle scheduling software and/or application, a mapping software and/or application, etc. In the vehicle scheduling software and/or application, the vehicle may include a horse, a carriage, a rickshaw (e.g., a  wheelbarrow, a bike, a tricycle) , a car (e.g., a taxi, a bus, a private car) , or the like, or any combination thereof.
One of ordinary skill in the art would understand that when an element of the traffic prediction system 100 performs a function, the element may perform the function through electrical signals and/or electromagnetic signals. For example, when a requester terminal 130 processes a task, such as making a determination, identifying or selecting an object, the requester terminal 130 may operate logic circuits in its processor to process such task. When the requester terminal 130 sends out a service request to the server 110, a processor of the requester terminal 130 may generate electrical signals encoding the service request. The processor of the requester terminal 130 may then send the electrical signals to an output port. If the requester terminal 130 communicates with the server 110 via a wired network, the output port may be physically connected to a cable, which may further transmit the electrical signals to an input port of the server 110. If the requester terminal 130 communicates with the server 110 via a wireless network, the output port of the requester terminal 130 may be one or more antennas, which may convert the electrical signals to electromagnetic signals. Similarly, a provider terminal 140 may process a task through operation of logic circuits in its processor, and receive an instruction and/or service request from the server 110 via electrical signals or electromagnetic signals. Within an electronic device, such as the requester terminal 130, the provider terminal 140, and/or the server 110, when a processor thereof processes an instruction, sends out an instruction, and/or performs an action, the instruction and/or action is conducted via electrical signals. For example, when the processor retrieves or saves data from a storage medium (e.g., the storage 150) , it may send out electrical signals to a read/write device of the storage medium, which may read or write structured data in the storage medium.
The structured data  may be transmitted to the processor in the form of electrical signals via a bus of the electronic device. Here, an electrical signal may refer to one electrical signal, a series of electrical signals, and/or a plurality of discrete electrical signals.
It should be noted that the application scenario illustrated in FIG. 1 is merely provided for the purposes of illustration, and is not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations or modifications may be made under the teachings of the present disclosure. For example, the traffic prediction system 100 may be a navigation system. The navigation system may include a user terminal (e.g., the provider terminal 140) and a server (e.g., the server 110) . When a user intends to drive a vehicle to a destination, the navigation system may provide a navigation service for the user and during the navigation service, the navigation system may periodically obtain GPS information of the vehicle from a GPS device integrated in the user terminal. The navigation system may obtain GPS information associated with a plurality of vehicles and determine traffic information based on the GPS information. Further, the navigation system may predict future traffic information based on current traffic information and/or historical traffic information according to the process and/or method described in this disclosure.
FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure. In some embodiments, the server 110, the requester terminal 130, and/or the provider terminal 140 may be implemented on the computing device 200. For example, the processing engine 112 may be implemented on the computing device 200 and configured to perform functions of the processing engine 112 disclosed in this disclosure.
The computing device 200 may be used to implement any component  of the traffic prediction system 100 as described herein. For example, the processing engine 112 may be implemented on the computing device 200, via its hardware, software program, firmware, or a combination thereof. Although only one such computer is shown, for convenience, the computer functions described herein may be implemented in a distributed fashion on a number of similar platforms, to distribute the processing load.
The computing device 200, for example, may include COM ports 250 connected to and from a network connected thereto to facilitate data communications. The computing device 200 may also include a processor (e.g., the processor 220) , in the form of one or more processors (e.g., logic circuits) , for executing program instructions. For example, the processor may include interface circuits and processing circuits therein. The interface circuits may be configured to receive electronic signals from a bus 210, wherein the electronic signals encode structured data and/or instructions for the processing circuits to process. The processing circuits may conduct logic calculations, and then determine a conclusion, a result, and/or an instruction encoded as electronic signals. Then the interface circuits may send out the electronic signals from the processing circuits via the bus 210.
The computing device 200 may also include program storage and data storage of different forms including, for example, a disk 270, a read only memory (ROM) 230, or a random access memory (RAM) 240, for storing various data files to be processed and/or transmitted by the computing device 200. The computing device 200 may also include program instructions stored in the ROM 230, RAM 240, and/or other type of non-transitory storage medium to be executed by the processor 220. The methods and/or processes of the present disclosure may be implemented as the program instructions. The computing device 200 also includes an I/O component 260, supporting input/output between the computer and other components. The  computing device 200 may also receive programming and data via network communications.
Merely for illustration, only one CPU and/or processor is illustrated in FIG. 2. Multiple CPUs and/or processors are also contemplated; thus, operations and/or method steps performed by one CPU and/or processor as described in the present disclosure may also be jointly or separately performed by the multiple CPUs and/or processors. For example, if in the present disclosure the CPU and/or processor of the computing device 200 executes both step A and step B, it should be understood that step A and step B may also be performed by two different CPUs and/or processors jointly or separately in the computing device 200 (e.g., the first processor executes step A and the second processor executes step B, or the first and second processors jointly execute steps A and B) .
FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure. In some embodiments, the requester terminal 130 and/or the provider terminal 140 may be implemented on the mobile device 300.
As illustrated in FIG. 3, the mobile device 300 may include a communication platform 310, a display 320, a graphic processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and a storage 390. In some embodiments, any other suitable component, including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300. In some embodiments, a mobile operating system 370 (e.g., iOS TM, Android TM, Windows Phone TM) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340. The applications 380 may include a browser or any other suitable mobile apps for receiving and  rendering information relating to the traffic prediction system 100. User interactions with the information stream may be achieved via the I/O 350 and provided to one or more components of the traffic prediction system 100 via the network 120.
To implement various modules, units, and their functionalities described in the present disclosure, computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein. A computer with user interface elements may be used to implement a personal computer (PC) or any other type of work station or terminal device. A computer may also act as a server if appropriately programmed.
FIG. 4 is a block diagram illustrating an exemplary processing engine according to some embodiments of the present disclosure. The processing engine 112 may include a first obtaining module 410, a vector determination module 420, a second obtaining module 430, a prediction module 440, and a training module 450.
The first obtaining module 410 may be configured to obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points respectively. As used herein, the travel condition may be associated with a road section where vehicles may pass through. In some embodiments, as illustrated in FIG. 6, each of the plurality of first inputs may include a first parameter associated with traffic condition at a corresponding candidate time point, a plurality of first historical parameters associated with traffic condition at a plurality of first historical time points respectively, a first statistical parameter associated with the plurality of first historical parameters, etc. In some embodiments, the first obtaining module 410 may also be configured to obtain a first trained prediction model. The first trained prediction model may be configured to extract feature information of the plurality of first inputs and fuse the plurality of first inputs based on the feature information.
The vector determination module 420 may be configured to determine a target vector (also referred to as a “state vector” ) based on the plurality of first inputs by using the first trained prediction model. As used herein, the target vector may be an expression indicating a fusion result of the plurality of first inputs, which includes a relationship between any two of the plurality of first inputs.
The second obtaining module 430 may be configured to obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points respectively. In some embodiments, as illustrated in FIG. 7, each of the plurality of second inputs may include a second parameter associated with traffic condition at a previous time point of a corresponding future time point (which may be a predicted future parameter corresponding to the previous time point described in operation 560; for a first future time point, the previous time point refers to the current time point) , a plurality of second historical parameters associated with traffic condition at a plurality of second historical time points respectively, a second statistical parameter associated with the plurality of second historical parameters, etc. In some embodiments, the second obtaining module 430 may also be configured to obtain a second trained prediction model, which may be configured to predict future traffic information.
The prediction module 440 may be configured to predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points respectively based on the target vector and the plurality of second inputs by using the second trained prediction model.
The training module 450 may be configured to determine the first trained prediction model and/or the second trained prediction model. More descriptions of the training process may be found elsewhere in the present disclosure (e.g., FIG. 8 and the description thereof) .
The modules in the processing engine 112 may be connected to or communicate with each other via a wired connection or a wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof. The wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth, a ZigBee, a Near Field Communication (NFC) , or the like, or any combination thereof. Any two of the modules may be combined as a single module, and any one of the modules may be divided into two or more units.
For example, the first obtaining module 410 and the second obtaining module 430 may be combined as a single module, which may be configured to obtain the plurality of first inputs, the first trained prediction model, the plurality of second inputs, and the second trained prediction model. As another example, the training module 450 may be unnecessary and the first trained prediction model and/or the second trained prediction model may be obtained from a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure. As a further example, the processing engine 112 may include a storage module (not shown) which may be configured to store the plurality of first inputs, the first trained prediction model, the plurality of second inputs, the second trained prediction model, the target vector, the plurality of future parameters, etc.
FIG. 5 is a flowchart illustrating an exemplary process for predicting future parameters associated with traffic condition according to some embodiments of the present disclosure. The process 500 may be executed by the traffic prediction system 100. For example, the process 500 may be implemented as a set of instructions stored in the ROM 230 or RAM 240. The processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 500. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 500 are illustrated in FIG. 5 and described below is not intended to be limiting.
In 510, the processing engine 112 (e.g., the first obtaining module 410) (e.g., the interface circuits of the processor 220) may obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points respectively. As used herein, the travel condition may be associated with a road section where vehicles may pass through.
In some embodiments, the plurality of candidate time points include a current time point. In some embodiments, a number count of the plurality of candidate time points and/or a time interval between any adjacent two of the plurality of candidate time points may be default settings of the traffic prediction system 100 or may be adjustable under different situations. In some embodiments, the plurality of candidate time points may be arranged in chronological order. For example, assuming that the current time point is "10:00 a.m.," the number count of the plurality of candidate time points is 10, and the time interval between any adjacent two of the plurality of candidate time points is 1 minute, the plurality of candidate time points may be expressed as the set below:
T1 = {9:51, 9:52, 9:53, 9:54, 9:55, 9:56, 9:57, 9:58, 9:59, 10:00}   (1)
Accordingly, the plurality of first inputs corresponding to the plurality of candidate time points respectively may be expressed as below:
F1 = {ft | t ∈ {1, 2, …, n}},   (2)
where F1 refers to the set including the plurality of first inputs, ft refers to the t-th first input corresponding to the t-th candidate time point, and n refers to the number count of the plurality of candidate time points (i.e., the number count of the plurality of first inputs) .
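As a minimal illustration, the candidate time points of Eq. (1) can be generated programmatically. The sketch below is plain Python; the helper name and the example date are assumptions for illustration, not part of the disclosure:

```python
from datetime import datetime, timedelta

def candidate_time_points(current, n=10, interval_minutes=1):
    # Return n candidate time points in chronological order,
    # ending at the current time point (cf. Eq. (1)).
    return [current - timedelta(minutes=interval_minutes * (n - 1 - i))
            for i in range(n)]

points = candidate_time_points(datetime(2019, 7, 1, 10, 0))
print(points[0].strftime("%H:%M"), points[-1].strftime("%H:%M"))  # → 09:51 10:00
```

Both the number count n and the time interval are parameters, consistent with the note above that both may be adjustable under different situations.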
In some embodiments, as illustrated in FIG. 6, each of the plurality of first inputs may include a first parameter associated with traffic condition at a corresponding candidate time point, a plurality of first historical parameters associated with traffic condition at a plurality of first historical time points respectively, a first statistical parameter associated with the plurality of first historical parameters, etc.
In some embodiments, each of the plurality of first historical time points may correspond to the candidate time point. For example, assuming that a specific candidate time point is "10:00 a.m. on a working day," the plurality of first historical time points may be a plurality of corresponding historical time points (i.e., 10:00 a.m. on a working day) within a predetermined time period (e.g., last week, last month, last three months) .
In some embodiments, the first parameter associated with traffic condition may include a traffic congestion level of the travel condition, a traffic speed of the travel condition, a traffic flow (which can be represented by a number count of vehicles) of the travel condition, or the like, or any combination thereof. As used herein, the traffic congestion level of the travel condition may refer to a plurality of traffic congestion levels of a plurality of locations respectively within the road section, an average traffic congestion level of the plurality of traffic congestion levels, a sum of the plurality of traffic congestion levels, etc. The traffic speed of the travel condition may refer to a plurality of traffic speeds of a plurality of locations respectively within the road section, an average traffic speed of the plurality of traffic speeds, a sum of the plurality of traffic speeds, etc. The traffic flow of the travel condition may refer to a plurality of traffic flows of a plurality of locations respectively within  the road section, an average traffic flow of the plurality of traffic flows, a sum of the plurality of traffic flows, etc.
In some embodiments, the traffic congestion level of the travel condition may be expressed as a plurality of levels based on the traffic flow of the travel condition, for example, "heavy congestion," "normal congestion," "mild congestion," and "smooth traffic" as illustrated in Table 1 below.
Table 1 Exemplary congestion levels

Congestion Level      Traffic Flow    Level Value
heavy congestion      F < a           4
normal congestion     a ≤ F < b       3
mild congestion       b ≤ F < c       2
smooth traffic        F ≥ c           1
As shown in Table 1, each of the parameters "a," "b," and "c" refers to a traffic flow threshold, and F refers to a traffic flow of a specific location point within the road section. The traffic flow thresholds may be default settings of the traffic prediction system 100 or may be adjustable under different situations (e.g., the traffic flow thresholds may be different for different cities) .
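The threshold mapping of Table 1 can be sketched as a simple piecewise function. The numeric defaults below are hypothetical placeholders; as noted above, the actual thresholds may differ between cities:

```python
def congestion_level(flow, a=10.0, b=30.0, c=60.0):
    # Map a traffic flow F at a location to a (level, level value) pair
    # per Table 1; a, b, and c are the traffic flow thresholds (a < b < c).
    if flow < a:
        return "heavy congestion", 4
    elif flow < b:
        return "normal congestion", 3
    elif flow < c:
        return "mild congestion", 2
    return "smooth traffic", 1

print(congestion_level(45.0))  # → ('mild congestion', 2)
```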
In some embodiments, as described above, each of the plurality of first historical parameters associated with traffic condition may include a historical traffic congestion level of the travel condition, a historical traffic speed of the travel condition, a historical traffic flow of the travel condition, or the like, or any combination thereof. In some embodiments, for the plurality of first historical parameters associated with traffic condition, the processing engine 112 may determine a first comprehensive historical parameter based on the plurality of first historical parameters. For example, the processing engine 112 may determine a sum or a weighted sum of the plurality of first historical parameters as the first comprehensive historical parameter, wherein the closer a first historical time point is to the current time point, the larger the  weight of a first historical parameter corresponding to the first historical time point may be.
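One possible realization of the weighted sum is sketched below. The disclosure only requires that first historical parameters closer to the current time point receive larger weights; the linear, normalized weighting and the example values are assumed choices:

```python
def comprehensive_historical_parameter(history):
    # `history` holds first historical parameters ordered oldest to newest;
    # linear weights grow toward the current time point and sum to 1.
    n = len(history)
    total = n * (n + 1) / 2
    weights = [(i + 1) / total for i in range(n)]
    return sum(w * p for w, p in zip(weights, history))

speeds = [40.0, 38.0, 35.0, 30.0]  # e.g., historical traffic speeds, oldest first
print(round(comprehensive_historical_parameter(speeds), 2))  # → 34.1
```

With four points, the weights are 0.1, 0.2, 0.3, and 0.4, so the most recent historical parameter contributes most to the first comprehensive historical parameter.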
In some embodiments, the first statistical parameter may include a traffic congestion statistical parameter, a traffic speed statistical parameter, a traffic flow statistical parameter, or the like, or any combination thereof. The traffic congestion statistical parameter may include a mode of a plurality of historical traffic congestion levels, a congestion probability of the plurality of historical traffic congestion levels, etc. The traffic speed statistical parameter may include a mean value of a plurality of historical speeds, a median of the plurality of historical speeds, a variance of the plurality of historical speeds, a maximum value of the plurality of historical speeds, a minimum value of the plurality of historical speeds, etc. The traffic flow statistical parameter may include a mean value of a plurality of historical traffic flows, a median of the plurality of historical traffic flows, a variance of the plurality of historical traffic flows, a maximum value of the plurality of historical traffic flows, a minimum value of the plurality of historical traffic flows, etc.
As used herein, the congestion probability of the plurality of historical traffic congestion levels refers to a rate of a specific historical traffic congestion level in the plurality of historical traffic congestion levels. For example, it is assumed that the plurality of historical traffic congestion levels are illustrated in Table 2 below.
Table 2 exemplary historical traffic congestion levels
Historical Time Point Congestion Level Level Value
1 mild congestion 2
2 mild congestion 2
3 mild congestion 2
4 smooth traffic 1
5 mild congestion 2
6 mild congestion 2
7 smooth traffic 1
8 mild congestion 2
9 mild congestion 2
10 smooth traffic 1
As shown in Table 2, a rate of "mild congestion" is 0.7 and a rate of "smooth traffic" is 0.3. Accordingly, the congestion probability of "mild congestion" is 0.7 and the congestion probability of "smooth traffic" is 0.3.
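Using the Table 2 data, the traffic congestion statistical parameters (the mode and the congestion probability of each level) could be computed as in this sketch; the function name is illustrative, not from the disclosure.

```python
from collections import Counter

def congestion_statistics(levels):
    """Return the mode and the congestion probability of each level."""
    counts = Counter(levels)
    mode = counts.most_common(1)[0][0]
    probabilities = {level: count / len(levels) for level, count in counts.items()}
    return mode, probabilities

# Historical congestion levels from Table 2, in time-point order.
levels = [
    "mild congestion", "mild congestion", "mild congestion", "smooth traffic",
    "mild congestion", "mild congestion", "smooth traffic", "mild congestion",
    "mild congestion", "smooth traffic",
]
mode, probabilities = congestion_statistics(levels)
# mode == "mild congestion"
# probabilities == {"mild congestion": 0.7, "smooth traffic": 0.3}
```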
In 520, the processing engine 112 (e.g., the first obtaining module 410) (e.g., the processing circuits of the processor 220) may obtain a first trained prediction model. The first trained prediction model may be configured to extract feature information of the plurality of first inputs and fuse the plurality of first inputs based on the feature information. The processing engine 112 may obtain the first trained prediction model from a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure. 
In some embodiments, the first trained prediction model may be a Recurrent Neural Network (RNN) model, a Long Short-Term Memory (LSTM) model, a Gated Recurrent Unit (GRU) model, etc. In some embodiments, the first trained prediction model may be a first part of a trained prediction model (e.g., a sequence to sequence model) . The sequence to sequence model may include one or more RNN cells, one or more LSTM cells, one or more GRU cells, etc. More descriptions regarding the trained prediction model may be found elsewhere in the present disclosure (e.g., FIG. 9 and the description thereof) .
In 530, the processing engine 112 (e.g., the vector determination module 420) (e.g., the processing circuits of the processor 220) may determine a target vector based on the plurality of first inputs by using the first  trained prediction model. As used herein, the target vector may be an expression indicating a fusion result of the plurality of first inputs, which includes a relationship between any two of the plurality of first inputs.
In some embodiments, the plurality of first inputs may be input into the first trained prediction model in chronological order, and an intermediate result corresponding to a previous candidate time point can be used as part (a weight may be assigned to the intermediate result) of an input corresponding to a next adjacent candidate time point, thereby extracting a context dependence among the plurality of first inputs. More descriptions may be found elsewhere in the present disclosure (e.g., FIG. 9 and the description thereof) .
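A minimal numeric sketch of this chronological fusion follows. It is not the disclosed GRU network: the tanh update, the fixed weight w, and the scalar hidden state standing in for the target vector are all assumptions.

```python
import math

def encode(first_inputs, w=0.5):
    """Fuse the first inputs in chronological order into a target vector.

    The intermediate result (hidden) of a previous candidate time point is
    weighted by w and combined with the input of the next adjacent
    candidate time point, so earlier inputs influence later steps.
    """
    hidden = 0.0
    for x in first_inputs:
        hidden = math.tanh(w * hidden + (1.0 - w) * x)
    return hidden
```

Because the hidden state is carried forward, reordering the inputs changes the result, which is how the context dependence among the inputs is captured.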
In 540, the processing engine 112 (e.g., the second obtaining module 430) (e.g., the interface circuits of the processor 220) may obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points respectively.
In some embodiments, similar to the plurality of candidate time points, a number count of the plurality of future time points and/or a time interval between any adjacent two of the plurality of future time points may be default settings of the traffic prediction system 100 or may be adjustable under different situations. In some embodiments, the plurality of future time points may be arranged in chronological order. For example, assuming that a first future time point is "10:01 a.m., " the number count of the plurality of future time points is 10, and the time interval between any adjacent two of the plurality of future time points is 1 minute, the plurality of future time points may be expressed as the set below:
T 2 = {10:01, 10:02, 10:03, 10:04, 10:05, 10:06, 10:07, 10:08, 10:09, 10:10}    (3)
Accordingly, the plurality of second inputs corresponding to the plurality of future time points respectively may be expressed as below:
F 2 = {f t+i | t ∈ {1, 2, …, n} , i ∈ {1, 2, …, m} }    (4)
where F 2 refers to a set including the plurality of second inputs, f t+i refers to an ith second input (which corresponds to a (t+i) th time point) , and m refers to the number count of the plurality of future time points (i.e., a number count of the plurality of second inputs) .
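The set T 2 of equation (3) could be generated as in this sketch; the function name and the "%H:%M" string format are illustrative assumptions.

```python
from datetime import datetime, timedelta

def future_time_points(current, m, interval_minutes=1):
    """Return m future time points at a fixed interval after the current one."""
    start = datetime.strptime(current, "%H:%M")
    return [
        (start + timedelta(minutes=interval_minutes * (i + 1))).strftime("%H:%M")
        for i in range(m)
    ]

# Current time point 10:00, m = 10, interval 1 minute, as in equation (3).
t2 = future_time_points("10:00", m=10)
# t2 == ["10:01", "10:02", ..., "10:10"]
```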
In some embodiments, as illustrated in FIG. 7, each of the plurality of second inputs may include a second parameter associated with traffic condition at a previous time point of a corresponding future time point (which may be a predicted future parameter corresponding to the previous time point described in operation 560; for a first future time point, the previous time point refers to the current time point) , a plurality of second historical parameters associated with traffic condition at a plurality of second historical time points respectively, a second statistical parameter associated with the plurality of second historical traffic parameters, etc. In some embodiments, similar to the plurality of first historical time points, each of the plurality of second historical time points may correspond to the future time point. For example, it is assumed that a specific future time point is "10:02 a.m. on a working day, " the plurality of second historical time points may be a plurality of corresponding historical time points (i.e., 10:02 a.m. on a working day) within a predetermined time period (e.g., last week, last month, last three months) .
In some embodiments, similar to the first parameter associated with traffic condition, the second parameter associated with traffic condition may include a traffic congestion level of the travel condition, a traffic speed of the travel condition, a traffic flow of the travel condition, or the like, or any combination thereof.
In some embodiments, as described above, each of the plurality of second historical parameters associated with traffic condition may include a historical traffic congestion level of the travel condition, a historical traffic  speed of the travel condition, a historical traffic flow of the travel condition, or the like, or any combination thereof. In some embodiments, also similar to the plurality of first historical parameters associated with traffic condition, for the plurality of second historical parameters associated with traffic condition, the processing engine 112 may determine a second comprehensive historical parameter based on the plurality of second historical parameters. For example, the processing engine 112 may determine a sum or a weighted sum of the plurality of second historical parameters as the second comprehensive historical parameter, wherein the closer a second historical time point is to the current time point, the larger the weight of a second historical parameter corresponding to the second historical time point may be.
In some embodiments, also similar to the first statistical parameter, the second statistical parameter may include a traffic congestion statistical parameter, a traffic speed statistical parameter, a traffic flow statistical parameter, or the like, or any combination thereof.
In some embodiments, each of the plurality of second inputs may further include a reference parameter (e.g., weather forecast information) at the corresponding future time point.
In 550, the processing engine 112 (e.g., the second obtaining module 430) (e.g., the interface circuits of the processor 220) may obtain a second trained prediction model, which may be configured to predict future traffic information. The processing engine 112 may obtain the second trained prediction model from a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure.
In some embodiments, the second trained prediction model may be a Recurrent Neural Network (RNN) model, a Long Short-Term Memory (LSTM) model, a Gated Recurrent Unit (GRU) model, etc. In some embodiments, the second trained prediction model may be a second part of a trained prediction model (e.g., a sequence to sequence model) . The sequence to sequence model may include one or more RNN cells, one or more LSTM cells, one or more GRU cells, etc. More descriptions regarding the trained prediction model may be found elsewhere in the present disclosure (e.g., FIG. 9 and the description thereof) .
In 560, the processing engine 112 (e.g., the prediction module 440) (e.g., the processing circuits of the processor 220) may predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points respectively based on the target vector and the plurality of second inputs by using the second trained prediction model.
In some embodiments, the plurality of second inputs may be input into the second trained prediction model in chronological order, and an intermediate result corresponding to a previous future time point can be used as part (a weight may be assigned to the intermediate result) of an input corresponding to a next adjacent future time point, thereby extracting a context dependence among the plurality of second inputs. More descriptions may be found elsewhere in the present disclosure (e.g., FIG. 9 and the description thereof) .
As described in connection with operation 540, the plurality of future parameters associated with the travel condition corresponding to the plurality of future time points respectively may be expressed as below:
P = {p t+i | t ∈ {1, 2, …, n} , i ∈ {1, 2, …, m} }    (5)
where P refers to a set including the plurality of future parameters and p t+i refers to an ith future parameter (which corresponds to a (t+i) th time point) .
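An illustrative decoder loop is sketched below. It is an assumption rather than the disclosed network: the target vector seeds the hidden state, and each predicted future parameter is fed back as the parameter at the previous time point of the next second input, mirroring the feedback described for operation 560.

```python
import math

def decode(target_vector, second_inputs, w=0.5):
    """Predict a future parameter for each future time point in order.

    The hidden state starts from the target vector; the prediction for a
    previous future time point becomes part of the next adjacent input.
    The tanh cell, weight w, and scalar states are toy assumptions.
    """
    hidden = target_vector
    prev_prediction = 0.0  # for the first future time point: the current value
    predictions = []
    for z in second_inputs:
        hidden = math.tanh(w * hidden + (1.0 - w) * (z + prev_prediction))
        prev_prediction = hidden  # toy readout: hidden state as the prediction
        predictions.append(prev_prediction)
    return predictions
```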
In some embodiments, after predicting the plurality of future parameters associated with the travel condition, the processing engine 112 may transmit information (e.g., a travel tip) indicative of the plurality of future parameters to the requester terminal 130 and/or the provider terminal 140. In some embodiments, the processing engine 112 may pre-determine a scheduling strategy based on the predicted future parameters. In some embodiments, the processing engine 112 may use the plurality of future parameters as a reference when estimating service information (e.g., a recommended route, an estimated time of arrival (ETA) ) associated with service requests.
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. For example, one or more other optional operations (e.g., a storing operation) may be added elsewhere in the process 500. In the storing operation, the processing engine 112 may store information and/or data (e.g., the plurality of first inputs, the target vector, the plurality of second inputs, the plurality of predicted future parameters) associated with the traffic prediction system 100 in a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure.
FIG. 8 is a flowchart illustrating an exemplary training process for determining a prediction model according to some embodiments of the present disclosure. The process 800 may be executed by the traffic prediction system 100. For example, the process 800 may be implemented as a set of instructions stored in the storage ROM 230 or RAM 240. The processor 220 and/or the modules in FIG. 4 may execute the set of instructions, and when executing the instructions, the processor 220 and/or the modules may be configured to perform the process 800. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 800 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 800 illustrated in FIG. 8 and described below is not intended to be limiting.
In 810, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may obtain a plurality of first sample inputs corresponding to a plurality of first sample time points respectively. The processing engine 112 may obtain the plurality of first sample inputs from the storage 150 via the network 120. As described in connection with operation 510, each of the plurality of first sample inputs may include a first sample parameter associated with traffic condition at a corresponding sample time point, a plurality of first sample historical parameters associated with traffic condition at a plurality of first sample historical time points respectively, a first sample statistical parameter associated with the plurality of first historical parameters, etc.
In 820, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may obtain a preliminary prediction model including a preliminary first part and a preliminary second part. In some embodiments, the preliminary first part and/or the preliminary second part may include one or more RNN cells, one or more LSTM cells, one or more GRU cells, etc. In some embodiments, the preliminary prediction model may include one or more preliminary model parameters, for example, a number count of cells in the preliminary first part, a number count of cells in the preliminary second part, a size of each cell, a number count of layers in each cell, a number count of gates in each cell, a weight parameter (which may be used to assign a weight to an intermediate result corresponding to the first of the two adjacent cells) between any two adjacent cells (also referred to as a "weight parameter between any two adjacent time points" ) , etc.
In 830, the processing engine 112 (e.g., the training module 450)  (e.g., the processing circuits of the processor 220) may determine a preliminary vector based on the plurality of first sample inputs by using the preliminary first part. As described in connection with operation 530, the preliminary vector may be an expression indicating a fusion result of the plurality of first sample inputs, which includes a relationship between any two of the plurality of first sample inputs.
In 840, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may obtain a plurality of second sample inputs corresponding to a plurality of second sample time points respectively. The processing engine 112 may obtain the plurality of second sample inputs from the storage 150 via the network 120. As described in connection with operation 540, the plurality of second sample time points refer to future time points with respect to the plurality of first sample time points, and each of the plurality of second sample inputs may include a second sample parameter associated with traffic condition at a previous time point of a corresponding second sample time point, a plurality of second sample historical parameters associated with traffic condition at a plurality of second sample historical time points respectively, a second sample statistical parameter associated with the plurality of second sample historical traffic parameters, etc.
In 850, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may predict a plurality of sample parameters associated with traffic condition corresponding to the plurality of second sample time points respectively based on the preliminary vector and the plurality of second sample inputs by using the preliminary second part.
In 860, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may obtain a plurality of  actual parameters associated with traffic condition corresponding to the plurality of second sample time points respectively. The processing engine 112 may obtain the plurality of actual parameters associated with traffic condition from the storage 150 via the network 120.
In 870, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may determine a value of a loss function of the preliminary prediction model based on the plurality of sample parameters and the plurality of actual parameters. In some embodiments, the loss function may be a root mean square error (RMSE) .
In 880, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may determine whether the value of the loss function of the preliminary prediction model is less than a loss threshold. In response to a determination that the value of the loss function of the preliminary prediction model is less than the loss threshold, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may designate the preliminary prediction model as a trained prediction model in 890, which means that the training process is completed and the trained prediction model may be stored in a storage device (e.g., the storage 150) disclosed elsewhere in the present disclosure.
In response to a determination that the value of the loss function of the preliminary prediction model is larger than or equal to the loss threshold, the processing engine 112 (e.g., the training module 450) (e.g., the processing circuits of the processor 220) may execute process 800 to return to operation 820 to update the preliminary prediction model (e.g., update the preliminary first part and/or the preliminary second part) . In some embodiments, the processing engine 112 may update the one or more preliminary model parameters.
Further, the processing engine 112 may determine whether the value  of the loss function of an updated prediction model is less than the loss threshold. In response to a determination that the value of the loss function of an updated prediction model is less than the loss threshold, the processing engine 112 may designate the updated prediction model as a trained prediction model. On the other hand, in response to a determination that the value of the loss function of an updated prediction model is larger than or equal to the loss threshold, the processing engine 112 may still execute the process 800 to return to operation 820 to update the updated prediction model until the value of the loss function of an updated prediction model is less than the loss threshold.
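The loss computation and iteration criterion of operations 870 through 890 might be sketched as follows. The function names and the callback-style update are illustrative assumptions; a real implementation would update the preliminary model parameters by gradient descent rather than an arbitrary callback.

```python
import math

def rmse(predicted, actual):
    """Root mean square error used as the loss function (operation 870)."""
    return math.sqrt(
        sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)
    )

def train(predict, update, actual, loss_threshold, max_iterations=1000):
    """Repeat operations 820-880 until the loss drops below the threshold."""
    loss = rmse(predict(), actual)
    for _ in range(max_iterations):
        if loss < loss_threshold:
            return loss  # designate the model as trained (operation 890)
        update()         # update the preliminary first part and/or second part
        loss = rmse(predict(), actual)
    return loss
```

For instance, with a predictor whose loss is |w − 1| and an update that increments w by 0.1, the loop stops once the RMSE falls below the threshold, matching the repeated return to operation 820 described above.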
For an ordinary person in the art, it is obvious that operations of a training process are substantially similar to operations of a practice process; therefore, some details are omitted in process 800, which can be found elsewhere in the present disclosure (e.g., process 500 and the description thereof) .
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. For example, the training module 450 may update the trained prediction model at a certain time interval (e.g., per month, per two months) based on a plurality of newly obtained samples. As another example, other than the value of the loss function, the processing engine 112 may also use other conditions (e.g., a number count of iterations, an accuracy rate) to determine whether the training process is completed.
FIG. 9 is a schematic diagram illustrating an exemplary structure of a prediction model according to some embodiments of the present disclosure.  For illustration purposes, the prediction model may be a sequence to sequence model including an encoder and a decoder.
As illustrated, the encoder may include a plurality of GRU cells. The plurality of first inputs (e.g., X 1, X 2, …, X n) associated with a travel condition corresponding to a plurality of candidate time points (e.g., t 1, t 2, …, t n, wherein t n may refer to a current time point) may be input into the plurality of GRU cells respectively. It can be seen that an intermediate result corresponding to a previous candidate time point can be used as part (a weight may be assigned to the intermediate result) of an input corresponding to a next adjacent candidate time point. Further, a target vector may be generated based on the plurality of first inputs.
Further, the decoder may include a plurality of GRU cells and the target vector may be input into a first GRU cell of the decoder. In addition, the plurality of second inputs (e.g., Z 1, Z 2, …, Z m) associated with the travel condition corresponding to a plurality of future time points (e.g., t n+1, t n+2, …, t n+m) may be input into the plurality of GRU cells respectively. It can be seen that an intermediate result corresponding to a previous future time point can be used as part (a weight may be assigned to the intermediate result) of an input corresponding to a next adjacent future time point. Further, a plurality of future parameters (e.g., Y 1, Y 2, …, Y m) associated with the travel condition may be predicted.
In some embodiments, the sequence to sequence model may include only one GRU cell which may be shared by the plurality of inputs (i.e., the plurality of first inputs and the plurality of second inputs) . In this situation, the plurality of first inputs and the plurality of second inputs may be arranged in chronological order and may be input into the GRU cell in turn; accordingly, the plurality of future parameters associated with the travel condition may be predicted in order.
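The single-shared-cell variant could be sketched as one recurrent cell consuming the first inputs and then the second inputs in chronological order. The tanh cell, weight w, and function name are illustrative assumptions rather than the disclosed GRU cell.

```python
import math

def shared_cell_predict(first_inputs, second_inputs, w=0.5):
    """Run all inputs through one shared cell in chronological order."""
    hidden = 0.0
    for x in first_inputs:          # past candidate time points: fusion only
        hidden = math.tanh(w * hidden + (1.0 - w) * x)
    predictions = []
    for z in second_inputs:         # future time points: read out a prediction
        hidden = math.tanh(w * hidden + (1.0 - w) * z)
        predictions.append(hidden)
    return predictions
```

One prediction is read out per future time point, in order, as the description above states.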
Having thus described the basic concepts, it may be rather apparent to those skilled in the art after reading this detailed disclosure that the foregoing detailed disclosure is intended to be presented by way of example only and is not limiting. Various alterations, improvements, and modifications may occur and are intended to those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested by this disclosure, and are within the spirit and scope of the exemplary embodiments of this disclosure.
Moreover, certain terminology has been used to describe embodiments of the present disclosure. For example, the terms “one embodiment, ” “an embodiment, ” and/or “some embodiments” mean that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be appreciated that two or more references to “an embodiment, ” “one embodiment, ” or “an alternative embodiment” in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined as suitable in one or more embodiments of the present disclosure.
Further, it will be appreciated by one skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or context including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc. ) or in a combination of software and hardware implementation that may all generally be referred to herein as a “block, ” “module, ” “engine, ” “unit, ” “component, ” or “system. ” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB. NET, Python or the like, conventional procedural programming languages, such as the “C” programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide  area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a software as a service (SaaS) .
Furthermore, the recited order of processing elements or sequences, or the use of numbers, letters, or other designations, therefore, is not intended to limit the claimed processes and methods to any order except as may be specified in the claims. Although the above disclosure discusses through various examples what is currently considered to be a variety of useful embodiments of the disclosure, it is to be understood that such detail is solely for that purpose, and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover modifications and equivalent arrangements that are within the spirit and scope of the disclosed embodiments. For example, although the implementation of various components described above may be embodied in a hardware device, it may also be implemented as a software-only solution, e.g., an installation on an existing server or mobile device.
Similarly, it should be appreciated that in the foregoing description of embodiments of the present disclosure, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure aiding in the understanding of one or more of the various embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, claimed subject matter may lie in less than all features of a single foregoing disclosed embodiment.

Claims (43)

  1. A system for predicting traffic parameter, comprising:
    at least one storage medium including a set of instructions; and
    at least one processor in communication with the at least one storage medium, wherein when executing the set of instructions, the at least one processor is directed to cause the system to:
    obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points;
    obtain a first trained prediction model;
    determine a target vector based on the plurality of first inputs by using the first trained prediction model;
    obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points;
    obtain a second trained prediction model; and
    predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
  2. The system of claim 1, wherein the travel condition is associated with a road section.
  3. The system of claim 1 or claim 2, wherein each of the plurality of first inputs includes a first parameter associated with traffic condition at a corresponding candidate time point, a plurality of first historical parameters associated with traffic condition at a plurality of first historical time points respectively, and a first statistical parameter associated with the plurality of first historical parameters.
  4. The system of claim 3, wherein the first parameter associated with traffic condition includes at least one of a traffic congestion level of the travel condition, a traffic speed of the travel condition, or a traffic flow of the travel condition.
  5. The system of claim 3 or claim 4, wherein the first statistical parameter includes a traffic congestion statistical parameter, a traffic speed statistical parameter, or a traffic flow statistical parameter.
  6. The system of claim 5, wherein the first statistical parameter includes the traffic congestion statistical parameter and the traffic congestion statistical parameter includes at least one of a mode of a plurality of historical traffic congestion levels or a congestion probability of the plurality of historical traffic congestion levels.
  7. The system of claim 5 or claim 6, wherein the first statistical parameter includes the traffic speed statistical parameter and the traffic speed statistical parameter includes at least one of a mean value of a plurality of historical speeds, a median of the plurality of historical speeds, a variance of the plurality of historical speeds, a maximum value of the plurality of historical speeds, or a minimum value of the plurality of historical speeds.
  8. The system of any of claims 5-7, wherein the first statistical parameter includes the traffic flow statistical parameter and the traffic flow statistical parameter includes at least one of a mean value of a plurality of historical traffic flows, a median of the plurality of historical traffic flows, a variance of the plurality of historical traffic flows, a maximum value of the  plurality of historical traffic flows, or a minimum value of the plurality of historical traffic flows.
  9. The system of any of claims 1-8, wherein each of the plurality of second inputs includes a second parameter associated with traffic condition at a previous time point of a corresponding future time point, a plurality of second historical parameters associated with traffic condition at a plurality of second historical time points respectively, and a second statistical parameter associated with the plurality of second historical traffic parameters.
  10. The system of claim 9, wherein each of the plurality of second inputs further includes a reference parameter, the reference parameter including weather information at the corresponding future time point.
  11. The system of any of claims 1-10, wherein the first trained prediction model is a first part of a trained prediction model and the second trained prediction model is a second part of the trained prediction model.
  12. The system of claim 11, wherein
the trained prediction model is a sequence-to-sequence model,
    the first part of the trained prediction model is an encoder, and
    the second part of the trained prediction model is a decoder.
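Claim 12 (and its counterparts, claims 26 and 40) casts the two trained prediction models as the encoder and decoder of one sequence-to-sequence model. The toy sketch below uses a scalar state and an assumed smoothing weight `alpha` in place of a learned neural network, purely to illustrate the data flow: the encoder folds the first inputs into a target vector, and the decoder combines that vector with the second inputs to emit one prediction per future time point.

```python
def encode(first_inputs, state=0.0, alpha=0.5):
    # Encoder (first trained prediction model): fold the first inputs,
    # one candidate time point at a time, into a single target "vector"
    # (a scalar here for simplicity).
    for x in first_inputs:
        state = alpha * state + (1 - alpha) * x
    return state

def decode(target_vector, second_inputs, alpha=0.5):
    # Decoder (second trained prediction model): roll the target vector
    # forward through the second inputs, emitting one predicted traffic
    # parameter per future time point.
    predictions, state = [], target_vector
    for x in second_inputs:
        state = alpha * state + (1 - alpha) * x
        predictions.append(state)
    return predictions
```

With `alpha = 0.5`, `encode([4, 8])` produces a target vector of 5.0, and `decode(5.0, [6, 6])` produces the predictions `[5.5, 5.75]`.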
  13. The system of claim 11 or claim 12, wherein the trained prediction model is determined based on a training process, the training process including:
    obtaining a plurality of first sample inputs corresponding to a plurality of first sample time points respectively;
    obtaining a preliminary prediction model including a preliminary first part and a preliminary second part;
    determining a preliminary vector based on the plurality of first sample inputs by using the preliminary first part;
    obtaining a plurality of second sample inputs corresponding to a plurality of second sample time points respectively;
    predicting a plurality of sample parameters associated with traffic condition corresponding to the plurality of second sample time points respectively based on the preliminary vector and the plurality of second sample inputs by using the preliminary second part;
    obtaining a plurality of actual parameters associated with traffic condition corresponding to the plurality of second sample time points respectively;
    determining a value of a loss function of the preliminary prediction model based on the plurality of sample parameters associated with traffic condition and the plurality of actual parameters associated with traffic condition; and
    designating the preliminary prediction model as the trained prediction model in response to a determination that the value of the loss function is less than a loss threshold.
  14. The system of claim 13, the training process further including:
    updating the preliminary first part or the preliminary second part in response to a determination that the value of the loss function is larger than or equal to the loss threshold.
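The training process of claims 13-14 is a loop: run the preliminary first and second parts, compare the sample parameters against the actual parameters through a loss function, stop when the loss falls below the threshold, and otherwise update the preliminary first part or the preliminary second part. The toy model below, with one weight per part, a squared-error loss, a gradient update of the second part only, and an assumed learning rate and epoch cap, is one possible concretization; the claims fix none of these choices.

```python
def train_prediction_model(first_sample_inputs, second_sample_inputs,
                           actual_parameters, loss_threshold=1e-4,
                           lr=0.01, max_epochs=5000):
    w_enc, w_dec = 0.1, 0.1  # preliminary first part / preliminary second part
    for _ in range(max_epochs):
        # Preliminary vector from the first sample inputs (first part).
        vector = w_enc * sum(first_sample_inputs)
        # Sample parameters for each second sample time point (second part).
        preds = [w_dec * (vector + x) for x in second_sample_inputs]
        errors = [p - a for p, a in zip(preds, actual_parameters)]
        loss = sum(e * e for e in errors) / len(errors)
        if loss < loss_threshold:
            break  # designate the preliminary model as the trained model
        # Otherwise update a preliminary part; claim 14 requires updating
        # the first part *or* the second, and only the second is updated here.
        grad = sum(2 * e * (vector + x)
                   for e, x in zip(errors, second_sample_inputs)) / len(errors)
        w_dec -= lr * grad
    return w_enc, w_dec, loss
```

Because the loss is quadratic in the decoder weight, this sketch converges geometrically whenever the actual parameters are realizable by some decoder weight.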
  15. A method implemented on a computing device having at least one processor, at least one storage medium, and a communication platform connected to a network, the method comprising:
    obtaining a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points;
    obtaining a first trained prediction model;
    determining a target vector based on the plurality of first inputs by using the first trained prediction model;
    obtaining a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points;
    obtaining a second trained prediction model; and
    predicting a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
  16. The method of claim 15, wherein the travel condition is associated with a road section.
  17. The method of claim 15 or claim 16, wherein each of the plurality of first inputs includes a first parameter associated with traffic condition at a corresponding candidate time point, a plurality of first historical parameters associated with traffic condition at a plurality of first historical time points respectively, and a first statistical parameter associated with the plurality of first historical parameters.
18. The method of claim 17, wherein the first parameter associated with traffic condition includes at least one of a traffic congestion level of the travel condition, a traffic speed of the travel condition, or a traffic flow of the travel condition.
  19. The method of claim 17 or claim 18, wherein the first statistical parameter includes a traffic congestion statistical parameter, a traffic speed statistical parameter, or a traffic flow statistical parameter.
  20. The method of claim 19, wherein the first statistical parameter includes the traffic congestion statistical parameter and the traffic congestion statistical parameter includes at least one of a mode of a plurality of historical traffic congestion levels or a congestion probability of the plurality of historical traffic congestion levels.
  21. The method of claim 19 or claim 20, wherein the first statistical parameter includes the traffic speed statistical parameter and the traffic speed statistical parameter includes at least one of a mean value of a plurality of historical speeds, a median of the plurality of historical speeds, a variance of the plurality of historical speeds, a maximum value of the plurality of historical speeds, or a minimum value of the plurality of historical speeds.
  22. The method of any of claims 19-21, wherein the first statistical parameter includes the traffic flow statistical parameter and the traffic flow statistical parameter includes at least one of a mean value of a plurality of historical traffic flows, a median of the plurality of historical traffic flows, a variance of the plurality of historical traffic flows, a maximum value of the plurality of historical traffic flows, or a minimum value of the plurality of historical traffic flows.
23. The method of any of claims 15-22, wherein each of the plurality of second inputs includes a second parameter associated with traffic condition at a previous time point of a corresponding future time point, a plurality of second historical parameters associated with traffic condition at a plurality of second historical time points respectively, and a second statistical parameter associated with the plurality of second historical parameters.
  24. The method of claim 23, wherein each of the plurality of second inputs further includes a reference parameter, the reference parameter including weather information at the corresponding future time point.
  25. The method of any of claims 15-24, wherein the first trained prediction model is a first part of a trained prediction model and the second trained prediction model is a second part of the trained prediction model.
  26. The method of claim 25, wherein
the trained prediction model is a sequence-to-sequence model,
    the first part of the trained prediction model is an encoder, and
    the second part of the trained prediction model is a decoder.
  27. The method of claim 25 or claim 26, wherein the trained prediction model is determined based on a training process, the training process including:
    obtaining a plurality of first sample inputs corresponding to a plurality of first sample time points respectively;
    obtaining a preliminary prediction model including a preliminary first part and a preliminary second part;
    determining a preliminary vector based on the plurality of first sample inputs by using the preliminary first part;
    obtaining a plurality of second sample inputs corresponding to a plurality of second sample time points respectively;
    predicting a plurality of sample parameters associated with traffic condition corresponding to the plurality of second sample time points respectively based on the preliminary vector and the plurality of second sample inputs by using the preliminary second part;
    obtaining a plurality of actual parameters associated with traffic condition corresponding to the plurality of second sample time points respectively;
    determining a value of a loss function of the preliminary prediction model based on the plurality of sample parameters associated with traffic condition and the plurality of actual parameters associated with traffic condition; and
    designating the preliminary prediction model as the trained prediction model in response to a determination that the value of the loss function is less than a loss threshold.
  28. The method of claim 27, the training process further including:
    updating the preliminary first part or the preliminary second part in response to a determination that the value of the loss function is larger than or equal to the loss threshold.
29. A system for predicting traffic parameters, comprising:
    a first obtaining module configured to:
    obtain a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points; and
obtain a first trained prediction model;
    a vector determination module configured to determine a target vector based on the plurality of first inputs by using the first trained prediction model;
    a second obtaining module configured to:
    obtain a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points; and
    obtain a second trained prediction model; and
    a prediction module configured to predict a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
  30. The system of claim 29, wherein the travel condition is associated with a road section.
  31. The system of claim 29 or claim 30, wherein each of the plurality of first inputs includes a first parameter associated with traffic condition at a corresponding candidate time point, a plurality of first historical parameters associated with traffic condition at a plurality of first historical time points respectively, and a first statistical parameter associated with the plurality of first historical parameters.
  32. The system of claim 31, wherein the first parameter associated with traffic condition includes at least one of a traffic congestion level of the travel condition, a traffic speed of the travel condition, or a traffic flow of the travel condition.
  33. The system of claim 31 or claim 32, wherein the first statistical parameter includes a traffic congestion statistical parameter, a traffic speed statistical parameter, or a traffic flow statistical parameter.
  34. The system of claim 33, wherein the first statistical parameter includes the traffic congestion statistical parameter and the traffic congestion statistical parameter includes at least one of a mode of a plurality of historical traffic congestion levels or a congestion probability of the plurality of historical traffic congestion levels.
  35. The system of claim 33 or claim 34, wherein the first statistical parameter includes the traffic speed statistical parameter and the traffic speed statistical parameter includes at least one of a mean value of a plurality of historical speeds, a median of the plurality of historical speeds, a variance of the plurality of historical speeds, a maximum value of the plurality of historical speeds, or a minimum value of the plurality of historical speeds.
  36. The system of any of claims 33-35, wherein the first statistical parameter includes the traffic flow statistical parameter and the traffic flow statistical parameter includes at least one of a mean value of a plurality of historical traffic flows, a median of the plurality of historical traffic flows, a variance of the plurality of historical traffic flows, a maximum value of the plurality of historical traffic flows, or a minimum value of the plurality of historical traffic flows.
37. The system of any of claims 29-36, wherein each of the plurality of second inputs includes a second parameter associated with traffic condition at a previous time point of a corresponding future time point, a plurality of second historical parameters associated with traffic condition at a plurality of second historical time points respectively, and a second statistical parameter associated with the plurality of second historical parameters.
  38. The system of claim 37, wherein each of the plurality of second inputs further includes a reference parameter, the reference parameter including weather information at the corresponding future time point.
  39. The system of any of claims 29-38, wherein the first trained prediction model is a first part of a trained prediction model and the second trained prediction model is a second part of the trained prediction model.
  40. The system of claim 39, wherein
the trained prediction model is a sequence-to-sequence model,
    the first part of the trained prediction model is an encoder, and
    the second part of the trained prediction model is a decoder.
  41. The system of claim 39 or claim 40, further comprising a training module configured to perform a training process to determine the trained prediction model, the training process including:
    obtaining a plurality of first sample inputs corresponding to a plurality of first sample time points respectively;
    obtaining a preliminary prediction model including a preliminary first part and a preliminary second part;
    determining a preliminary vector based on the plurality of first sample inputs by using the preliminary first part;
    obtaining a plurality of second sample inputs corresponding to a plurality of second sample time points respectively;
    predicting a plurality of sample parameters associated with traffic condition corresponding to the plurality of second sample time points respectively based on the preliminary vector and the plurality of second sample inputs by using the preliminary second part;
    obtaining a plurality of actual parameters associated with traffic condition corresponding to the plurality of second sample time points respectively;
    determining a value of a loss function of the preliminary prediction model based on the plurality of sample parameters associated with traffic condition and the plurality of actual parameters associated with traffic condition; and
    designating the preliminary prediction model as the trained prediction model in response to a determination that the value of the loss function is less than a loss threshold.
  42. The system of claim 41, the training process further including:
    updating the preliminary first part or the preliminary second part in response to a determination that the value of the loss function is larger than or equal to the loss threshold.
  43. A non-transitory computer readable medium, comprising executable instructions, wherein when executed by at least one processor, the executable instructions direct the at least one processor to perform a method, the method comprising:
    obtaining a plurality of first inputs associated with a travel condition corresponding to a plurality of candidate time points;
    obtaining a first trained prediction model;
    determining a target vector based on the plurality of first inputs by using the first trained prediction model;
    obtaining a plurality of second inputs associated with the travel condition corresponding to a plurality of future time points;
    obtaining a second trained prediction model; and
    predicting a plurality of future parameters associated with the travel condition corresponding to the plurality of future time points based on the target vector and the plurality of second inputs by using the second trained prediction model.
PCT/CN2019/101786 2019-07-23 2019-08-21 Systems and methods for traffic prediction WO2021012342A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910665749.0 2019-07-23
CN201910665749.0A CN111862585B (en) 2019-07-23 2019-07-23 System and method for traffic prediction

Publications (1)

Publication Number Publication Date
WO2021012342A1 true WO2021012342A1 (en) 2021-01-28

Family

ID=72970558

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/101786 WO2021012342A1 (en) 2019-07-23 2019-08-21 Systems and methods for traffic prediction

Country Status (2)

Country Link
CN (1) CN111862585B (en)
WO (1) WO2021012342A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112183899A (en) * 2020-11-04 2021-01-05 北京嘀嘀无限科技发展有限公司 Method, device, equipment and storage medium for determining safety degree prediction model
CN113343956B (en) * 2021-08-06 2021-11-19 腾讯科技(深圳)有限公司 Road condition information prediction method and device, storage medium and electronic equipment
CN118095797B (en) * 2024-04-24 2024-07-23 北京嘀嘀无限科技发展有限公司 Method, apparatus, medium, and article for demand prediction

Citations (10)

JP2004164386A (en) * 2002-11-14 2004-06-10 Nippon Telegr & Teleph Corp <Ntt> Method, apparatus, and program for predicting traffic condition, and recording medium with the program recorded thereon
CN104809877A (en) * 2015-05-14 2015-07-29 重庆大学 Expressway site traffic state estimation method based on feature parameter weighted GEFCM algorithm
EP3239905A1 (en) * 2016-04-29 2017-11-01 Fujitsu Limited Methods and apparatus for use in predicting non-stationary time-series data
WO2018129850A1 (en) * 2017-01-10 2018-07-19 Beijing Didi Infinity Technology And Development Co., Ltd. Method and system for estimating time of arrival
CN109102017A (en) * 2018-08-09 2018-12-28 百度在线网络技术(北京)有限公司 Neural network model processing method, device, equipment and readable storage medium storing program for executing
CN109214584A (en) * 2018-09-21 2019-01-15 北京百度网讯科技有限公司 Method and apparatus for passenger flow forecast amount
CN109300310A (en) * 2018-11-26 2019-02-01 平安科技(深圳)有限公司 A kind of vehicle flowrate prediction technique and device
CN109492597A (en) * 2018-11-19 2019-03-19 深圳市元征科技股份有限公司 The method for building up and device of driving behavior model based on SVM algorithm
CN109677341A (en) * 2018-12-21 2019-04-26 深圳市元征科技股份有限公司 A kind of information of vehicles blending decision method and device
CN109887272A (en) * 2018-12-26 2019-06-14 阿里巴巴集团控股有限公司 A kind of prediction technique and device of traffic flow of the people

Family Cites Families (7)

JP5139208B2 (en) * 2008-09-01 2013-02-06 Kddi株式会社 Disaster traffic prediction method and system, and disaster traffic control system
CN105160866A (en) * 2015-08-07 2015-12-16 浙江高速信息工程技术有限公司 Traffic flow prediction method based on deep learning nerve network structure
CN106447119A (en) * 2016-10-11 2017-02-22 济南观澜数据技术有限公司 Short-term traffic flow prediction method and system based on convolutional neural network
WO2018227368A1 (en) * 2017-06-13 2018-12-20 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for recommending an estimated time of arrival
CN108648450A (en) * 2018-05-14 2018-10-12 重庆市通信建设有限公司 Adaptive public traffic in priority road managing and control system
CN109159785B (en) * 2018-07-19 2020-05-01 重庆科技学院 Automobile driving condition prediction method based on Markov chain and neural network
CN109190795B (en) * 2018-08-01 2022-02-18 中山大学 Inter-area travel demand prediction method and device

Cited By (14)

CN112918478B (en) * 2021-02-25 2022-04-01 中南大学 Method and device for predicting lane change of vehicle and computer storage medium
CN112918478A (en) * 2021-02-25 2021-06-08 中南大学 Method and device for predicting lane change of vehicle and computer storage medium
CN112990595A (en) * 2021-03-30 2021-06-18 北京嘀嘀无限科技发展有限公司 Travel time prediction method, travel time prediction device, storage medium and electronic equipment
CN115311842A (en) * 2021-05-07 2022-11-08 杭州海康威视数字技术股份有限公司 Traffic flow prediction model training and traffic flow prediction method, device and electronic equipment
CN113159453A (en) * 2021-05-17 2021-07-23 北京字跳网络技术有限公司 Resource data prediction method, device, equipment and storage medium
CN113159453B (en) * 2021-05-17 2024-04-30 北京字跳网络技术有限公司 Resource data prediction method, device, equipment and storage medium
CN113660176A (en) * 2021-08-16 2021-11-16 中国电信股份有限公司 Traffic prediction method and device for communication network, electronic device and storage medium
CN113947182A (en) * 2021-09-24 2022-01-18 西安理工大学 Traffic flow prediction model construction method based on double-stage stack graph convolution network
CN113902137A (en) * 2021-12-06 2022-01-07 腾讯科技(深圳)有限公司 Streaming model training method and device, computer equipment and storage medium
CN116959249A (en) * 2023-07-25 2023-10-27 深圳原世界科技有限公司 City information management platform and method based on CIM
CN116720634A (en) * 2023-08-11 2023-09-08 北京泰豪智能工程有限公司 Park operation data processing method and system
CN116720634B (en) * 2023-08-11 2023-10-13 北京泰豪智能工程有限公司 Park operation data processing method and system
CN118211737A (en) * 2024-05-21 2024-06-18 华芯智上半导体设备(上海)有限公司 Crown block system track congestion prediction method and device, electronic equipment and storage medium
CN118433141A (en) * 2024-07-05 2024-08-02 浙江浙交检测技术有限公司 Abnormality detection method, system and equipment for network switch

Also Published As

Publication number Publication date
CN111862585B (en) 2021-11-02
CN111862585A (en) 2020-10-30

Similar Documents

Publication Publication Date Title
WO2021012342A1 (en) Systems and methods for traffic prediction
US11003677B2 (en) Systems and methods for location recommendation
AU2018282300B2 (en) Systems and methods for allocating service requests
US11398002B2 (en) Systems and methods for determining an estimated time of arrival
US11079244B2 (en) Methods and systems for estimating time of arrival
AU2020259040A1 (en) Systems and methods for determining estimated time of arrival
AU2019246881B2 (en) Systems and methods for determining a path of a moving device
US10876847B2 (en) Systems and methods for route planning
US20200300650A1 (en) Systems and methods for determining an estimated time of arrival for online to offline services
US20200193357A1 (en) Systems and methods for allocating service requests
US20200141741A1 (en) Systems and methods for determining recommended information of a service request
WO2019195996A1 (en) Systems and methods for vehicle scheduling
US20200167812A1 (en) Systems and methods for determining a fee of a service request
US20200286008A1 (en) Systems and methods for distributing on-demand service requests
US11303713B2 (en) Systems and methods for on-demand services
WO2021051221A1 (en) Systems and methods for evaluating driving path
WO2021022487A1 (en) Systems and methods for determining an estimated time of arrival
WO2022126354A1 (en) Systems and methods for obtaining estimated time of arrival in online to offline services
WO2020243963A1 (en) Systems and methods for determining recommended information of service request

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19938186

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19938186

Country of ref document: EP

Kind code of ref document: A1