Vehicle Trajectory Prediction and Collision Warning via Fusion of Multisensors and Wireless Vehicular Communications
Figure 1. Positive and negative characteristics of perception using vehicle-to-everything (V2X) communications and on-board automotive sensors for remote sensing.
Figure 2. Block diagram summarizing the steps for collision warning generation.
Figure 3. Illustration of finding a possible collision event using the predicted trajectories of the host vehicle and the remote vehicle.
Figure 4. Collision event detection (circles filled in red) and time-to-collision (TTC) estimation using the predicted future trajectories of the host vehicle (HV) and the remote vehicle (RV). (a) Preliminary risk assessment step for collision detection; (b) detailed risk assessment step for TTC estimation.
Figure 5. Locations of the sensors installed on the host vehicle and the sensor coverage. (a) Radar; (b) lidar; (c) camera; (d) GNSS antenna; (e) sensor range and FOV.
Figure 6. Simulink blocks and subsystems designed for the proposed vehicle collision warning system.
Figure 7. Simulation environment for the vehicle–vehicle collision scenario. (a) Experiment setup at the start of the simulation; (b) collision between the host and the remote vehicle at the end of the simulation.
Figure 8. Simulation environment for the vehicle–pedestrian collision scenario. (a) Experiment setup at the start of the simulation; (b) collision between the host vehicle and the pedestrian at the end of the simulation.
Figure 9. Vehicle–vehicle collision simulation results and snapshots of the experimental environment at different time points. Shown in the center of the forward-looking view image is the visual collision warning generated for the host vehicle. The bird's-eye-view image of the road scene shows the locations of the vehicles at the corresponding time instance. The sensor fusion image shows the filtered measurements from various sensors as well as the fusion result. Trajectory prediction and risk assessment enable detection of the potential collision location, which is represented by a red circle. (a) Results for t = 1 s; (b) results for t = 2 s; (c) results for t = 3 s; (d) results for the time point just before the collision.
Figure 10. Collision warning generated over time in the vehicle–vehicle collision scenario.
Figure 11. Vehicle–pedestrian collision simulation results and snapshots of the experimental environment at different time points. Shown in the center of the forward-looking view image is the visual collision warning generated for the host vehicle. The bird's-eye-view image of the road scene shows the locations of the host vehicle and the pedestrian at the corresponding time instance. The sensor fusion image shows the filtered measurements from various sensors as well as the fusion result. Trajectory prediction and risk assessment enable detection of the potential collision location, which is represented by a red circle. (a) Results for t = 0.7 s; (b) results for t = 1.4 s; (c) results for t = 2.1 s; (d) results for the time point just before the collision.
Figure 12. Collision warning generated over time in the vehicle–pedestrian collision scenario.
Abstract
1. Introduction
2. Related Work
3. System Overview
3.1. Architecture of the Proposed System
3.2. Automotive Sensors for Remote Sensing
3.3. V2X Communications
4. Implementation
4.1. Kalman Filtering
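The body of this section is not reproduced here, so the following is only an illustrative sketch of the Kalman-filter predict/update cycle used for object tracking: a minimal 1-D constant-velocity filter with a position-only measurement. It is not the authors' implementation, and the `dt`, `q`, and `r` values are assumed for the example.

```python
def mat2_mul(A, B):
    """Multiply two 2x2 matrices stored as nested lists."""
    return [[A[0][0] * B[0][0] + A[0][1] * B[1][0],
             A[0][0] * B[0][1] + A[0][1] * B[1][1]],
            [A[1][0] * B[0][0] + A[1][1] * B[1][0],
             A[1][0] * B[0][1] + A[1][1] * B[1][1]]]

def kf_step(x, P, z, dt=0.05, q=0.01, r=0.25):
    """One predict/update cycle with state [position, velocity] and a
    position-only measurement z (H = [1, 0])."""
    F = [[1.0, dt], [0.0, 1.0]]            # constant-velocity transition
    FT = [[1.0, 0.0], [dt, 1.0]]           # its transpose
    # Predict: x_pred = F x,  P_pred = F P F^T + Q
    xp = [x[0] + dt * x[1], x[1]]
    Pp = mat2_mul(mat2_mul(F, P), FT)
    Pp[0][0] += q
    Pp[1][1] += q                          # additive process noise
    # Update with the scalar position measurement
    y = z - xp[0]                          # innovation
    S = Pp[0][0] + r                       # innovation covariance
    K = [Pp[0][0] / S, Pp[1][0] / S]       # Kalman gain
    xn = [xp[0] + K[0] * y, xp[1] + K[1] * y]
    # P = (I - K H) P_pred
    Pn = [[(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
          [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]]]
    return xn, Pn
```

Fed consistent measurements of a target moving at constant speed, the position and velocity estimates converge toward the true state; the same predict/update pattern generalizes to the 2-D tracking states used for the fused sensor measurements.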
4.2. Trajectory Prediction and Risk Assessment
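As a hedged sketch of the two-step risk assessment illustrated in Figure 4, the fragment below rolls the constant turn rate and velocity (CTRV) model forward for both road users and reports the first predicted time at which their trajectories come closer than a collision radius. The prediction horizon, time step, and radius are assumed values chosen for illustration, not the paper's parameters.

```python
import math

def predict_ctrv(x, y, v, yaw, yaw_rate, horizon=3.0, dt=0.1):
    """Predict future positions with the CTRV motion model."""
    pts = []
    for _ in range(int(horizon / dt)):
        if abs(yaw_rate) > 1e-6:
            x += (v / yaw_rate) * (math.sin(yaw + yaw_rate * dt) - math.sin(yaw))
            y += (v / yaw_rate) * (math.cos(yaw) - math.cos(yaw + yaw_rate * dt))
        else:  # straight-line limit of the model
            x += v * dt * math.cos(yaw)
            y += v * dt * math.sin(yaw)
        yaw += yaw_rate * dt
        pts.append((x, y))
    return pts

def time_to_collision(hv_pts, rv_pts, radius=2.0, dt=0.1):
    """First predicted time at which the HV and RV trajectories come
    within `radius` metres of each other, or None if no collision."""
    for k, ((hx, hy), (rx, ry)) in enumerate(zip(hv_pts, rv_pts)):
        if math.hypot(hx - rx, hy - ry) < radius:
            return (k + 1) * dt
    return None
```

For example, a host vehicle at the origin heading east at 10 m/s and a remote vehicle 30 m ahead heading west at 10 m/s close at 20 m/s, so a collision is flagged with a TTC of 1.5 s, while parallel trajectories yield no collision event.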
5. Experiments
5.1. Experimental Environment
5.1.1. Vehicle Configuration
5.1.2. Vehicle–Vehicle Collision Scenario
5.1.3. Vehicle–Pedestrian Collision Scenario
- Pedestrian crossing the road while the vehicle is going straight.
- Pedestrian crossing the road while the vehicle is turning right.
- Pedestrian crossing the road while the vehicle is turning left.
- Pedestrian traveling along/against traffic while the vehicle is going straight.
5.2. Performance Evaluation and Analysis
5.2.1. Vehicle–Vehicle Collision Scenario
5.2.2. Vehicle–Pedestrian Collision Scenario
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
ADAS | Advanced Driver Assistance Systems |
AEB | Automatic Emergency Braking |
BSM | Basic Safety Message |
CTRV | Constant Turn Rate and Velocity |
DSRC | Dedicated Short-Range Communications |
FOV | Field of View |
FCW | Forward Collision Warning |
GNSS | Global Navigation Satellite System |
HMI | Human-Machine Interface |
NCAP | New Car Assessment Program |
NLOS | Non-Line of Sight |
OBU | On-Board Unit |
PSM | Personal Safety Message |
SCP | Straight Crossing Paths |
TTC | Time-to-Collision |
V2X | Vehicle-to-Everything |
V2I | Vehicle-to-Infrastructure |
V2P | Vehicle-to-Pedestrian |
V2V | Vehicle-to-Vehicle |
VRU | Vulnerable Road User |
WAVE | Wireless Access in Vehicular Environments |
Type | Delphi ESR | Bosch LRR3 | Continental ARS 30X |
---|---|---|---|
Frequency band | 76.5 GHz | 76–77 GHz | 76–77 GHz |
Range | 174 m | 250 m | 200 m |
Range accuracy | 0.5 m | 0.1 m | 0.25 m |
Angular accuracy | 0.5 deg | n/a | 0.1 deg |
Horizontal FOV | 20 deg | 30 deg | 17 deg |
Data update | 50 ms | 80 ms | 66 ms |
Type | Ibeo Scala B3.0 |
---|---|
Laser wavelength | 905 nm |
Range | 80 m |
Range accuracy | 0.1 m |
Horizontal resolution | 0.25 deg |
Horizontal FOV | 145 deg |
Data update | 40 ms |
Type | Mobileye Camera |
---|---|
Frame size | 640 × 480 pixels |
Range | 70 m (detection), 100 m (tracking) |
Accuracy | 5% error at 45 m, 10% error at 90 m |
Horizontal FOV | 47 deg |
Type | WAVE Standards |
---|---|
Frequency | 5.850–5.925 GHz |
Channel | 1 CCH, 6 SCH |
Bandwidth | 10 MHz |
Data rate | 3–27 Mbps |
Maximum range | 1000 m |
Message | Content |
---|---|
BSM Part I | Message count |
 | Temporary ID |
 | Time |
 | Position (latitude, longitude, elevation) |
 | Position accuracy |
 | Transmission state |
 | Speed |
 | Heading |
 | Steering wheel angle |
 | Acceleration |
 | Yaw rate |
 | Brake system status |
 | Vehicle size (width, length) |
BSM Part II | Event flags |
 | Path history |
 | Path prediction |
 | RTCM package |
Message | Content |
---|---|
PSM | Personal device user type |
 | Time |
 | Message count |
 | Temporary ID |
 | Position (latitude, longitude, elevation) |
 | Position accuracy |
 | Speed |
 | Heading |
Message | Type | Accuracy |
---|---|---|
BSM | Position | 0.5 m |
 | Heading | 0.3 deg |
 | Speed | 0.3 m/s |
 | Yaw rate | 0.5 deg/s |
PSM | Position | 1.5 m |
 | Heading | 5 deg |
 | Speed | 0.56 m/s |
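The accuracy values in the table above can be used to simulate noisy V2X measurements in the Simulink-style experiments. The sketch below treats each accuracy figure as a 1-sigma standard deviation of zero-mean Gaussian noise; that interpretation, and the helper names, are assumptions made for this example.

```python
import random

# 1-sigma accuracies taken from the BSM/PSM accuracy table
# (position in m, heading in deg, speed in m/s, yaw rate in deg/s).
BSM_SIGMA = {"position": 0.5, "heading": 0.3, "speed": 0.3, "yaw_rate": 0.5}
PSM_SIGMA = {"position": 1.5, "heading": 5.0, "speed": 0.56}

def noisy_measurement(truth, sigmas, rng=random):
    """Corrupt each ground-truth field with zero-mean Gaussian noise."""
    return {k: truth[k] + rng.gauss(0.0, sigmas[k]) for k in sigmas}
```

A simulated remote vehicle's true state can then be perturbed with `noisy_measurement(truth, BSM_SIGMA)` before being fed to the tracking filter, mirroring how real BSM and PSM data would arrive with the stated accuracies.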
Condition | Stage | Warning Type | Color |
---|---|---|---|
No collision detected | No threat (Level 0) | Visual | Gray |
 | Threat detected (Level 1) | Visual | Green |
 | Inform driver (Level 2) | Visual and audible | Yellow |
 | Warn driver (Level 3) | Visual and audible | Red |
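The four-stage warning scheme above can be driven by the estimated TTC. The fragment below is an illustrative mapping only: the TTC thresholds (`inform_ttc`, `warn_ttc`) are assumed values, not the thresholds used in the paper.

```python
def warning_stage(ttc, inform_ttc=4.0, warn_ttc=2.0):
    """Map an estimated TTC (seconds, or None for no predicted collision)
    to the (level, stage, warning type, color) of the HMI warning table."""
    if ttc is None:
        return (0, "No threat", "visual", "gray")
    if ttc <= warn_ttc:
        return (3, "Warn driver", "visual and audible", "red")
    if ttc <= inform_ttc:
        return (2, "Inform driver", "visual and audible", "yellow")
    return (1, "Threat detected", "visual", "green")
```

In this sketch the warning escalates monotonically as the predicted TTC shrinks, which matches the ordering of the four levels in the table.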
Data Range | Mean (s) | SD (s) |
---|---|---|
 | 0.08 | 0.05 |
 | 0.05 | 0.05 |
 | 0.03 | 0.02 |
 | 0.004 | 0.01 |
Data Range | Mean (s) | SD (s) |
---|---|---|
 | 0.01 | 0.04 |
 | 0.007 | 0.03 |
 | 0.001 | 0.01 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Baek, M.; Jeong, D.; Choi, D.; Lee, S. Vehicle Trajectory Prediction and Collision Warning via Fusion of Multisensors and Wireless Vehicular Communications. Sensors 2020, 20, 288. https://doi.org/10.3390/s20010288