Autonomous Vehicle State Estimation and Mapping Using Takagi–Sugeno Modeling Approach
Figure 1. Architecture of the software modules, including the information flow and a conceptual overview of the interconnections. The software framework also includes the low-level sensor measurement acquisition units.

Figure 2. Representation of the bicycle model in 2D space used to derive the equations of motion. The vehicle schematic shows the inertial frame (O) and the center of gravity (CoG) frame (C) attached to the vehicle's center of gravity. The forces $F_{rx}$ and $F_{ry}$ on the rear wheel are the longitudinal and lateral forces, respectively. The front wheel forces $F_{f,long}$ and $F_{f,lat}$ are the longitudinal and lateral forces, respectively; the forces $F_{fx}$ and $F_{fy}$ are their resultants expressed in frame C. The lengths $l_f$ and $l_r$ are the distances from the CoG to the front and rear wheels, respectively.

Figure 3. Calculation of the laser end-point probability at sub-grid level using the occupancy grid probability map.

Figure 4. Translation of the scan information $[r, \alpha]^T$ to scan end-points $[s_{iX}, s_{iY}]^T$ in the LIDAR frame of reference.

Figure 5. The sector defined to capture the kinematic system matrices used to obtain the offline gains; the same sector approach is later used to obtain the online interpolated gains.

Figure 6. Estimation comparison between wrapped and unwrapped yaw measurements. Panels (a,b) show the vehicle position and the yaw measurement in the global frame, respectively, with both states estimated using a yaw measurement wrapped to $[-\pi, \pi]$. Panels (c,d) show the same quantities estimated using a continuous yaw measurement. (a) X–Y global position with yaw measurement in $[-\pi, \pi]$. (b) Yaw sensor measurement. (c) X–Y global position with yaw measurement in $(-\infty, \infty)$. (d) Yaw transformed to a continuous measurement.

Figure 7. Vehicle trajectory estimated in the presence of sensor disturbances during the simulation. Following the controller policy, the vehicle attempts two laps along the center line of the track.

Figure 8. Estimated states on the vehicle simulator for two laps.

Figure 9. Vehicle trajectory estimated in the presence of sensor disturbances during the real experiment. Following the controller policy, the vehicle attempts two laps along the center line of the track.

Figure 10. Snapshots of the vehicle following the center line of the oval track. The vehicle motion is shown as a sequence of images, clockwise from the upper left to the bottom left.

Figure 11. Estimated states on the real vehicle for two laps.

Figure 12. Comparison of the ground truth and the final map. (a) The ground truth image shows the environment where the testing was performed. (b) The final map after completing two laps.
Abstract
1. Introduction
- A TS Kalman filter is developed that does not require linearization at each step, as the EKF does. In addition, the estimator provides a stable and optimal solution.
- The design of the TS Kalman filter is solved offline using LMIs; online (in real time), only the interpolation of the TS subsystem gains is required.
- A case study of a small-scale autonomous vehicle is presented in which the full state is obtained by correcting the raw sensor measurements and by LIDAR scan end-point matching.
- The whole framework, including NMPC, is validated on a small computing board (Odroid XU4). The low computational requirements of the framework, especially of the SLAM module, make it easy to deploy on small robots.
2. Proposed Approach
1. Derive a TS model from the nonlinear model, embedding the nonlinear terms inside the system matrices.
2. Obtain the polytopic systems and derive the fuzzy representation of the TS model [4].
3. Formulate LMIs for Lyapunov stability and the optimal dual LQR for all the obtained polytopes (system vertices) of the fuzzy model to obtain the offline Kalman gains.
4. Exploit the fuzzy interpolation technique to find the online gain of the TS Kalman filter.
5. Apply the online TS Kalman gain for state estimation (a minimal sketch of steps 4–5 is given after this list).
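The last two steps admit a compact implementation. The following minimal Python sketch illustrates one common way to realize sector-based membership functions (cf. the sector in Figure 5) and the convex gain interpolation for a single scheduling variable with two vertices; the function names, sector bounds, and vertex gain values are illustrative placeholders, not the paper's actual parameters.

```python
import numpy as np

def membership_weights(z, z_min, z_max):
    """Sector-based membership functions for one scheduling variable.
    Returns (mu0, mu1) with mu0 + mu1 = 1 and mu_i >= 0, so the online
    gain is a convex combination of the two vertex gains."""
    z = np.clip(z, z_min, z_max)          # stay inside the modeled sector
    mu1 = (z - z_min) / (z_max - z_min)   # weight of the upper vertex
    return 1.0 - mu1, mu1

def interpolate_gain(z, z_min, z_max, L0, L1):
    """Online TS Kalman gain as a convex combination of the two
    offline vertex gains (steps 4-5 above)."""
    mu0, mu1 = membership_weights(z, z_min, z_max)
    return mu0 * L0 + mu1 * L1

# Hypothetical 2x1 vertex gains for a sector on the longitudinal velocity.
L0 = np.array([[0.8], [0.1]])
L1 = np.array([[0.5], [0.3]])
L_online = interpolate_gain(1.2, z_min=0.1, z_max=3.0, L0=L0, L1=L1)
print(L_online)  # gain used in the correction step at v_x = 1.2 m/s
```

Because the memberships are convex weights, the interpolated gain inherits the stability guarantees that the offline LMI design establishes for the vertex gains.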
2.1. Considered Autonomous Vehicle
2.1.1. Construction of TS Model
2.1.2. Measurement Model
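As Figure 6 illustrates, estimating position from a yaw measurement wrapped to $[-\pi, \pi]$ degrades the estimate, so the measurement is transformed into a continuous signal. Below is a minimal Python sketch of one standard online unwrapping step; the function names are illustrative, and the assumption that the yaw changes by less than half a turn between samples is ours, not stated in the paper.

```python
import numpy as np

def wrap_to_pi(a):
    """Map any angle to the interval [-pi, pi)."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def unwrap_step(theta_cont_prev, theta_meas):
    """One online unwrapping step: add the shortest signed increment
    between the new wrapped measurement and the previous continuous
    estimate. Assumes |true increment| < pi between samples."""
    delta = wrap_to_pi(theta_meas - wrap_to_pi(theta_cont_prev))
    return theta_cont_prev + delta

# Example: yaw crossing +pi during a left turn.
theta = 3.0
for meas in [3.1, -3.1, -2.9]:        # wrapped IMU readings
    theta = unwrap_step(theta, meas)  # 3.1, ~3.183, ~3.383
print(theta)                          # continuous yaw past pi
```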
2.1.3. Observability Analysis
3. SLAM Algorithm
3.1. Mapping
3.2. Localization
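Figures 3 and 4 describe the two ingredients of the scan-matching localization: converting polar scan returns $[r, \alpha]^T$ to Cartesian end-points $[s_{iX}, s_{iY}]^T$, and evaluating the occupancy probability at sub-grid resolution. The Python sketch below shows one common realization, bilinear interpolation over the four neighboring cells as in grid-based scan matchers such as [6]; the exact interpolation scheme and map layout used in the paper may differ.

```python
import numpy as np

def scan_to_endpoints(ranges, angles):
    """Convert polar scan returns [r, alpha]^T to Cartesian end-points
    [s_iX, s_iY]^T in the LIDAR frame (cf. Figure 4)."""
    return np.stack([ranges * np.cos(angles), ranges * np.sin(angles)], axis=1)

def endpoint_probability(grid, resolution, origin, point):
    """Occupancy probability at a sub-grid position (cf. Figure 3):
    bilinear interpolation of the four neighboring cell probabilities,
    weighted by the point's fractional offset inside the cell."""
    gx, gy = (point - origin) / resolution        # continuous grid coords
    ix, iy = int(np.floor(gx)), int(np.floor(gy))
    fx, fy = gx - ix, gy - iy                     # sub-grid offsets in [0, 1)
    p00, p10 = grid[iy, ix], grid[iy, ix + 1]
    p01, p11 = grid[iy + 1, ix], grid[iy + 1, ix + 1]
    return ((1 - fx) * (1 - fy) * p00 + fx * (1 - fy) * p10
            + (1 - fx) * fy * p01 + fx * fy * p11)

# Hypothetical 0.05 m grid with one occupied cell, probed by one beam at 45 deg.
grid = np.zeros((200, 200)); grid[100, 100] = 1.0
pts = scan_to_endpoints(np.array([np.sqrt(50.0)]), np.array([np.pi / 4]))
p = endpoint_probability(grid, 0.05, np.array([0.0, 0.0]), pts[0])  # ~1.0
```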
4. Takagi–Sugeno Kalman Filter Design
4.1. LMI Design Procedure
4.2. Disturbance and Noise Matrix
4.3. Switching Estimator Design
Algorithm 1: Offline Gain Optimization Algorithm
Algorithm 2: Gain Interpolation Algorithm
Algorithm 3: State Estimation Algorithm
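As a concrete starting point for the offline step (Algorithm 1), the following Python sketch uses cvxpy to solve per-vertex observer LMIs with a common quadratic Lyapunov matrix $P$ shared across all TS vertices. It encodes only continuous-time stability LMIs; the dual-LQR optimality conditions and the exact (discrete-time) formulation used in the paper are omitted, and the example matrices are hypothetical. An SDP-capable solver (e.g., SCS, installed with cvxpy) is required.

```python
import cvxpy as cp
import numpy as np

def offline_vertex_gains(A_list, C, eps=1e-6):
    """Solve A_i^T P + P A_i - C^T W_i^T - W_i C < 0 and P > 0 for all
    vertices i with a common P, then recover gains L_i = P^{-1} W_i."""
    n, p = A_list[0].shape[0], C.shape[0]
    P = cp.Variable((n, n), symmetric=True)
    Ws = [cp.Variable((n, p)) for _ in A_list]
    cons = [P >> eps * np.eye(n)]
    for A, W in zip(A_list, Ws):
        S = A.T @ P + P @ A - C.T @ W.T - W @ C
        cons.append(0.5 * (S + S.T) << -eps * np.eye(n))  # symmetric form
    cp.Problem(cp.Minimize(0), cons).solve()              # feasibility SDP
    P_inv = np.linalg.inv(P.value)
    return [P_inv @ W.value for W in Ws]

# Tiny two-vertex example with hypothetical system matrices.
A1 = np.array([[0.0, 1.0], [-1.0, -0.5]])
A2 = np.array([[0.0, 1.0], [-2.0, -0.5]])
C = np.array([[1.0, 0.0]])
gains = offline_vertex_gains([A1, A2], C)
```

Sharing a single $P$ across vertices is what makes the interpolated online gain (Algorithm 2) provably stabilizing for any admissible membership weights.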
5. Results
5.1. Simulation
5.2. Real Experiment
5.3. Simulated vs. Real Experiment
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Litman, T. Autonomous Vehicle Implementation Predictions: Implications for Transport Planning. 2020. Available online: https://www.vtpi.org/avip.pdf (accessed on 29 March 2022).
- Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. J. Basic Eng. 1960, 82, 35–45.
- Rotondo, D. Advances in Gain-Scheduling and Fault Tolerant Control Techniques; Springer: Berlin/Heidelberg, Germany, 2017.
- Takagi, T.; Sugeno, M. Fuzzy identification of systems and its applications to modeling and control. IEEE Trans. Syst. Man Cybern. 1985, SMC-15, 116–132.
- Zhang, Z. Iterative point matching for registration of free-form curves and surfaces. Int. J. Comput. Vis. 1994, 13, 119–152.
- Kohlbrecher, S.; Meyer, J.; von Stryk, O.; Klingauf, U. A Flexible and Scalable SLAM System with Full 3D Motion Estimation. In Proceedings of the IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), Tokyo, Japan, 9–12 November 2011.
- Olson, E.B. Real-time correlative scan matching. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 4387–4393.
- Grisetti, G.; Stachniss, C.; Burgard, W. Improved Techniques for Grid Mapping With Rao-Blackwellized Particle Filters. IEEE Trans. Robot. 2007, 23, 34–46.
- Zhang, J.; Singh, S. LOAM: Lidar Odometry and Mapping in Real-time. In Proceedings of the 2014 Robotics: Science and Systems, Berkeley, CA, USA, 12–16 July 2014.
- Pathirana, C.D. Fuzzy Kalman Filtering of the Slam Problem Using Pseudo-Linear Models with Two-Sensor Data Association. Int. J. Syst. Signal Control Eng. Appl. 2008, 1, 89–100.
- Rotondo, D.; Reppa, V.; Puig, V.; Nejjari, F. Adaptive Observer for Switching Linear Parameter-Varying (LPV) Systems. IFAC Proc. Vol. 2014, 47, 1471–1476.
- Alcalá, E.; Puig, V.; Quevedo, J.; Rosolia, U. Autonomous racing using Linear Parameter Varying-Model Predictive Control (LPV-MPC). Control Eng. Pract. 2020, 95, 104270.
- Hartmann, J.; Klüssendorff, J.H.; Maehle, E. A comparison of feature descriptors for visual SLAM. In Proceedings of the 2013 European Conference on Mobile Robots, Barcelona, Spain, 25–27 September 2013; pp. 56–61.
- Chen, C.; Wang, B.; Lu, C.X.; Trigoni, N.; Markham, A. A Survey on Deep Learning for Localization and Mapping: Towards the Age of Spatial Machine Intelligence. arXiv 2020, arXiv:2006.12567.
- Zadeh, L.; Klir, G.; Yuan, B. Fuzzy Sets, Fuzzy Logic, and Fuzzy Systems—Selected Papers by Lotfi A. Zadeh. In Advances in Fuzzy Systems—Applications and Theory; World Scientific Publishing: Singapore, 1996.
- Fantuzzi, C.; Rovatti, R. On the approximation capabilities of the homogeneous Takagi–Sugeno model. In Proceedings of the IEEE 5th International Fuzzy Systems, New Orleans, LA, USA, 11 September 1996; Volume 2, pp. 1067–1072.
- Rajamani, R. Vehicle Dynamics and Control, 2nd ed.; Springer: New York, NY, USA, 2012.
- Pacejka, H.B.; Bakker, E. The magic formula tyre model. Veh. Syst. Dyn. 1992, 21, 1–18.
- Ostertag, E. Mono- and Multivariable Control and Estimation: Linear, Quadratic and LMI Methods; Springer: Berlin/Heidelberg, Germany, 2011; Volume 2, pp. 239–241.
- Burguera, A.B.; Cid, Y.G.; Oliver, G. On the use of likelihood fields to perform sonar scan matching localization. Auton. Robot. 2009, 26, 203–222.
- Masti, D.; Bemporad, A. Learning nonlinear state–space models using autoencoders. Automatica 2021, 129, 109666.
- Yu, H.S.A.; Yao, D.; Zimmer, C.; Toussaint, M.; Nguyen-Tuong, D. Active Learning in Gaussian Process State Space Model. In Joint European Conference on Machine Learning and Knowledge Discovery in Databases; Springer: Cham, Switzerland, 2021; pp. 346–361.
- Li, X.; Du, S.; Li, G.; Li, H. Integrate Point-Cloud Segmentation with 3D LiDAR Scan-Matching for Mobile Robot Localization and Mapping. Sensors 2020, 20, 237.
| Symbol | Description |
|---|---|
| $v_x$ | Longitudinal velocity of the vehicle in the center of gravity (CoG) frame (C); see Figure 2. |
| $v_y$ | Lateral velocity of the vehicle in the CoG frame (C). |
| $\omega$ | Angular velocity of the vehicle in the CoG frame (C). |
| $X$ | Global position of the vehicle along the X-axis of frame (O). |
| $Y$ | Global position of the vehicle along the Y-axis of frame (O). |
| $\theta$ | Orientation of the vehicle with respect to the X-axis of frame (O). |
| $D$ | Duty cycle of the motor, normalized to [0, 1]. |
| $\delta$ | Steering angle. |
| Parameter | Value | Description |
|---|---|---|
| $m$ | — kg | Mass of the vehicle |
| $l_f$ | — m | Distance from CoG to the front wheel |
| $l_r$ | — m | Distance from CoG to the rear wheel |
| $\rho$ | — kg/m³ | Air density |
| — | — N | Motor parameter 1 |
| — | — kg/s | Motor parameter 2 |
| — | — kg/s | Resistive driveline parameter |
| — | — N | Static friction force |
| $C_d A$ | — m² | Coefficient of drag multiplied by area |
| $C_f$ | — N/rad | Front wheel cornering stiffness |
| $C_r$ | — N/rad | Rear wheel cornering stiffness |
| $I_z$ | — kg·m² | Moment of inertia |
| State | Sensor | Covariance [m] |
|---|---|---|
| $v_x$ | Motor encoder | — |
| $\omega$ | IMU | — |
| $X$ | Scan matching | — |
| $Y$ | Scan matching | — |
| $\theta$ | IMU | — |
| States | X | Y | | | | |
|---|---|---|---|---|---|---|
| NRMSE | 0.0781 | 0.0323 | 0.0913 | 0.0157 | 0.0175 | 0.0198 |
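NRMSE denotes a normalized root-mean-square error between the estimated and reference state trajectories; the exact normalization convention is not shown here. A minimal Python sketch, assuming normalization by the ground-truth range (the function name and convention are ours):

```python
import numpy as np

def nrmse(estimate, truth):
    """RMSE normalized by the ground-truth range (one common
    convention; the paper's exact normalization may differ)."""
    rmse = np.sqrt(np.mean((estimate - truth) ** 2))
    return rmse / (truth.max() - truth.min())

# Example: per-state error between estimated and reference trajectories.
print(nrmse(np.array([0.0, 1.1, 2.0]), np.array([0.0, 1.0, 2.0])))
```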
| States | min [m] | | max [m] | | std [m] | |
|---|---|---|---|---|---|---|
| | Sim | Real | Sim | Real | Sim | Real |
| X | 0.124 | 0.094 | 0.405 | 0.317 | 0.159 | 0.116 |
| Y | 0.137 | 0.119 | 0.344 | 0.241 | 0.138 | 0.126 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Chaubey, S.; Puig, V. Autonomous Vehicle State Estimation and Mapping Using Takagi–Sugeno Modeling Approach. Sensors 2022, 22, 3399. https://doi.org/10.3390/s22093399