Analytically Embedding Differential Equation Constraints into Least Squares Support Vector Machines Using the Theory of Functional Connections
Figure 2. Accuracy gain for the Theory of Functional Connections (TFC) and constrained support vector machine (CSVM) methods over least-squares support vector machines (LS-SVMs) for problem #1 using 100 training points.
Figure 3. Mean squared error vs. solution time for problem #1.
Figure 4. Accuracy gain for TFC and CSVM methods over LS-SVMs for problem #2 using 100 training points.
Figure 5. Mean squared error vs. solution time for problem #2.
Figure 6. Accuracy gain for TFC and CSVM methods over LS-SVMs for problem #3 using 100 training points.
Figure 7. Mean squared error vs. solution time for problem #3.
Figure 8. Accuracy gain for TFC and CSVM methods over LS-SVMs for problem #4 using 100 training points in the domain.
Figure 9. Mean squared error vs. solution time for problem #4.
Abstract
1. Introduction
2. Background on the Theory of Functional Connections
3. The Support Vector Machine Technique
3.1. An Overview of SVMs
3.2. Constrained SVM (CSVM) Technique
3.3. Nonlinear ODEs
3.4. Linear PDEs
4. Numerical Results
4.1. Problem #1
4.2. Problem #2
4.3. Problem #3
4.4. Problem #4
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
BVP | boundary-value problem
CP | Chebyshev polynomial
CSVM | constrained support vector machine
DE | differential equation
IVP | initial-value problem
LS | least-squares
LS-SVM | least-squares support vector machine
MSE | mean squared error
MVP | multi-value problem
ODE | ordinary differential equation
PDE | partial differential equation
RBF | radial basis function
SVM | support vector machine
TFC | Theory of Functional Connections
Appendix A Numerical Data
Number of Training Points | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | m |
---|---|---|---|---|---|---|
8 | 7.813 × | 6.035 × | 1.057 × | 6.187 × | 8.651 × | 7 |
16 | 1.406 × | 2.012 × | 1.257 × | 1.814 × | 8.964 × | 17 |
32 | 5.000 × | 2.220 × | 1.887 × | 3.331 × | 2.086 × | 25 |
50 | 7.500 × | 2.220 × | 9.368 × | 2.220 × | 1.801 × | 25 |
100 | 1.266 × | 4.441 × | 1.750 × | 2.220 × | 1.138 × | 26 |
Number of Training Points | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | ||
---|---|---|---|---|---|---|---|
8 | 1.719 × | 1.179 × | 5.638 × | 1.439 × | 7.251 × | 5.995 × | 3.162 × |
16 | 1.719 × | 1.710 × | 1.107 × | 1.849 × | 1.161 × | 3.594 × | 6.813 × |
32 | 2.188 × | 9.792 × | 3.439 × | 9.525 × | 3.359 × | 3.594 × | 3.162 × |
50 | 4.375 × | 1.440 × | 2.983 × | 8.586 × | 2.356 × | 3.594 × | 3.162 × |
100 | 1.031 × | 3.671 × | 3.781 × | 3.673 × | 3.947 × | 2.154 × | 3.162 × |
Number of Training Points | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | ||
---|---|---|---|---|---|---|---|
8 | 3.125 × | 1.018 × | 4.131 × | 1.357 × | 5.547 × | 2.154 × | 3.162 × |
16 | 1.406 × | 2.894 × | 2.588 × | 2.818 × | 2.468 × | 5.995 × | 6.813 × |
32 | 5.313 × | 2.283 × | 1.355 × | 2.576 × | 1.494 × | 3.594 × | 3.162 × |
50 | 3.281 × | 8.887 × | 2.055 × | 1.072 × | 2.783 × | 7.743 × | 3.162 × |
100 | 1.078 × | 2.230 × | 5.571 × | 2.163 × | 5.337 × | 3.594 × | 1.468 × |
Number of Training Points | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | m |
---|---|---|---|---|---|---|
8 | 3.437 × | 8.994 × | 2.242 × | 1.192 × | 4.132 × | 8 |
16 | 1.547 × | 4.586 × | 6.514 × | 9.183 × | 2.431 × | 16 |
32 | 1.891 × | 3.109 × | 9.291 × | 4.885 × | 9.590 × | 32 |
50 | 3.125 × | 1.110 × | 2.100 × | 2.665 × | 3.954 × | 32 |
100 | 4.828 × | 1.776 × | 3.722 × | 2.665 × | 4.321 × | 32 |
Number of Training Points | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | ||
---|---|---|---|---|---|---|---|
8 | 7.813 × | 1.001 × | 1.965 × | 1.001 × | 7.904 × | 1.000 × | 3.704 × |
16 | 1.250 × | 4.017 × | 4.909 × | 3.872 × | 4.514 × | 1.000 × | 4.198 × |
32 | 6.875 × | 4.046 × | 4.834 × | 3.900 × | 4.575 × | 1.000 × | 4.536 × |
50 | 1.203 × | 4.048 × | 4.792 × | 3.902 × | 4.580 × | 1.000 × | 4.666 × |
100 | 3.156 × | 4.050 × | 4.752 × | 3.903 × | 4.582 × | 1.000 × | 4.853 × |
Number of Training Points | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | ||
---|---|---|---|---|---|---|---|
8 | 1.250 × | 1.556 × | 7.644 × | 1.480 × | 5.325 × | 1.000 × | 3.452 × |
16 | 1.563 × | 4.021 × | 4.914 × | 3.876 × | 4.517 × | 1.000 × | 4.719 × |
32 | 2.594 × | 4.047 × | 4.834 × | 3.901 × | 4.575 × | 1.000 × | 5.109 × |
50 | 4.109 × | 4.050 × | 4.792 × | 3.903 × | 4.580 × | 1.000 × | 5.252 × |
100 | 9.219 × | 4.051 × | 4.753 × | 3.904 × | 4.583 × | 1.000 × | 5.469 × |
Number of Training Points | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | m |
---|---|---|---|---|---|---|
8 | 1.563 × | 1.313 × | 5.184 × | 1.456 × | 6.818 × | 8 |
16 | 7.969 × | 5.551 × | 6.123 × | 8.882 × | 7.229 × | 15 |
32 | 7.187 × | 1.221 × | 2.377 × | 9.992 × | 2.229 × | 15 |
50 | 5.000 × | 7.772 × | 3.991 × | 5.551 × | 3.672 × | 15 |
100 | 9.844 × | 7.772 × | 5.525 × | 6.661 × | 3.518 × | 15 |
Number of Training Points | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | ||
---|---|---|---|---|---|---|---|
8 | 1.563 × | 1.420 × | 8.300 × | 1.638 × | 6.522 × | 5.995 × | 6.813 × |
16 | 1.875 × | 1.811 × | 1.015 × | 1.871 × | 1.014 × | 3.594 × | 3.162 × |
32 | 4.687 × | 5.455 × | 1.025 × | 9.005 × | 1.015 × | 5.995 × | 1.468 × |
50 | 7.656 × | 8.563 × | 3.771 × | 8.391 × | 3.646 × | 2.154 × | 1.468 × |
100 | 2.688 × | 6.441 × | 1.500 × | 6.128 × | 1.640 × | 2.154 × | 1.468 × |
Number of Training Points | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | ||
---|---|---|---|---|---|---|---|
8 | 1.563 × | 1.263 × | 7.737 × | 2.017 × | 1.339 × | 1.000 × | 6.813 × |
16 | 4.687 × | 1.269 × | 4.961 × | 1.631 × | 5.342 × | 3.594 × | 3.162 × |
32 | 1.406 × | 1.763 × | 8.308 × | 2.230 × | 1.248 × | 3.594 × | 3.162 × |
50 | 3.281 × | 1.429 × | 1.045 × | 1.569 × | 1.017 × | 2.154 × | 1.468 × |
100 | 1.297 × | 8.261 × | 8.832 × | 7.209 × | 5.589 × | 2.154 × | 1.468 × |
Number of Training Points in Domain | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | m |
---|---|---|---|---|---|---|
9 | 4.375 × | 1.107 × | 1.904 × | 1.543 × | 4.633 × | 8 |
16 | 5.000 × | 3.336 × | 2.131 × | 4.938 × | 3.964 × | 9 |
36 | 6.406 × | 6.628 × | 5.165 × | 2.333 × | 6.961 × | 12 |
64 | 9.844 × | 4.441 × | 2.091 × | 8.882 × | 8.320 × | 15 |
100 | 1.031 × | 3.331 × | 1.229 × | 6.661 × | 1.246 × | 15 |
Number of Training Points in Domain | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | ||
---|---|---|---|---|---|---|---|
9 | 2.031 × | 2.578 × | 9.984 × | 3.941 × | 3.533 × | 1.000 × | 6.635 × |
16 | 2.344 × | 2.229 × | 6.277 × | 3.794 × | 1.731 × | 1.000 × | 3.577 × |
36 | 4.219 × | 1.254 × | 2.542 × | 2.435 × | 4.517 × | 1.000 × | 1.894 × |
64 | 5.156 × | 2.916 × | 1.193 × | 4.962 × | 1.390 × | 1.000 × | 1.589 × |
100 | 1.297 × | 1.730 × | 3.028 × | 2.673 × | 3.668 × | 1.000 × | 9.484 × |
Number of Training Points in Domain | Training Time (s) | Maximum Error on Training Set | MSE on Training Set | Maximum Error on Test Set | MSE on Test Set | ||
---|---|---|---|---|---|---|---|
9 | 5.000 × | 1.305 × | 1.936 × | 3.325 × | 8.262 × | 1.000 × | 6.948 × |
16 | 1.172 × | 2.121 × | 7.965 × | 5.507 × | 2.530 × | 1.000 × | 4.894 × |
36 | 1.891 × | 2.393 × | 6.242 × | 3.738 × | 1.341 × | 1.000 × | 2.154 × |
64 | 3.156 × | 9.501 × | 1.021 × | 1.251 × | 1.165 × | 1.000 × | 1.371 × |
100 | 8.453 × | 4.362 × | 2.687 × | 5.561 × | 2.951 × | 1.000 × | 8.891 × |
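The tables above report two accuracy measures on the training and test grids: the maximum absolute error and the mean squared error (MSE) between the approximate and reference solutions. As a minimal sketch of how these metrics are computed (the grid, reference solution, and perturbed "approximation" below are hypothetical stand-ins, not the paper's actual solvers):

```python
import numpy as np

def error_metrics(y_approx, y_true):
    """Return (maximum absolute error, mean squared error) over a grid."""
    err = np.abs(np.asarray(y_approx) - np.asarray(y_true))
    return err.max(), np.mean(err ** 2)

# Hypothetical example: a slightly perturbed approximation of sin(x)
x = np.linspace(0.0, np.pi, 100)            # stand-in training grid
y_true = np.sin(x)                          # stand-in exact solution
y_approx = y_true + 1e-6 * np.cos(5.0 * x)  # stand-in approximate solution
max_err, mse = error_metrics(y_approx, y_true)
```

In the paper's setup, the training set is the collocation grid used to solve the differential equation and the test set is a separate, denser grid, so the two metrics together indicate both fit and generalization.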
Appendix B Nonlinear ODE LS-SVM and CSVM Derivation
Appendix C Linear PDE CSVM Derivation
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Leake, C.; Johnston, H.; Smith, L.; Mortari, D. Analytically Embedding Differential Equation Constraints into Least Squares Support Vector Machines Using the Theory of Functional Connections. Mach. Learn. Knowl. Extr. 2019, 1, 1058-1083. https://doi.org/10.3390/make1040060