Decoupling dynamics and sampling: RNNs for unevenly sampled data and flexible online predictions
Proceedings of the 3rd Conference on Learning for Dynamics and Control, PMLR 144:943-953, 2021.
Abstract
Recurrent neural networks (RNNs) incorporate a memory state, which makes them suitable for time series analysis. The Linear Antisymmetric RNN (LARNN) is a previously suggested recurrent layer that provably ensures long-term memory using a simple structure without gating. The LARNN is based on an ordinary differential equation which is solved numerically with a defined step size. In this paper, this step size is related to the sampling frequency of the data used for training and testing of the models. In particular, industrial datasets often consist of measurements that are sampled and analyzed manually, or sampled only when the change is sufficiently large. This is usually handled by resampling and performing some kind of interpolation to obtain an evenly sampled dataset. However, in doing so, one has to make several assumptions about the nature of the data (e.g. linear interpolation), and valuable information about the dynamics captured by the actual sampling is lost. Furthermore, interpolation is non-causal by nature and thus poses a challenge in an online setting, as future values are not known. By using information about the sampling times in the LARNN structure, interpolation becomes unnecessary, as the model decouples the dynamics of the sampled system from the sampling regime. Moreover, the suggested structure enables predictions for specific times in the future, so predictions can be updated regardless of whether new measurements are available. The performance of the LARNN is compared to an LSTM on a simulated industrial benchmark system.
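
To illustrate the idea of tying the ODE step size to the elapsed time between measurements, the sketch below shows an antisymmetric recurrent cell with a forward-Euler update driven by per-sample time gaps. This is a minimal illustration under stated assumptions, not code from the paper: the class name VariableStepLARNNCell, the small diffusion term gamma used to stabilize the explicit Euler step, and the held-input future prediction are all assumptions for illustration; the authors' exact parameterization and solver may differ.

```python
import numpy as np

class VariableStepLARNNCell:
    """Sketch of a linear antisymmetric recurrent cell with per-sample step size.

    Forward-Euler update (assumed form, not necessarily the paper's):
        h_k = h_{k-1} + dt_k * ((W - W^T - gamma*I) h_{k-1} + V x_k + b)
    where dt_k is the elapsed time since the previous measurement, so the cell
    handles unevenly sampled inputs without resampling or interpolation.
    """

    def __init__(self, input_dim, hidden_dim, gamma=0.01, seed=None):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=1.0 / np.sqrt(hidden_dim),
                            size=(hidden_dim, hidden_dim))
        self.V = rng.normal(scale=1.0 / np.sqrt(input_dim),
                            size=(hidden_dim, input_dim))
        self.b = np.zeros(hidden_dim)
        self.gamma = gamma
        self.hidden_dim = hidden_dim

    def step(self, h, x, dt):
        # The antisymmetric recurrent matrix W - W^T keeps the continuous-time
        # dynamics marginally stable, the basis of the LARNN's long-term memory;
        # the gamma*I diffusion term is an assumed stabilizer for explicit Euler.
        A = self.W - self.W.T - self.gamma * np.eye(self.hidden_dim)
        return h + dt * (A @ h + self.V @ x + self.b)

    def run(self, xs, ts):
        """xs: (N, input_dim) measurements; ts: (N,) sample times, possibly uneven."""
        h = np.zeros(self.hidden_dim)
        states, t_prev = [], ts[0]
        for x, t in zip(xs, ts):
            h = self.step(h, x, t - t_prev)  # dt = 0 for the first sample
            states.append(h)
            t_prev = t
        return np.stack(states)

    def predict_ahead(self, h, dt_ahead, x_last):
        # Roll the state forward to a requested future time with the input held
        # at its last value (an illustrative assumption), giving an updated
        # prediction even when no new measurement has arrived.
        return self.step(h, x_last, dt_ahead)


# Usage: irregularly spaced sample times, no interpolation needed.
if __name__ == "__main__":
    ts = np.array([0.0, 0.7, 1.1, 3.4, 3.5, 6.0])
    xs = np.sin(ts).reshape(-1, 1)
    cell = VariableStepLARNNCell(input_dim=1, hidden_dim=8, seed=0)
    states = cell.run(xs, ts)
    h_future = cell.predict_ahead(states[-1], dt_ahead=0.5, x_last=xs[-1])
```

Because the step size is read off the timestamps rather than fixed, the same trained weights describe the system dynamics independently of how the data happened to be sampled, which is the decoupling the title refers to.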