This paper uses long short-term memory (LSTM) neural networks to forecast probability distributions of time series expressed as discrete symbols quantized from real-valued data. The developed framework formulates the forecasting problem in a probabilistic paradigm as h_Θ: X × Y → [0, 1] such that Σ_{y∈Y} h_Θ(x, y) = 1 for each x ∈ X, where X is the finite-dimensional state space, Y is the symbol alphabet, and Θ is the set of model parameters. The proposed method differs from standard formulations of time series modeling (e.g., autoregressive moving average (ARMA) models). The main advantage of the symbolic setting is that density predictions are obtained without significantly restrictive assumptions (e.g., on second-order statistics). The efficacy of the proposed method is demonstrated by forecasting probability distributions of chaotic time series data collected from a laboratory-scale experimental apparatus. Three neural architectures are compared, each with 100 different combinations of symbol-alphabet size and forecast length, yielding a comprehensive evaluation of their relative performances.
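To make the symbolic formulation concrete, the sketch below quantizes a real-valued signal into a finite alphabet and estimates a next-symbol probability distribution. This is a minimal illustration only: it uses uniform-range partitioning and a one-step Markov baseline in place of the paper's LSTM, and the signal, function names, and alphabet size are all assumptions made for the example.

```python
import numpy as np

def symbolize(series, alphabet_size):
    # Uniform partitioning of the observed range into `alphabet_size` bins
    # (illustrative; the paper's partitioning scheme may differ).
    edges = np.linspace(series.min(), series.max(), alphabet_size + 1)
    return np.digitize(series, edges[1:-1])  # symbols in 0..alphabet_size-1

def markov_forecaster(symbols, alphabet_size):
    # Empirical one-step transition probabilities P(y_next | y_current),
    # with add-one smoothing so every row is a valid distribution over Y.
    counts = np.ones((alphabet_size, alphabet_size))
    for cur, nxt in zip(symbols[:-1], symbols[1:]):
        counts[cur, nxt] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Synthetic stand-in signal (not the paper's experimental data).
t = np.linspace(0, 20, 2000)
x = np.sin(t) + 0.5 * np.sin(3.1 * t + 1.0)
s = symbolize(x, 4)
P = markov_forecaster(s, 4)
assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution
```

Each row of `P` plays the role of h_Θ(x, ·): a distribution over the symbol alphabet conditioned on the current state; the LSTM in the paper replaces this table with a learned, history-dependent mapping.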
Neural Probabilistic Forecasting of Symbolic Sequences With Long Short-Term Memory
Contributed by the Dynamic Systems Division of ASME for publication in the JOURNAL OF DYNAMIC SYSTEMS, MEASUREMENT, AND CONTROL. Manuscript received April 17, 2017; final manuscript received January 8, 2018; published online March 30, 2018. Assoc. Editor: Dumitru I. Caruntu.
Hauser, M., Fu, Y., Phoha, S., and Ray, A. (March 30, 2018). "Neural Probabilistic Forecasting of Symbolic Sequences With Long Short-Term Memory." ASME. J. Dyn. Sys., Meas., Control. August 2018; 140(8): 084502. https://doi.org/10.1115/1.4039281