arXiv Open Access 2021

Initializing LSTM internal states via manifold learning

Felix P. Kemeth, Tom Bertalan, Nikolaos Evangelou, Tianqi Cui, Saurabh Malani, Ioannis G. Kevrekidis

Abstract

We present an approach, based on learning an intrinsic data manifold, for the initialization of the internal state values of LSTM recurrent neural networks, ensuring consistency with the initial observed input data. Exploiting the generalized synchronization concept, we argue that the converged, "mature" internal states constitute a function on this learned manifold. The dimension of this manifold then dictates the length of observed input time series data required for consistent initialization. We illustrate our approach through a partially observed chemical model system, where initializing the internal LSTM states in this fashion yields visibly improved performance. Finally, we show that learning this data manifold enables the transformation of partially observed dynamics into fully observed ones, facilitating alternative identification paths for nonlinear dynamical systems.
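The abstract's "generalized synchronization" argument is that, after processing a long enough window of observed inputs, an LSTM's internal states forget their arbitrary starting values and become a function of the recent input history alone, so states can be initialized consistently by warming up on observed data. A minimal sketch of this effect, using a hand-rolled NumPy LSTM cell with randomly drawn weights (all sizes and scales here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gates (input, forget, cell, output) stacked row-wise."""
    z = W @ x + U @ h + b
    n = h.size
    i, f = sigmoid(z[:n]), sigmoid(z[n:2 * n])
    g, o = np.tanh(z[2 * n:3 * n]), sigmoid(z[3 * n:])
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
n_h, n_x = 8, 1  # hidden size and input size (illustrative choices)
W = 0.5 * rng.standard_normal((4 * n_h, n_x))
U = 0.3 * rng.standard_normal((4 * n_h, n_h))
b = np.zeros(4 * n_h)

# A partially observed scalar time series used as the warm-up window.
xs = np.sin(0.3 * np.arange(60)).reshape(-1, 1)

# Two very different (arbitrary) initial internal states.
h1, c1 = rng.standard_normal(n_h), rng.standard_normal(n_h)
h2, c2 = -h1.copy(), -c1.copy()

for x in xs:
    h1, c1 = lstm_step(x, h1, c1, W, U, b)
    h2, c2 = lstm_step(x, h2, c2, W, U, b)

# The "mature" states have synchronized: they depend only on the input window,
# not on the arbitrary initial conditions.
gap = np.linalg.norm(h1 - h2)
print(gap)
```

The contribution described in the abstract is to replace this brute-force warm-up with manifold learning: the intrinsic dimension of the learned data manifold dictates how short the observed input window can be while still determining the internal states consistently.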

Authors (6)

Felix P. Kemeth

Tom Bertalan

Nikolaos Evangelou

Tianqi Cui

Saurabh Malani

Ioannis G. Kevrekidis

Citation

Kemeth, F.P., Bertalan, T., Evangelou, N., Cui, T., Malani, S., Kevrekidis, I.G. (2021). Initializing LSTM internal states via manifold learning. https://arxiv.org/abs/2104.13101

Journal Information

Publication Year: 2021
Language: en
Source Database: arXiv
Access: Open Access ✓