arXiv Open Access 2020

Variational Auto-Regressive Gaussian Processes for Continual Learning

Sanyam Kapoor Theofanis Karaletsos Thang D. Bui

Abstract

Through sequential construction of posteriors on observing data online, Bayes' theorem provides a natural framework for continual learning. We develop Variational Auto-Regressive Gaussian Processes (VAR-GPs), a principled posterior updating mechanism to solve sequential tasks in continual learning. By relying on sparse inducing point approximations for scalable posteriors, we propose a novel auto-regressive variational distribution which reveals two fruitful connections to existing results in Bayesian inference, expectation propagation and orthogonal inducing points. Mean predictive entropy estimates show VAR-GPs prevent catastrophic forgetting, which is empirically supported by strong performance on modern continual learning benchmarks against competitive baselines. A thorough ablation study demonstrates the efficacy of our modeling choices.
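The core idea the abstract describes, using the posterior after one task as the prior for the next, with a sparse set of inducing points keeping inference tractable, can be illustrated with a toy numpy sketch. This is not the paper's VAR-GP algorithm (which uses a structured auto-regressive variational distribution); it is a minimal sequential Gaussian update over inducing-point function values, with a simple deterministic projection assumption, and all function names and the data setup here are hypothetical.

```python
import numpy as np

def rbf(X1, X2, ls=1.0, var=1.0):
    # Squared-exponential (RBF) kernel on 1-D inputs.
    d = X1[:, None] - X2[None, :]
    return var * np.exp(-0.5 * (d / ls) ** 2)

Z = np.linspace(-3, 3, 10)   # shared inducing inputs (illustrative choice)
noise = 0.1                  # observation noise variance (assumed)

def posterior_at_Z(X, y, prior_mean, prior_cov):
    """Gaussian update of inducing values u = f(Z), under the
    simplifying (illustrative) approximation f(X) ~= Kxz Kzz^{-1} u."""
    Kzz = rbf(Z, Z) + 1e-6 * np.eye(len(Z))
    A = rbf(X, Z) @ np.linalg.inv(Kzz)          # projection: f(X) ~= A u
    prec = np.linalg.inv(prior_cov) + A.T @ A / noise
    cov = np.linalg.inv(prec)
    mean = cov @ (np.linalg.solve(prior_cov, prior_mean) + A.T @ y / noise)
    return mean, cov

rng = np.random.default_rng(0)

# Task 1: data from the left half of the input space; the prior over
# u is the GP prior N(0, Kzz).
X1 = rng.uniform(-3, 0, 40)
y1 = np.sin(X1)
Kzz = rbf(Z, Z) + 1e-6 * np.eye(len(Z))
m1, S1 = posterior_at_Z(X1, y1, np.zeros(len(Z)), Kzz)

# Task 2: the task-1 posterior becomes the prior. This sequential
# reuse of the posterior is the Bayesian continual-learning recursion
# that VAR-GPs make scalable with an auto-regressive variational family.
X2 = rng.uniform(0, 3, 40)
y2 = np.sin(X2)
m2, S2 = posterior_at_Z(X2, y2, m1, S1)
```

Because the task-2 update starts from `(m1, S1)` rather than the prior, information about task 1 is retained even though its raw data is never revisited, which is the mechanism behind the reduced catastrophic forgetting the abstract reports.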

Citation

Kapoor, S., Karaletsos, T., & Bui, T. D. (2020). Variational Auto-Regressive Gaussian Processes for Continual Learning. arXiv:2006.05468. https://arxiv.org/abs/2006.05468

Publication Information

Year: 2020
Language: English
Source: arXiv
Access: Open Access ✓