arXiv Open Access 2022

Capacity dependent analysis for functional online learning algorithms

Xin Guo, Zheng-Chu Guo, Lei Shi

Abstract

This article provides a convergence analysis of online stochastic gradient descent algorithms for functional linear models. By jointly characterizing the regularity of the slope function, the capacity of the kernel space, and the capacity of the covariance operator of the sampling process, significant improvements in the convergence rates are achieved. Both prediction and estimation problems are studied, and we show that the capacity assumption alleviates the saturation of the convergence rate as the regularity of the target function increases. We further show that, with a properly selected kernel, the capacity assumption can fully compensate for the regularity assumption in prediction problems (but not in estimation problems). This demonstrates a significant difference between prediction and estimation in functional data analysis.
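To make the setting concrete, the online scheme described in the abstract can be sketched in a toy, grid-discretized form. Everything below — the grid, the Gaussian kernel, the white-noise curves, and the step-size schedule — is an illustrative assumption, not the paper's exact algorithm or parameter choice.

```python
import numpy as np

# Toy sketch: online SGD in a kernel space for the functional linear model
#   Y = ∫ beta(s) X(s) ds + noise,
# with functions represented on a uniform grid over [0, 1].
# All modeling choices here are hypothetical illustrations.

rng = np.random.default_rng(0)
m = 50                                        # grid size on [0, 1]
s = np.linspace(0.0, 1.0, m)
dt = s[1] - s[0]

# Gram matrix of a Gaussian kernel on the grid; the integral operator
# L_K applied to a curve x is approximated by dt * K @ x.
K = np.exp(-((s[:, None] - s[None, :]) ** 2) / (2 * 0.2**2))

beta_true = np.sin(2 * np.pi * s)             # target slope function
f = np.zeros(m)                               # online estimate f_t

for t in range(1, 3001):
    x = rng.standard_normal(m)                # toy random curve X_t
    y = dt * (x @ beta_true) + 0.1 * rng.standard_normal()
    eta = 10.0 / t**0.6                       # polynomially decaying step size
    resid = dt * (x @ f) - y                  # prediction error <f_t, X_t>_{L^2} - Y_t
    f -= eta * resid * dt * (K @ x)           # stochastic gradient step along L_K X_t

l2_err = dt * np.sum((f - beta_true) ** 2)    # squared L^2 estimation error
print(l2_err)
```

In the paper's terminology, `l2_err` measures the estimation error (the L^2 distance between the estimate and the slope function); the prediction error instead weights this difference by the covariance operator of the sampling process, which is the source of the gap between the two problems.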


Authors (3)

Xin Guo
Zheng-Chu Guo
Lei Shi

Citation Format

Guo, X., Guo, Z.-C., & Shi, L. (2022). Capacity dependent analysis for functional online learning algorithms. arXiv preprint arXiv:2209.12198. https://arxiv.org/abs/2209.12198

Journal Information
Publication Year: 2022
Language: en
Source Database: arXiv
Access: Open Access ✓