Semantic Scholar Open Access 2024 · 5 citations

Estimating the Local Learning Coefficient at Scale

Zach Furman Edmund Lau

Abstract

The \textit{local learning coefficient} (LLC) is a principled way of quantifying model complexity, originally derived in the context of Bayesian statistics using singular learning theory (SLT). Several methods are known for numerically estimating the local learning coefficient, but so far these methods have not been extended to the scale of modern deep learning architectures or data sets. Using a method developed in {\tt arXiv:2308.12108 [stat.ML]}, we empirically show how the LLC may be measured accurately and self-consistently for deep linear networks (DLNs) up to 100M parameters. We also show that the estimated LLC has the rescaling invariance that holds for the theoretical quantity.
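For context, the rescaling invariance mentioned in the abstract refers to a standard property of deep linear networks: multiplying one layer's weights by a nonzero factor and dividing the next layer's weights by the same factor leaves the computed function unchanged, even though the parameters differ. A minimal numpy sketch of this property (layer shapes and names are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# A two-layer deep linear network: f(x) = W2 @ W1 @ x (no nonlinearities).
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))
x = rng.standard_normal(3)

alpha = 2.5  # any nonzero rescaling factor

# Rescale: multiply W1 by alpha, divide W2 by alpha. The composed map
# (W2 / alpha) @ (alpha * W1) = W2 @ W1 is unchanged, so the network
# computes the same function at a different point in parameter space.
y_original = W2 @ (W1 @ x)
y_rescaled = (W2 / alpha) @ ((alpha * W1) @ x)

print(np.allclose(y_original, y_rescaled))  # True
```

Because a theoretical invariant of the model should not change under such reparameterizations, checking this invariance is a natural sanity test for any numerical LLC estimator.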

Authors (2)

Zach Furman

Edmund Lau

Citation Format

Furman, Z., & Lau, E. (2024). Estimating the Local Learning Coefficient at Scale. https://doi.org/10.48550/arXiv.2402.03698

Journal Information
Publication Year
2024
Language
en
Total Citations
5
Source Database
Semantic Scholar
DOI
10.48550/arXiv.2402.03698
Access
Open Access ✓