arXiv
Open Access
2024
A Proof of Exact Convergence Rate of Gradient Descent. Part I. Performance Criterion $\Vert \nabla f(x_N)\Vert^2/(f(x_0)-f_*)$
Jungbin Kim
Abstract
We prove the exact worst-case convergence rate of gradient descent for smooth strongly convex optimization, with respect to the performance criterion $\Vert \nabla f(x_N)\Vert^2/(f(x_0)-f_*)$. The proof differs from the previous one by Rotaru \emph{et al.} [RGP24], and is based on the performance estimation methodology [DT14].
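To make the performance criterion concrete, here is a minimal sketch (not from the paper) of gradient descent with step size $1/L$ on an illustrative $L$-smooth, $\mu$-strongly convex quadratic, computing the ratio $\Vert \nabla f(x_N)\Vert^2/(f(x_0)-f_*)$ after $N$ iterations. The problem data, step size, and iteration count are assumptions for illustration only.

```python
import numpy as np

# Illustrative quadratic f(x) = 0.5 x^T A x with eigenvalues in [mu, L],
# so f is L-smooth and mu-strongly convex, minimized at x_* = 0 with f_* = 0.
# These constants are hypothetical choices, not values from the paper.
L_smooth, mu = 10.0, 1.0
A = np.diag([mu, 4.0, L_smooth])

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

x0 = np.array([1.0, 1.0, 1.0])
N = 20
x = x0.copy()
for _ in range(N):
    # Gradient descent step with the standard step size 1/L.
    x = x - (1.0 / L_smooth) * grad(x)

# The performance criterion studied in the paper:
# squared gradient norm at the last iterate over the initial objective gap.
criterion = np.linalg.norm(grad(x)) ** 2 / (f(x0) - 0.0)
print(criterion)
```

The worst-case analysis in the paper bounds this ratio uniformly over all $L$-smooth, $\mu$-strongly convex functions and all starting points, not just the single instance sketched here.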
Authors (1)
Jungbin Kim
Journal Information
- Year Published: 2024
- Language: en
- Source Database: arXiv
- Access: Open Access ✓