arXiv Open Access 2020

On the derivatives of feed-forward neural networks

Rabah Abdul Khalek and Valerio Bertone

Abstract

In this paper we present a C++ implementation of the analytic derivative of a feed-forward neural network with respect to its free parameters for an arbitrary architecture, known as back-propagation. We dubbed this code NNAD (Neural Network Analytic Derivatives) and interfaced it with the widely-used ceres-solver minimiser to fit neural networks to pseudodata in two different least-squares problems. The first is a direct fit of Legendre polynomials. The second is a somewhat more involved minimisation problem where the function to be fitted takes part in an integral. Finally, using a consistent framework, we assess the efficiency of our analytic derivative formula as compared to numerical and automatic differentiation as provided by ceres-solver. We thus demonstrate the advantage of using NNAD in problems involving both deep and shallow neural networks.


Citation

Khalek, R.A., Bertone, V. (2020). On the derivatives of feed-forward neural networks. https://arxiv.org/abs/2005.07039

Journal Information
Year Published: 2020
Language: English
Source Database: arXiv
Access: Open Access ✓