Semantic Scholar · Open Access · 2018 · 1046 citations

Meta-learning with differentiable closed-form solvers

Luca Bertinetto João F. Henriques Philip H. S. Torr A. Vedaldi

Abstract

Adapting deep networks to new concepts from a few examples is challenging, due to the high computational requirements of standard fine-tuning procedures. Most work on few-shot learning has thus focused on simple learning techniques for adaptation, such as nearest neighbours or gradient descent. Nonetheless, the machine learning literature contains a wealth of methods that learn non-deep models very efficiently. In this paper, we propose to use these fast convergent methods as the main adaptation mechanism for few-shot learning. The main idea is to teach a deep network to use standard machine learning tools, such as ridge regression, as part of its own internal model, enabling it to quickly adapt to novel data. This requires back-propagating errors through the solver steps. While normally the cost of the matrix operations involved in such a process would be significant, by using the Woodbury identity we can make the small number of examples work to our advantage. We propose both closed-form and iterative solvers, based on ridge regression and logistic regression components. Our methods constitute a simple and novel approach to the problem of few-shot learning and achieve performance competitive with or superior to the state of the art on three benchmarks.
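The abstract's core computational point is that ridge regression has a closed-form solution, and the Woodbury identity lets the required inverse be taken over an n × n matrix (n = number of support examples) rather than a d × d matrix (d = embedding dimension), which is a large saving in the few-shot regime where n ≪ d. The following NumPy sketch illustrates that identity in isolation; it is not the authors' implementation (which back-propagates through the solver inside a deep network), and the function name and shapes are illustrative assumptions.

```python
import numpy as np

def ridge_woodbury(X, Y, lam=1.0):
    """Ridge regression solved in the sample domain via the Woodbury identity.

    X: (n, d) support-set embeddings; Y: (n, c) one-hot labels.
    With n << d, inverting the n x n matrix (X X^T + lam I) is far cheaper
    than inverting the d x d matrix (X^T X + lam I).
    """
    n = X.shape[0]
    # W = X^T (X X^T + lam I)^{-1} Y, algebraically equal to
    # the standard form (X^T X + lam I)^{-1} X^T Y.
    A = X @ X.T + lam * np.eye(n)
    return X.T @ np.linalg.solve(A, Y)

# Consistency check against the standard (feature-domain) formulation.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 64))   # 5 support examples, 64-dim features
Y = np.eye(5)                      # one class per example
W_fast = ridge_woodbury(X, Y, lam=0.1)
W_std = np.linalg.solve(X.T @ X + 0.1 * np.eye(64), X.T @ Y)
assert np.allclose(W_fast, W_std)
```

Because every operation here (matrix products and a linear solve) is differentiable, errors can in principle be back-propagated through this solver into the embedding network, which is the mechanism the paper exploits.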

Authors (4)

Luca Bertinetto

João F. Henriques

Philip H. S. Torr

A. Vedaldi

Citation Format

Bertinetto, L., Henriques, J.F., Torr, P.H.S., Vedaldi, A. (2018). Meta-learning with differentiable closed-form solvers. https://www.semanticscholar.org/paper/208cd4b25768f0096fb2e80e7690473da0e2a563

Quick Access

PDF not directly available
Journal Information

Publication Year: 2018
Language: en
Total Citations: 1046×
Source Database: Semantic Scholar
Access: Open Access ✓