arXiv
Open Access
2019
Contrastive Learning for Lifted Networks
Christopher Zach
Virginia Estellers
Abstract
In this work we address supervised learning of neural networks via lifted network formulations. Lifted networks are interesting because they allow training on massively parallel hardware and assign energy models to discriminatively trained neural networks. We demonstrate that the training methods for lifted networks proposed in the literature have significant limitations, and we show how a contrastive loss addresses those limitations. This contrastive training approximates back-propagation in theory and in practice, and it is superior to the training objective regularly used for lifted networks.
Journal Information
- Year of Publication: 2019
- Language: en
- Database Source: arXiv
- Access: Open Access ✓