arXiv Open Access 2023

Choosing Public Datasets for Private Machine Learning via Gradient Subspace Distance

Xin Gu, Gautam Kamath, Zhiwei Steven Wu

Abstract

Differentially private stochastic gradient descent privatizes model training by injecting noise into each iteration, where the noise magnitude increases with the number of model parameters. Recent works suggest that we can reduce the noise by leveraging public data for private machine learning, by projecting gradients onto a subspace prescribed by the public data. However, given a choice of public datasets, it is not a priori clear which one may be most appropriate for the private task. We give an algorithm for selecting a public dataset by measuring a low-dimensional subspace distance between gradients of the public and private examples. We provide theoretical analysis demonstrating that the excess risk scales with this subspace distance. This distance is easy to compute and robust to modifications in the setting. Empirical evaluation shows that trained model accuracy is monotone in this distance.
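The subspace distance described above can be illustrated with a short sketch. The snippet below is an assumption-laden illustration, not the paper's exact algorithm: it takes per-example gradient matrices for the public and private datasets, extracts the top-k right singular subspace of each, and computes the projection-metric distance derived from the principal angles between the two subspaces. The function names `top_k_subspace` and `subspace_distance` are hypothetical.

```python
import numpy as np

def top_k_subspace(grads, k):
    # grads: (n_examples, n_params) matrix of per-example gradients.
    # Returns an orthonormal basis (n_params, k) for the top-k right
    # singular subspace, i.e. the dominant gradient directions.
    _, _, vt = np.linalg.svd(grads, full_matrices=False)
    return vt[:k].T

def subspace_distance(grads_pub, grads_priv, k):
    # Projection-metric distance between the two top-k gradient subspaces,
    # computed from the principal angles: sqrt(k - ||U^T V||_F^2).
    # Distance is 0 when the subspaces coincide and sqrt(k) when orthogonal.
    u = top_k_subspace(grads_pub, k)
    v = top_k_subspace(grads_priv, k)
    cos_angles = np.linalg.svd(u.T @ v, compute_uv=False)
    cos_angles = np.clip(cos_angles, 0.0, 1.0)
    return float(np.sqrt(max(k - np.sum(cos_angles ** 2), 0.0)))
```

Under this sketch, one would compute the distance between the private gradients and each candidate public dataset's gradients, then pick the public dataset with the smallest distance, consistent with the paper's finding that trained model accuracy is monotone in this distance.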

Authors (3)


Xin Gu


Gautam Kamath


Zhiwei Steven Wu

Citation Format

Gu, X., Kamath, G., Wu, Z.S. (2023). Choosing Public Datasets for Private Machine Learning via Gradient Subspace Distance. https://arxiv.org/abs/2303.01256

Journal Information
Publication Year
2023
Language
en
Database Source
arXiv
Access
Open Access ✓