Semantic Scholar · Open Access · 2018 · 437 citations

Why Is My Classifier Discriminatory?

I. Chen, Fredrik D. Johansson, D. Sontag

Abstract

Recent attempts to achieve fairness in predictive models focus on the balance between fairness and accuracy. In sensitive applications such as healthcare or criminal justice, this trade-off is often undesirable, as any increase in prediction error could have devastating consequences. In this work, we argue that the fairness of predictions should be evaluated in the context of the data, and that unfairness induced by inadequate sample sizes or unmeasured predictive variables should be addressed through data collection, rather than by constraining the model. We decompose cost-based metrics of discrimination into bias, variance, and noise, and propose actions aimed at estimating and reducing each term. Finally, we perform case studies on prediction of income, mortality, and review ratings, confirming the value of this analysis. We find that data collection is often a means to reduce discrimination without sacrificing accuracy.
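The abstract's central claim, that part of the error gap between groups comes from a variance term that shrinks with more data, can be illustrated with a toy Monte Carlo sketch. This is not the paper's method or data: the 1-D Gaussian setup, the midpoint-of-means classifier, and the sample sizes below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_group(n, mu_neg=0.0, mu_pos=2.0):
    # Balanced 1-D sample: negatives centered at mu_neg, positives at mu_pos.
    y = np.r_[np.zeros(n // 2, dtype=int), np.ones(n - n // 2, dtype=int)]
    x = np.where(y == 1, mu_pos, mu_neg) + rng.normal(0.0, 1.0, n)
    return x, y

def fit_threshold(x, y):
    # Toy classifier: threshold at the midpoint of the estimated class means.
    return 0.5 * (x[y == 0].mean() + x[y == 1].mean())

def zero_one_error(thr, x, y):
    return float(np.mean((x > thr).astype(int) != y))

# Large held-out set; both groups share the same noise level here,
# so any error gap below is attributable to estimation variance.
x_test, y_test = make_group(20000)

def expected_error(n_train, trials=300):
    # Monte Carlo estimate of the expected test error for a group
    # whose model is trained on n_train examples.
    errs = [zero_one_error(fit_threshold(*make_group(n_train)), x_test, y_test)
            for _ in range(trials)]
    return float(np.mean(errs))

err_small = expected_error(10)    # undersampled group: large variance term
err_large = expected_error(2000)  # well-sampled group: variance nearly gone
gap = err_small - err_large       # discrimination attributable to variance
```

In this sketch the noise term (overlap of the two classes) is identical for both groups, so the undersampled group's higher expected error is pure variance, and collecting more data for that group closes the gap without constraining the model, which is the abstract's point.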

Authors (3)

I. Chen

Fredrik D. Johansson

D. Sontag

Citation Format

Chen, I., Johansson, F.D., Sontag, D. (2018). Why Is My Classifier Discriminatory?. https://www.semanticscholar.org/paper/fd674e10770eb72e66a20e1c752c62dc7c12c0a4

Publication Information
Year Published: 2018
Language: en
Total Citations: 437
Source Database: Semantic Scholar
Access: Open Access ✓