Semantic Scholar · Open Access · 1996 · 2571 citations

Assessing Agreement on Classification Tasks: The Kappa Statistic

J. Carletta

Abstract

Currently, computational linguists and cognitive scientists working in the area of discourse and dialogue argue that their subjective judgments are reliable using several different statistics, none of which are easily interpretable or comparable to each other. Meanwhile, researchers in content analysis have already experienced the same difficulties and come up with a solution in the kappa statistic. We discuss what is wrong with reliability measures as they are currently used for discourse and dialogue work in computational linguistics and cognitive science, and argue that we would be better off as a field adopting techniques from content analysis.
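The kappa statistic the abstract advocates corrects raw agreement for the agreement two coders would reach by chance. A minimal sketch of Cohen's kappa for two coders on a nominal classification task is below; the function and variable names are illustrative, not from the paper, and labels are assumed to arrive as parallel lists.

```python
from collections import Counter

def cohen_kappa(coder_a, coder_b):
    """Kappa = (P(A) - P(E)) / (1 - P(E)), where P(A) is observed
    agreement and P(E) is the agreement expected by chance given each
    coder's marginal label distribution."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items both coders label identically.
    p_a = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: product of marginal proportions, summed over labels.
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    p_e = sum((freq_a[k] / n) * (freq_b[k] / n) for k in freq_a)
    return (p_a - p_e) / (1 - p_e)

a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
print(round(cohen_kappa(a, b), 3))  # → 0.5
```

Here the coders agree on 6 of 8 items (P(A) = 0.75), but balanced marginals make P(E) = 0.5, so kappa is only 0.5. Carletta, following the content-analysis literature, discusses thresholds such as K > .8 for good reliability, which is why chance-corrected agreement is more interpretable than raw percent agreement.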

Author (1)

J. Carletta

Citation Format

Carletta, J. (1996). Assessing Agreement on Classification Tasks: The Kappa Statistic. https://www.semanticscholar.org/paper/613b6c9a85ae338cd3b405dc019c8edb1c15717c

Quick Access

PDF not directly available; check the original source.
Journal Information
Year Published
1996
Language
en
Total Citations
2571×
Database Source
Semantic Scholar
Access
Open Access ✓