Agreement and Information in the Reliability of Coding

Penn collection
Departmental Papers (ASC)
Subject
Communication
Social and Behavioral Sciences
Abstract

Coefficients that assess the reliability of data-making processes – coding text, transcribing interviews, or categorizing observations into analyzable terms – are mostly conceptualized in terms of the agreement that a set of coders, observers, judges, or measuring instruments exhibits. When variation is low, reliability coefficients reveal their dependence on an often neglected phenomenon: the amount of information that reliability data provide about the reliability of the coding process or the data it generates. This paper explores the concept of reliability, simple agreement, four conceptions of chance used to correct that agreement, and sources of information deficiency, and it develops two measures of information about reliability, akin to the power of a statistical test, intended as companions to traditional reliability coefficients, especially Krippendorff's (2004, pp. 221-250; Hayes & Krippendorff, 2007) alpha.
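For readers unfamiliar with the coefficient the abstract singles out, the following is a minimal sketch of Krippendorff's alpha for nominal data, computed from a coincidence matrix as alpha = 1 − D_o/D_e (observed over expected disagreement). This is only an illustration of the agreement coefficient being discussed, not an implementation of the paper's proposed information measures, which are defined in the full text.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    units: list of lists; each inner list holds the values that the
    coders assigned to one unit (missing codings simply omitted).
    """
    # Build the coincidence matrix from all pairable values: each ordered
    # pair of values within a unit coded by m coders contributes 1/(m - 1).
    o = Counter()                      # o[(c, k)] = coincidences of c and k
    for values in units:
        m = len(values)
        if m < 2:
            continue                   # a unit coded once is not pairable
        for c, k in permutations(values, 2):
            o[(c, k)] += 1 / (m - 1)
    n_c = Counter()                    # marginal totals per category
    for (c, _), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())              # total number of pairable values
    # Nominal metric: any mismatch counts as disagreement 1.
    D_o = sum(w for (c, k), w in o.items() if c != k) / n
    D_e = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k) / (n * (n - 1))
    return 1 - D_o / D_e

# Perfect agreement between two coders yields alpha = 1; systematic
# disagreement drives alpha below 0.
krippendorff_alpha_nominal([[1, 1], [2, 2], [1, 1]])   # perfect agreement
krippendorff_alpha_nominal([[1, 1], [1, 1], [1, 2], [2, 2]])
```

Note how the coefficient's dependence on variation shows up directly: when nearly all values fall into one category, D_e shrinks, and a single disagreement moves alpha sharply, which is the low-variation sensitivity the abstract describes.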

Publication date
2011-01-01
Journal title
Communication Methods and Measures