Appendix E: Content Coding and Intercoder Reliability
For the web site and social media content analyses, intercoder reliability scores were calculated for the variables used in the analysis that involved an interpretive dimension: whether the story/post was Original, whether it was About the Community, and whether it Addressed a Critical Information Need. To err on the side of inclusiveness, stories/posts coded as Unclear for the Original and About Community variables were recoded as Yes; that is, the Yes and Unclear coding categories were collapsed for purposes of calculating intercoder reliability. Similarly, because the analyses below use the Critical Information Needs variable in binary form (i.e., Yes or No), the eight critical information needs categories also were collapsed into a single Yes category for purposes of calculating intercoder reliability.
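To make the recoding step concrete, the sketch below shows one way the collapsing described above could be carried out. It is illustrative only and is not the study's actual coding pipeline; the field names ("original", "about_community", "cin_category") and the representation of uncoded items are assumptions introduced here for the example.

```python
# Illustrative sketch of the category collapsing described above.
# Field names and data representation are hypothetical, not from the study.

def collapse_codes(item):
    """Collapse coding categories into binary Yes/No values."""
    recoded = {}
    # Unclear is treated inclusively: it is recoded as Yes.
    for var in ("original", "about_community"):
        recoded[var] = "Yes" if item[var] in ("Yes", "Unclear") else "No"
    # Any of the eight critical information needs categories counts as Yes;
    # an item with no CIN category assigned counts as No.
    recoded["cin"] = "No" if item["cin_category"] in (None, "", "None") else "Yes"
    return recoded

# Example: a post coded Unclear for Original, No for About Community,
# and assigned one of the eight CIN categories (here, "Health").
print(collapse_codes({"original": "Unclear",
                      "about_community": "No",
                      "cin_category": "Health"}))
# {'original': 'Yes', 'about_community': 'No', 'cin': 'Yes'}
```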
For the social media analyses, the average pairwise agreement across the three coders was 81 percent for the Critical Information Needs and About Community variables, and 100 percent for the Originality variable. For the web site analyses, the average pairwise agreement across the three coders was 79 percent for Critical Information Needs, 89 percent for About Community, and 81 percent for Originality. According to Neuendorf (2002), agreement levels of 80 percent or greater are generally acceptable, with levels in the 70 percent range appropriate for exploratory studies of new indices (as is the case here).
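For readers unfamiliar with the measure, the sketch below illustrates how average pairwise percent agreement across three coders can be computed: agreement is calculated for each pair of coders and the pairwise values are then averaged. The coder data are toy values invented for the example, not the study's coding data.

```python
# Illustrative calculation of average pairwise percent agreement.
# The coder ratings below are invented toy data.
from itertools import combinations

def pairwise_agreement(codes_a, codes_b):
    """Percent agreement between two coders rating the same items."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

def average_pairwise_agreement(coders):
    """Average percent agreement across all pairs of coders."""
    pairs = list(combinations(coders, 2))
    return sum(pairwise_agreement(a, b) for a, b in pairs) / len(pairs)

# Three coders rating ten items on a collapsed binary variable.
coder_1 = ["Yes", "Yes", "No", "Yes", "No", "Yes", "Yes", "No", "Yes", "Yes"]
coder_2 = ["Yes", "No",  "No", "Yes", "No", "Yes", "Yes", "No", "Yes", "Yes"]
coder_3 = ["Yes", "Yes", "No", "Yes", "Yes", "Yes", "No", "No", "Yes", "Yes"]

# Pairwise agreements are 0.9, 0.8, and 0.7, so the average is 0.8.
print(round(average_pairwise_agreement([coder_1, coder_2, coder_3]), 2))  # 0.8
```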