Readability Standards for Informed-Consent Forms as Compared With Actual Readability
Paasche-Orlow, Michael K.
Taylor, Holly A.
Brancati, Frederick L.
New England Journal of Medicine 2003 February 20; 348(8): 721-726
BACKGROUND: Institutional review boards (IRBs) are charged with safeguarding potential research subjects with limited literacy but may have an inadvertent role in promulgating unreadable consent forms. We hypothesized that text provided by IRBs in informed-consent forms falls short of the IRBs' own readability standards and that readability is influenced by the level of research activity, local literacy rates, and federal oversight.

METHODS: To test these hypotheses, we conducted a cross-sectional study linking data from several public-use sources. A total of 114 Web sites of U.S. medical schools were surveyed for IRB readability standards and informed-consent-form templates. Actual readability was measured with the Flesch-Kincaid scale, which assigns a score on the basis of the minimal grade level required to read and understand English text (range, 0 to 12). Data on the level of research activity, local literacy rates, and federal oversight were obtained from organizational Web sites.

RESULTS: The average readability score for text provided by IRBs was 10.6 (95 percent confidence interval, 10.3 to 10.8) on the Flesch-Kincaid scale. Specific readability standards, found on 61 Web sites (54 percent), ranged from a 5th-grade reading level to a 10th-grade reading level. The mean Flesch-Kincaid scores for the readability of sample text provided by IRBs exceeded the stated standard by 2.8 grade levels (95 percent confidence interval, 2.4 to 3.2; P
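The Flesch-Kincaid grade level used in the study's methods can be sketched as follows. This is a minimal illustration of the standard formula (0.39 × words per sentence + 11.8 × syllables per word − 15.59), not the implementation used by the authors; the syllable counter is a naive vowel-group heuristic assumed here for self-containment, and real scoring tools use more elaborate syllabification.

```python
import re

def count_syllables(word: str) -> int:
    """Naive syllable estimate: count vowel groups, drop a trailing silent 'e'.
    This heuristic is an assumption for illustration only."""
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_kincaid_grade(text: str) -> float:
    """Standard Flesch-Kincaid grade-level formula applied to plain text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Grade = 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

Note that the raw formula is unbounded; the 0-to-12 range reported in the abstract reflects the truncated grade-level scale used by common word-processor implementations, so scores from this sketch will not match the study's exactly.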
Showing items related by title, author, creator and subject.
Dickert, Neal; Kass, Nancy; Paasche-Orlow, Michael; Taylor, Holly (2005-01) Although the informed consent process is crucial to protecting human research subjects, there are cases when particular information within the consent form may present risks to those subjects. In this paper, we examine a ...