Document 0753
DOCN M9620753
TI How reliable is computerized assessment of readability?
DT 9602
AU Mailloux SL; Johnson ME; Fisher DG; Pettibone TJ; Kenai Care Center,
Alaska, USA.
SO Comput Nurs. 1995 Sep-Oct;13(5):221-5. Unique Identifier : AIDSLINE
MED/96005263
AB To assess the consistency and comparability of readability software,
four programs (Corporate Voice, Grammatik IV, Microsoft Word for
Windows, and RightWriter) were compared. Standard
materials included 28 pieces of printed educational materials on human
immunodeficiency virus/acquired immunodeficiency syndrome distributed
nationally and the Gettysburg Address. Statistical analyses for the
educational materials revealed that each of the three formulas assessed
(Flesch-Kincaid, Flesch Reading Ease, and Gunning Fog Index) provided
significantly different grade equivalent scores and that the Microsoft
Word program provided significantly lower grade levels and was more
inconsistent in the scores provided. For the Gettysburg Address,
considerable variation was revealed among formulas, with the discrepancy
being up to two grade levels. Averaging across formulas, scores varied
by 1.3 grade levels among the four software programs.
Given the variation between formulas and programs, implications for
decisions based on results of these software programs are provided.
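The three formulas named above use published constants; the divergent scores reported here stem largely from how each program counts sentences, words, and syllables. A minimal Python sketch of all three (the regex tokenizer and vowel-group syllable counter are crude heuristics of my own, not any of the compared programs' methods, which is exactly the kind of implementation detail that makes programs disagree):

```python
import re

def _syllables(word):
    # Crude heuristic: count vowel groups, dropping a trailing silent 'e'.
    # Real tools use exception dictionaries, another source of divergence.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def readability(text):
    # Tokenize sentences and words with simple regexes (an assumption;
    # each commercial program has its own tokenizer).
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(_syllables(w) for w in words)
    complex_words = sum(1 for w in words if _syllables(w) >= 3)
    wps = len(words) / len(sentences)       # words per sentence
    spw = syllables / len(words)            # syllables per word
    return {
        "flesch_reading_ease": 206.835 - 1.015 * wps - 84.6 * spw,
        "flesch_kincaid_grade": 0.39 * wps + 11.8 * spw - 15.59,
        "gunning_fog": 0.4 * (wps + 100 * complex_words / len(words)),
    }
```

Running the same passage through this sketch with a different tokenizer or syllable counter shifts all three scores, which mirrors the between-program variation the study measured.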
DE Analysis of Variance; *Artificial Intelligence; Comparative Study;
Evaluation Studies; *Health Education; *Reading; Support, U.S. Gov't,
P.H.S.; JOURNAL ARTICLE
SOURCE: National Library of Medicine. NOTICE: This material may be
protected by Copyright Law (Title 17, U.S.Code).