Using automated questions to assess reading comprehension, vocabulary, and effects of tutorial interventions

Jack Mostow, Joseph E. Beck, Juliet Bey, Andrew Cuneo, J. Sison, Brian Tobin, and Joseph Valeri
Technology, Instruction, Cognition and Learning, Vol. 2, 2004, pp. 97 - 134.



Abstract
We describe the automated generation and use of 69,326 comprehension cloze questions and 5,668 vocabulary matching questions in the 2001-2002 version of Project LISTEN's Reading Tutor used by 364 students in grades 1-9 at seven schools. To validate our methods, we used students' performance on these multiple-choice questions to predict their scores on the Woodcock Reading Mastery Test. A model based on students' cloze performance predicted their Passage Comprehension scores with correlation R=.85. The percentage of vocabulary words that students matched correctly to their definitions predicted their Word Comprehension scores with correlation R=.61.
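The paper's generator produced its cloze items from the Reading Tutor's own story text; as a minimal illustration of the multiple-choice cloze format only (the function and distractor choices here are ours, not the paper's):

```python
import random

def make_cloze_question(sentence, target, distractors, rng=random):
    """Build a multiple-choice cloze item by blanking out `target`.

    Returns the stem with a blank, the shuffled answer choices,
    and the correct answer.
    """
    stem = sentence.replace(target, "____", 1)
    choices = distractors + [target]
    rng.shuffle(choices)
    return stem, choices, target

# Hypothetical example sentence and distractors:
stem, choices, answer = make_cloze_question(
    "The tortoise won the race against the hare.",
    "race",
    ["cake", "song", "cloud"],
)
```

A student's percentage of such items answered correctly is the kind of score the paper fed into its model predicting Woodcock Passage Comprehension.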

We used both types of questions in a within-subject automated experiment to compare four ways to preview new vocabulary before a story: defining the word, giving a synonym, asking about the word, and doing nothing. Outcomes included comprehension as measured by performance on multiple-choice cloze questions during the story, and vocabulary as measured by matching words to their definitions in a posttest after the story. A synonym or short definition significantly improved posttest performance compared to just encountering the word in the story - but only for words students didn't already know, and only if they had a grade 4 or better vocabulary. Such a preview significantly improved performance during the story on cloze questions involving the previewed word - but only for students with a grade 1-3 vocabulary.
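The paper's own randomization procedure is not reproduced here; purely as a sketch of how a balanced within-subject assignment of the four preview conditions might look (all names are ours):

```python
import random

# The four preview conditions compared in the experiment.
CONDITIONS = ["definition", "synonym", "question", "nothing"]

def assign_previews(vocabulary_words, rng=random):
    """Assign each new vocabulary word one of the four preview
    conditions, shuffling word order and cycling through conditions
    so each student sees all conditions roughly equally often."""
    words = list(vocabulary_words)
    rng.shuffle(words)
    return {word: CONDITIONS[i % len(CONDITIONS)]
            for i, word in enumerate(words)}

# Hypothetical word list for one story:
assignment = assign_previews(["meander", "plume", "tether", "glade"])
```

Because every student contributes words to every condition, condition effects can be estimated within subjects rather than between them.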

Notes
Associated Lab(s) / Group(s): Project LISTEN
Associated Project(s): Project LISTEN's Reading Tutor
Number of pages: 38

Text Reference
Jack Mostow, Joseph E. Beck, Juliet Bey, Andrew Cuneo, J. Sison, Brian Tobin, and Joseph Valeri, "Using automated questions to assess reading comprehension, vocabulary, and effects of tutorial interventions," Technology, Instruction, Cognition and Learning, Vol. 2, 2004, pp. 97 - 134.

BibTeX Reference
@article{Mostow_2004_4989,
   author = "Jack Mostow and Joseph E. Beck and Juliet Bey and Andrew Cuneo and J. Sison and Brian Tobin and Joseph Valeri",
   title = "Using automated questions to assess reading comprehension, vocabulary, and effects of tutorial interventions",
   journal = "Technology, Instruction, Cognition and Learning",
   pages = "97--134",
   year = "2004",
   volume = "2",
}