A New Method for Analyzing Students’ Answers for Constructed Response Items in Formative Assessments - An Application of a Topic Model

Location

Measurement and Statistics - Boston 1

Proposal Track

Research Project

Session Format

Presentation

Abstract

Constructed response (CR) items are often used to assess higher-order thinking skills (Brookhart, 2010). Answers to CR items are scored with a rubric, and the resulting scores are then analyzed, for example, to understand the effects of instruction on the objectives measured by the test. In this way, rubric-based scores provide useful information. The text of written answers, however, contains additional information that may not be fully captured by rubric-based scores (Cardozo-Gaibisso et al., 2019). Topic modeling provides a useful set of methods for extracting this additional information from the text.

Topic models are statistical methods for extracting latent themes from the text in a collection of documents. Recent research has shown that these models can be used to analyze CR responses to provide evidence of examinee cognitive processing (Copur-Gencturk, Choi, & Cohen, in preparation).
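As a rough illustration of this kind of analysis (a minimal sketch, not the implementation used in this study), the Python snippet below fits a latent Dirichlet allocation (LDA) topic model to a handful of invented CR-style responses using scikit-learn; the example responses, the vocabulary handling, and the number of topics are all assumptions chosen for demonstration.

    # A minimal sketch of topic modeling on CR responses with scikit-learn.
    # The responses and the number of topics (n_components) are hypothetical;
    # a real analysis would tune both.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    responses = [
        "The character changes because she learns to trust others",
        "Trade routes spread goods and ideas between regions",
        "The author uses imagery to show the setting is dangerous",
        "New ideas moved along trade routes with merchants",
    ]

    # Convert the responses to a document-term matrix of word counts.
    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(responses)

    # Fit the LDA model; each response gets a distribution over latent topics.
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    theta = lda.fit_transform(X)  # rows: responses, columns: topic proportions

    # Show the highest-weight words for each latent topic.
    terms = vectorizer.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[::-1][:5]]
        print(f"Topic {k}: {top}")

Each row of theta is one response's estimated mixture over latent themes, which is the quantity a substantive analysis would interpret alongside the rubric.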

In this study, we will demonstrate how a topic model, used alongside rubric-based scores, can enrich the information obtained from CR responses and thus support a more complete assessment of what students know. We do this in the context of formative assessments in English Language Arts and social studies for Grades 6-10.
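To make the combination of topic-model output and rubric scores concrete (again a hedged sketch under assumed data, not the study's procedure), one simple analysis is to average the topic proportions of responses at each rubric score level; score levels with similar rubric values but different topic profiles would signal information that the scores alone do not capture.

    # Continuing the sketch above: theta holds each response's topic
    # proportions from lda.fit_transform(X). The rubric scores below are
    # hypothetical, one per example response.
    import numpy as np

    scores = np.array([2, 1, 2, 1])  # assumed rubric-based scores

    # Mean topic usage at each score level; differing profiles at the same
    # score suggest distinctions the rubric does not record.
    for s in np.unique(scores):
        profile = theta[scores == s].mean(axis=0)
        print(f"Score {s}: mean topic proportions = {np.round(profile, 2)}")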

References

Brookhart, S. M. (2010). How to assess higher-order thinking skills in your classroom. Alexandria, VA: ASCD.

Cardozo-Gaibisso, L., Kim, S., Buxton, C., & Cohen, A. (2019, July). Thinking beyond the score with SFL and text analysis: Multidimensional analysis of student assessment performance. Paper presented at the 46th International Systemic Functional Congress, Santiago, Chile.

Copur-Gencturk, Y., Choi, H.-J., & Cohen, A. S. (in preparation). Qualitative and quantitative assessment of teachers' proportional reasoning. Athens, GA: University of Georgia.

Keywords

topic model, constructed response item

Professional Bio

Hye-Jeong Choi is an Associate Research Scientist and an Adjunct Assistant Professor in the Quantitative Methodology Program in the Department of Educational Psychology at the University of Georgia. Her expertise is in applied statistics with a focus on psychometric and latent variable models. Dr. Choi's research focuses on item response theory and diagnostic measurement.

Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.

Session Time

Oct 4th, 1:45 PM - 3:30 PM
