In recent years, the focus of large-scale assessments has begun to shift from summative evaluations of student achievement to more actionable, fine-grained information about students' proficiency in the skills defined by cognitive learning models. This change in emphasis not only benefits teachers, students, and parents, but also offers opportunities for researchers to better understand student learning and skill acquisition. To meet these challenges and seize these opportunities, we need innovative psychometric models that can be used to report valid results and to understand student cognition.
In this presentation, we describe a framework for using diagnostic classification models (DCMs; also called cognitive diagnostic models) within an operational assessment program to measure student proficiency and evaluate hierarchies in skill acquisition. The diagnostic framework includes multiple methods and applications of DCMs, each providing complementary information for evaluating the theoretically implied skill hierarchy and for identifying potential causes of violations of that hierarchy. We then illustrate the framework in practice using data from a large-scale operational assessment, the Dynamic Learning Maps® (DLM®) Alternate Assessment System. DLM assessment results are calculated using DCMs, providing instructionally informative feedback within the proposed cognitive learning model structure. Thus, DCMs serve both for scoring and for evaluating skill hierarchies within the DLM assessments.
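To make the idea of evaluating a theorized skill hierarchy concrete, the sketch below checks estimated attribute-mastery profiles against a simple linear hierarchy. This is a hypothetical simplification for illustration only; the profile data, hierarchy structure, and conformance check are assumptions, not the DLM program's actual evaluation procedure.

```python
# Illustrative check of mastery profiles against a linear skill
# hierarchy A1 -> A2 -> A3 (each skill presumes the one before it).
# Each profile is a tuple of 0/1 mastery indicators for (A1, A2, A3).
profiles = [
    (1, 1, 1),  # conforms: all skills mastered
    (1, 0, 0),  # conforms: only the prerequisite skill mastered
    (0, 1, 0),  # violation: A2 mastered without its prerequisite A1
    (1, 1, 0),  # conforms
    (0, 0, 1),  # violation: A3 mastered without A1 and A2
]

def conforms_to_linear_hierarchy(profile):
    """A profile conforms when no skill is mastered unless every
    prerequisite skill earlier in the chain is also mastered,
    i.e., the indicators are non-increasing left to right."""
    return all(profile[i] >= profile[i + 1] for i in range(len(profile) - 1))

violations = [p for p in profiles if not conforms_to_linear_hierarchy(p)]
conformance_rate = 1 - len(violations) / len(profiles)
```

In practice, a high conformance rate across estimated profiles supports the theorized hierarchy, while the specific violating profiles point to where the hierarchy may need revision.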
The findings demonstrate that DCMs can be used both to provide actionable assessment results and to evaluate skill hierarchies, thereby deepening our understanding of student learning. Using the proposed DCM framework, we show that DLM skills largely conform to the hierarchy implied by the assessment design. Finally, we discuss how these findings can inform revisions to skill hierarchies and future avenues of research.