
Empirical methods for evaluating maps: Illustrations and results


Learning progressions and learning map structures are increasingly being used as the basis for the design of large-scale assessments. Of critical importance to these designs is the validity of the map structure used to build the assessments. Most commonly, evidence for the validity of a map structure is procedural, gathered during the learning map creation process (e.g., research literature, external reviews). However, it is also important to support the validity of the map structure with empirical evidence from data gathered during the assessment. In this paper, we propose a framework for the empirical validation of learning maps and progressions using diagnostic classification models. Three methods are proposed within this framework, each involving different model assumptions and supporting different types of inferences. The framework is then applied to the Dynamic Learning Maps (DLM) alternate assessment system to illustrate the utility and limitations of each method. Results show that each of the proposed methods has limitations, but together they provide complementary information for evaluating the proposed structure of content standards (Essential Elements) in the DLM assessment.

Jake Thompson

April 06, 2019

Transcript

  1. Empirical Methods for Evaluating Maps: Illustrations and Results
    W. Jake Thompson & Brooke Nash

  2. Methods for Evaluating Map Structure
    • External outcomes
    • Classical item statistics
    • Unidimensional models

  3. A Framework for Map Evaluation
    • Diagnostic Classification Models (DCMs)
    • Mastery profiles on the set of assessed skills
    • Three methods
    –Patterns of Mastery Profiles
    –Patterns of Attribute Mastery
    –Patterns of Attribute Difficulty

  4. An Illustrative Example
    • 3-attribute assessment
    • Linear map structure: Initial → Precursor → Target

  5. Map Structure in a DCM Context
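Since the original figure for this slide is not in the transcript, here is a minimal sketch, in Python, of the idea it illustrates: a linear map structure constrains which mastery profiles a DCM should allow. The attribute names come from the illustrative example; the code itself is an assumption, not material from the deck. The saturated model permits all 2^3 profiles, while the reduced model keeps only those consistent with the hypothesized prerequisite ordering.

```python
from itertools import product

# Attributes of the illustrative example, in hypothesized order.
attributes = ["Initial", "Precursor", "Target"]
# Linear hierarchy: each attribute requires the one before it.
prereq = {"Precursor": "Initial", "Target": "Precursor"}

def consistent(profile):
    """A profile respects the map if every mastered attribute's
    prerequisite is also mastered."""
    m = dict(zip(attributes, profile))
    return all(m[pre] >= m[att] for att, pre in prereq.items())

saturated = list(product([0, 1], repeat=len(attributes)))  # all 2**3 = 8 profiles
reduced = [p for p in saturated if consistent(p)]
print(reduced)  # [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]
```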

  6. Patterns of Mastery Profiles
    • Estimate two models
    – Saturated model with all profiles
    – Reduced model with only hypothesized profiles
    • Assess model fit
    – Posterior predictive model checks (sketched after the table below)
    – Model comparisons
    All 2^3 = 8 candidate profiles:
    Initial Precursor Target
    0 0 0
    1 0 0
    0 1 0
    0 0 1
    1 1 0
    1 0 1
    0 1 1
    1 1 1
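The deck names posterior predictive model checks but does not show the machinery, so the following is a hedged sketch of one possible check: compare the observed raw-score distribution against score distributions replicated from posterior draws of the fitted (saturated or reduced) DCM. The arrays post_prob and y_obs are hypothetical stand-ins, not DLM output.

```python
import numpy as np

rng = np.random.default_rng(2019)

# Hypothetical stand-ins: posterior draws of each student's item-response
# probabilities under a fitted DCM (draws x students x items), plus
# observed responses. Real values come from the estimated model and data.
n_draws, n_students, n_items = 500, 300, 12
post_prob = rng.beta(2, 2, size=(n_draws, n_students, n_items))
y_obs = rng.binomial(1, 0.55, size=(n_students, n_items))

obs_counts = np.bincount(y_obs.sum(axis=1), minlength=n_items + 1)

# For each posterior draw, simulate a replicated data set and tally how
# often the replicated count at each raw score meets or exceeds the
# observed count; extreme posterior predictive p-values signal misfit.
ppp = np.zeros(n_items + 1)
for d in range(n_draws):
    y_rep = rng.binomial(1, post_prob[d])
    rep_counts = np.bincount(y_rep.sum(axis=1), minlength=n_items + 1)
    ppp += rep_counts >= obs_counts
ppp /= n_draws
print(np.round(ppp, 2))  # values near 0 or 1 flag poorly reproduced scores
```

If the reduced model yields extreme posterior predictive p-values where the saturated model does not, that would suggest the hypothesized profile set cannot reproduce the data, casting doubt on the map structure.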

  7. Patterns of Attribute Mastery
    • Estimate each attribute as a separate 1-attribute DCM (equivalent to LCA)
    • Set mastery threshold (0.8)
    Posterior mastery probabilities:
    Student Initial Precursor Target
    1 .97 .85 .43
    2 .86 .52 .13
    3 .92 .89 .83
    4 .88 .65 .85
    5 .55 .70 .33
    … … … …
    Dichotomized mastery (threshold = 0.8):
    Student Initial Precursor Target
    1 1 1 0
    2 1 0 0
    3 1 1 1
    4 1 0 1
    5 0 0 0
    … … … …
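To tie the two tables together, here is a minimal sketch of the dichotomization and flagging step. The probabilities are taken from the table above; the reversal rule (mastery of a level without mastery of the level below it) is an assumption consistent with the linear Initial → Precursor → Target ordering.

```python
import numpy as np

# Posterior mastery probabilities from the slide (rows = students 1-5,
# columns = Initial, Precursor, Target).
probs = np.array([
    [0.97, 0.85, 0.43],
    [0.86, 0.52, 0.13],
    [0.92, 0.89, 0.83],
    [0.88, 0.65, 0.85],
    [0.55, 0.70, 0.33],
])

mastery = (probs >= 0.8).astype(int)  # apply the 0.8 mastery threshold

# A reversal is mastery of a level without mastery of the level below it;
# with columns in linear order, any positive first difference is a flag.
reversals = (np.diff(mastery, axis=1) > 0).any(axis=1)
print(mastery)    # matches the dichotomized table on the slide
print(reversals)  # only student 4 (profile 1 0 1) is flagged
```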

  8. Patterns of Attribute Difficulty
    • Measure attribute difficulty using classical p-values
    • Group similar respondents a priori
    • Calculate the weighted average p-value for each attribute and group
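A minimal sketch of this computation for a single a-priori group, with hypothetical item p-values and sample sizes; the monotonicity flag at the end is an assumed criterion, since the deck does not state the operational flagging rules.

```python
import numpy as np

levels = ["Initial", "Precursor", "Target"]

# Hypothetical item p-values (proportion correct) and item sample sizes
# for one a-priori group of respondents; real values would come from
# operational item statistics.
item_stats = {
    "Initial":   ([0.82, 0.78, 0.90], [120, 95, 110]),
    "Precursor": ([0.71, 0.66],       [130, 88]),
    "Target":    ([0.74, 0.72],       [60, 72]),
}

# Weighted average p-value per attribute, weighting items by sample size.
weighted = {lvl: np.average(p, weights=n) for lvl, (p, n) in item_stats.items()}

# Under a linear progression, lower levels should be easier for a given
# group: flag any adjacent pair where the higher level looks easier.
flags = [(lo, hi) for lo, hi in zip(levels, levels[1:])
         if weighted[hi] > weighted[lo]]
print({k: round(v, 3) for k, v in weighted.items()})
print(flags)  # [('Precursor', 'Target')]: Target appears easier than Precursor
```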

  9. Case Study: Dynamic Learning Maps
    • Each Essential Element (EE) is available at multiple levels of depth, breadth, and complexity
    –5 levels in ELA and mathematics
    –3 levels in science
    • Linkage levels are assumed to follow a linear progression
    • Students test on only one linkage level for each EE during the operational assessment

  10. Case Study: Dynamic Learning Maps
    • Patterns of Mastery Profiles
    – Models fail to converge due to missing data
    • Patterns of Attribute Mastery
    – The majority of flags were in ELA
    – More flags for reversals at higher linkage levels than at lower levels

  11. Case Study: Dynamic Learning Maps
    • Patterns of Attribute Difficulty
    –Flags by subject
    • 28 ELA EEs
    • 35 mathematics EEs
    • 0 science EEs

  12. Summary
    • Benefits and limitations of each method within the framework
    • Wide breadth of methods provides complementary information
    • Application to DLM shows insights that can be applied to future test and map development

  13. Ongoing Research
    • Continue to refine methods
    –Alternative modeling strategies for Patterns of Mastery Profiles
    –Simulation studies to inform empirical flagging criteria
    • Expanding beyond the progression of linkage levels within EEs to the more fine-grained map structure

  14. More Information
