Slide 36
Reliability (factor.sav)
Internal consistency
• Coefficient alpha
• Analyze, Scale, Reliability Analysis; select Alpha in the Model drop-down and move the items over
• Click Statistics; check all of the options in the “Descriptives for” box and check Correlations under “Inter-Item”
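The same analysis can be pasted as syntax; a minimal sketch, assuming the scale items in factor.sav are named item1 to item5 (adjust to the actual variable names):

  GET FILE='factor.sav'.
  * Coefficient alpha with item descriptives and inter-item correlations.
  RELIABILITY
    /VARIABLES=item1 item2 item3 item4 item5
    /SCALE('ALL VARIABLES') ALL
    /MODEL=ALPHA
    /STATISTICS=DESCRIPTIVE SCALE CORR
    /SUMMARY=TOTAL.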
Multiple Raters
• Cohen’s kappa (Agreement for 2 raters)
• Analyze, Descriptive Statistics, Crosstabs; move rater 1 to Rows and rater 2 to Columns
• Click Statistics and check Kappa
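The same kappa can be run from syntax; a sketch assuming the two raters’ ratings are stored in variables rater1 and rater2:

  * Cohen's kappa for agreement between two raters.
  CROSSTABS
    /TABLES=rater1 BY rater2
    /CELLS=COUNT
    /STATISTICS=KAPPA.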
• ICC (intraclass correlation coefficient)
• Analyze, Scale, Reliability Analysis; keep Alpha in the Model drop-down and move over the raters’ ratings as the items
• Click Statistics, check Intraclass correlation coefficient, and choose the appropriate model (see the notes and syntax sketch below)
ICC
• One-way random: when the raters did NOT rate all of the items
• Two-way random: if the raters rated all of the items but represent only a sample of the possible raters
• Two-way mixed: if the raters rated all of the items and represent all of the raters that will be used
• This is the most common choice for research
• If the variability due to the raters’ personal response tendencies is not treated as “error,” Consistency should be chosen in the Type drop-down on the right side; otherwise choose Absolute Agreement.
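In syntax, these choices correspond to the /ICC subcommand keywords MODEL(ONEWAY), MODEL(RANDOM), or MODEL(MIXED), with TYPE(CONSISTENCY) or TYPE(ABSOLUTE). A sketch of the common research case (two-way mixed, consistency), assuming the raters’ ratings are in variables rater1 to rater3:

  * ICC, two-way mixed model, consistency type, from the reliability procedure.
  RELIABILITY
    /VARIABLES=rater1 rater2 rater3
    /SCALE('ALL VARIABLES') ALL
    /MODEL=ALPHA
    /ICC=MODEL(MIXED) TYPE(CONSISTENCY) CIN=95 TESTVAL=0.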