Using item mapping to evaluate alignment between curriculum and assessment

Leah T Kaira, University of Massachusetts Amherst

Abstract

There is growing interest in the alignment between states' standards and test content, due in part to the accountability requirements of the No Child Left Behind (NCLB) Act of 2001. Among other problems, current alignment methods rely almost entirely on subjective judgment to assess curriculum-assessment alignment. In addition, none of the current alignment models accounts for students' actual performance on the assessment, and there are no consistent criteria for assessing alignment across the various models. Because of these problems, alignment results obtained with different models cannot be compared. This study applied item mapping to student response data from the Massachusetts Adult Proficiency Test (MAPT) for Math and Reading to assess alignment. Item response theory (IRT) was used to locate items on a proficiency scale, and two criterion response probability (RP) values were then applied to map each item to a proficiency category. Item mapping results were compared to item writers' classifications of the items. Chi-square tests, correlations, and logistic regression were used to assess the degree of agreement between the two sets of data. Seven teachers were convened for a one-day meeting to review items that did not map to the intended grade level and to explain the misalignment. Results show that, in general, there was higher agreement between the subject matter experts' (SMEs') classifications and the item mapping results at RP50 than at RP67. Higher agreement was also observed for items assessing lower-level cognitive abilities. Item difficulty, cognitive demand, clarity of the item, the level of vocabulary in an item relative to the reading level of examinees, and the mathematical concept being assessed were among the suggested reasons for misalignment.
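The mapping step can be illustrated with a minimal sketch: under a 3PL IRT model, each item is assigned the proficiency (theta) at which an examinee would answer it correctly with probability equal to the chosen RP value (0.50 or 0.67), and that theta is then binned into a proficiency category using cut scores. The item parameters, cut scores, and intended levels below are hypothetical; the dissertation's actual MAPT procedure and scale are not specified in the abstract and may differ.

```python
# Hedged sketch of RP-based item mapping; not the MAPT procedure itself.
# Assumes a 3PL IRT model and illustrative (hypothetical) item parameters and cut scores.
import math

D = 1.7  # scaling constant commonly used with the logistic IRT metric

def rp_theta(a, b, c, rp):
    """Theta at which the 3PL probability of a correct response equals `rp`."""
    if not c < rp < 1.0:
        raise ValueError("RP must lie between the guessing parameter c and 1")
    return b + math.log((rp - c) / (1.0 - rp)) / (D * a)

def map_to_level(theta, cuts):
    """Assign theta to a proficiency category given ascending cut scores."""
    for level, cut in enumerate(cuts):
        if theta < cut:
            return level
    return len(cuts)

# Hypothetical item parameters (a, b, c) and intended (SME-assigned) levels.
items = [
    {"a": 1.2, "b": -0.5, "c": 0.20, "sme_level": 1},
    {"a": 0.9, "b": 0.3, "c": 0.15, "sme_level": 2},
    {"a": 1.5, "b": 1.1, "c": 0.25, "sme_level": 3},
]
cuts = [-0.5, 0.5, 1.5]  # illustrative cut scores on the theta scale

for rp in (0.50, 0.67):
    agree = 0
    for item in items:
        theta = rp_theta(item["a"], item["b"], item["c"], rp)
        agree += map_to_level(theta, cuts) == item["sme_level"]
    print(f"RP{int(rp * 100)}: {agree}/{len(items)} items agree with intended levels")
```

With real item parameter estimates and cut scores, the same loop would produce the RP50 versus RP67 agreement comparison described in the abstract; the dissertation additionally evaluated agreement with chi-square tests, correlations, and logistic regression rather than simple percent agreement.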

Subject Area

Educational tests & measurements | Educational evaluation

Recommended Citation

Kaira, Leah T, "Using item mapping to evaluate alignment between curriculum and assessment" (2010). Doctoral Dissertations Available from Proquest. AAI3427541.
https://scholarworks.umass.edu/dissertations/AAI3427541
