
Author ORCID Identifier

N/A

Access Type

Open Access Dissertation

Document Type

Dissertation

Degree Name

Doctor of Education (EdD)

Degree Program

Education

Year Degree Awarded

2015

Month Degree Awarded

September

First Advisor

Ronald K. Hambleton

Second Advisor

Jennifer T. Randall

Subject Categories

Educational Assessment, Evaluation, and Research | Educational Methods

Abstract

The Common Core State Standards in English Language Arts and Mathematics for grades K-12 were introduced in 2009 and were at one time adopted by 45 U.S. states. The new standards have effectively created national curricula in these two subject areas, and new assessment systems have been developed alongside them. Many of the new tests show signs of being more multidimensional than the tests they replaced because of new item formats and the assessment of higher-level thinking skills and various performance skills. In the short term at least, the new testing programs will continue to use unidimensional IRT models, because multidimensional IRT models are not well developed for wide-scale operational use. Test equating and proficiency estimates are therefore likely to be affected by violations of the unidimensionality assumption in the test data, which poses a threat to test fairness and to the validity of score interpretations.
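The tension described here is between unidimensional IRT models, in which one ability parameter drives item responses, and test data generated by more than one dimension. As a minimal illustration (not the dissertation's actual models or parameter values), the sketch below contrasts a unidimensional 2PL item response function with a two-dimensional compensatory 2PL, where the response probability depends on a weighted combination of two abilities:

```python
import numpy as np

def unidimensional_2pl(theta, a, b):
    """Unidimensional 2PL: P(correct) for ability theta,
    discrimination a, and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def compensatory_2pl(theta, a, d):
    """Two-dimensional compensatory 2PL: theta and a are
    length-2 vectors of abilities and loadings; d is an intercept.
    High ability on one dimension can compensate for low ability
    on the other, since only the weighted sum enters the logit."""
    return 1.0 / (1.0 + np.exp(-(np.dot(a, theta) + d)))

# Hypothetical item loading mainly on dimension 1,
# with a small loading on dimension 2:
p = compensatory_2pl(theta=np.array([0.5, -0.2]),
                     a=np.array([1.2, 0.4]),
                     d=0.0)
```

When a test mixes items like this one with items loading on a second dimension, a single theta cannot fully reproduce the response probabilities, which is the source of the equating and scoring problems the studies investigate.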

Examining the consequences of applying unidimensional models to multidimensional data is therefore an important problem. The first study examined the potential effect of multidimensionality on item parameter invariance and on IRT model fit at the item level. The findings indicated that very modest construct shift in the operational items of a test was not problematic.

As an extension of the first study, the second study introduced multidimensionality of a much greater magnitude and evaluated its impact on IRT equating and proficiency estimates. When construct shift occurred in the operational items, the impact was minimal in most respects. In contrast, when construct shift occurred in the anchor items and these equating items were included in the total test, the influence of construct shift was quite substantial.

The present research has provided evidence of the consequences of construct shift as a function of the correlations between pairs of dimensions, the amount of shift, and where the shift occurs (equating or operational items). In general, the impact of construct shift increased as the amount of shift increased and as the correlations between dimensions decreased.

DOI

https://doi.org/10.7275/7510639.0
