Author ORCID Identifier
N/A
Access Type
Open Access Dissertation
Document Type
Dissertation
Degree Name
Doctor of Education (EdD)
Degree Program
Education
Year Degree Awarded
2014
Month Degree Awarded
February
First Advisor
Lisa A. Keller
Second Advisor
Erin M. Conlon
Third Advisor
Craig S. Wells
Subject Categories
Education | Educational Methods | Other Education
Abstract
In adaptive testing, including multistage adaptive testing (MST), the psychometric properties of the test items are needed to route examinees through the test. However, if a testing program uses items that are automatically generated at the time of administration, there is no opportunity to calibrate those items, so their psychometric properties must be predicted. This simulation study evaluates the accuracy with which examinees' abilities can be estimated when automatically generated items, specifically item clones, are used in MSTs. The behavior of the clones in this study was modeled on the results of Sinharay and Johnson's (2008) investigation of item clones administered in an experimental section of the Graduate Record Examination (GRE). In the current study, ability estimates became less accurate as more clones were incorporated or as the clones varied more from their parent items. However, several conditions were promising: for example, on a 600-point scale, the absolute bias was less than 10 points for most examinees when all items were simulated as clones with small variation from their parent items, or when all first-stage items were simulated with moderate variation from their parents and no second-stage items were clones.
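The core mismatch the abstract describes can be illustrated in miniature: responses are generated from the clones' true (perturbed) parameters but scored with the parents' parameters, since uncalibrated clones are assumed to behave like their parents. The sketch below is a hypothetical illustration, not the study's actual simulation design; the 2PL model, the item parameters, the fixed test length, and the variation levels are all assumptions introduced here.

```python
import math
import random

random.seed(1)

def p_correct(theta, a, b):
    # 2PL item response function: P(correct | theta) for an item
    # with discrimination a and difficulty b.
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical parent item parameters (a, b), repeated to form a 40-item test.
parents = [(1.0, -1.0), (1.2, 0.0), (0.8, 0.5), (1.1, 1.0), (0.9, -0.5)] * 8

def make_clones(parents, sd_b):
    # Clones keep the parent's discrimination; their difficulty drifts
    # by a normal perturbation with standard deviation sd_b.
    return [(a, b + random.gauss(0.0, sd_b)) for a, b in parents]

def simulate_and_estimate(theta_true, parents, sd_b):
    clones = make_clones(parents, sd_b)
    # Responses come from the clones' true parameters ...
    resp = [int(random.random() < p_correct(theta_true, a, b))
            for a, b in clones]
    # ... but scoring uses the parents' parameters (the clones were
    # never calibrated), via a simple grid-search maximum likelihood.
    best_theta, best_ll = 0.0, -float("inf")
    for g in range(-60, 61):
        th = g / 20.0
        ll = sum(math.log(p_correct(th, a, b)) if u
                 else math.log(1.0 - p_correct(th, a, b))
                 for (a, b), u in zip(parents, resp))
        if ll > best_ll:
            best_theta, best_ll = th, ll
    return best_theta - theta_true  # signed error in the ability estimate

# Larger clone variation tends to degrade the ability estimates.
for sd in (0.05, 0.5):
    mean_abs = sum(abs(simulate_and_estimate(0.0, parents, sd))
                   for _ in range(50)) / 50
    print(f"sd_b={sd}: mean |error| = {mean_abs:.3f}")
```

Running the loop with a small and a moderate `sd_b` mirrors the abstract's pattern in spirit: estimation error is driven both by ordinary sampling noise and by the parent-clone parameter mismatch, and the mismatch contribution grows with the clone variation.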
DOI
https://doi.org/10.7275/5428533
Recommended Citation
Colvin, Kimberly F., "Effect of Automatic Item Generation on Ability Estimates in a Multistage Test" (2014). Doctoral Dissertations. 4.
https://doi.org/10.7275/5428533
https://scholarworks.umass.edu/dissertations_2/4