Open Access Dissertation
Doctor of Philosophy (PhD)
Craig S. Wells
Educational Assessment, Evaluation, and Research
Large-scale assessments (LSAs), such as the National Assessment of Educational Progress (NAEP), are low-stakes tests for examinees; consequently, examinees may guess randomly or provide no response at all. Such disengaged test-taking behavior can undermine the validity of test score interpretations. Various methods have been proposed over the years to account for this behavior, and they can be classified as ad hoc or model-based. For instance, the Programme for the International Assessment of Adult Competencies (PIAAC) applies a common time-threshold method (e.g., five seconds) to all items: if an examinee spends five seconds or more on an item, an omitted response is coded as incorrect; otherwise, it is coded as ignored. More recently, the speed-accuracy+omission (SA+O) model has been proposed for modeling the processes underlying response and nonresponse behavior. The present research investigates the impact of omitted responses on item and person parameter estimates under the ad hoc and model-based approaches in the context of LSAs. In a simulation study, we examine (a) how ad hoc and model-based approaches for handling omitted responses compare in terms of item and person parameter estimation in IRT and (b) whether there is a practical difference between ad hoc and model-based approaches to handling omitted responses in real data analyses. Finally, we illustrate the practical implications of selecting a particular approach for handling omitted items in LSAs through an empirical analysis.
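The PIAAC-style time-threshold rule described in the abstract can be sketched as follows. This is a minimal illustration, not PIAAC's actual implementation: the function name, data layout, and coding labels are assumptions; only the five-second common threshold and the incorrect/ignored distinction come from the text above.

```python
# Sketch of a common time-threshold rule for coding omitted responses.
# A single threshold (here, 5 seconds) is applied to every item.

THRESHOLD_SECONDS = 5.0  # common threshold for all items (from the abstract)

def code_omitted_response(response_time: float) -> str:
    """Classify an omitted item response by the time spent on the item.

    An examinee who spent at least the threshold on the item is assumed
    to have seen and skipped it, so the omission is scored incorrect;
    a shorter time is treated as non-engagement and ignored in scoring.
    """
    if response_time >= THRESHOLD_SECONDS:
        return "incorrect"
    return "ignored"

# Example: two omitted items with different response times
print(code_omitted_response(7.2))  # examinee spent 7.2 s on the item
print(code_omitted_response(1.4))  # examinee spent 1.4 s on the item
```

Note that the rule is deliberately coarse: it uses one threshold for every item, which is what distinguishes this ad hoc approach from model-based alternatives such as the SA+O model.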
Hong, Seong Eun, "Evaluating Approaches for Dealing with Omitted Items in Large-Scale Assessments" (2021). Doctoral Dissertations. 2188.