Application of Two-Parameter Item Response Theory for Determining Form-Dependent Items on Exams Using Different Item Orders




Using multiple versions of an assessment has the potential to introduce item environment effects. These effects produce version-dependent item characteristics (i.e., difficulty and discrimination). Methods to detect such effects, and their resulting implications, are important at all levels of assessment where multiple forms of an assessment are created. This report describes a novel method for identifying items that do and do not display form dependence. The first two steps of the method identify form-dependent items through a differential item functioning (DIF) analysis of item parameters estimated with Item Response Theory. The method is illustrated with items that appeared on four forms (two trial and two released versions) of a first-semester general chemistry examination. Eighteen of fifty-six items were identified as having form-dependent item parameters. Thirteen of those items displayed form dependence consistent with reasons previously identified in the literature: preceding item difficulty, content priming, or a combination of the two. The remaining five items displayed form dependence that did not align with reasons reported in the literature. An additional analysis examined whether all predicted instances of form dependence were in fact observed. Several items were identified where form dependence would be expected, based on preceding item difficulty or content priming, yet those items did not display it. Thus, we identify and rationalize form dependence for thirteen of the eighteen flagged items; however, we are unable to predict which items will display form dependence.
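For readers unfamiliar with the two-parameter logistic (2PL) model underlying the analysis, the sketch below shows the standard 2PL response probability and the kind of cross-form parameter comparison a DIF analysis performs. This is a minimal illustration, not the paper's estimation procedure; the function name and the example parameter values are hypothetical.

```python
import math

def p_correct_2pl(theta, a, b):
    """Probability of a correct response under the 2PL IRT model.

    theta : examinee ability
    a     : item discrimination
    b     : item difficulty

    P(theta) = 1 / (1 + exp(-a * (theta - b)))
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A form-dependence check in the spirit of a DIF analysis compares the
# same item's estimated parameters across forms (values are hypothetical):
form_A = (1.2, 0.5)   # (discrimination, difficulty) estimated on form A
form_B = (1.2, -0.3)  # (discrimination, difficulty) estimated on form B

# Similar discrimination but a shifted difficulty would suggest the
# item became easier or harder depending on which form it appeared in.
shift_in_difficulty = form_A[1] - form_B[1]
```

By definition, an examinee whose ability equals the item difficulty has a 50% chance of answering correctly, which is a convenient sanity check on the formula.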