
Author ORCID Identifier

https://orcid.org/0000-0001-5226-7653

AccessType

Open Access Dissertation

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Degree Program

Education

Year Degree Awarded

2022

Month Degree Awarded

September

First Advisor

Stephen G. Sireci

Subject Categories

Educational Assessment, Evaluation, and Research

Abstract

Provision of item choice and use of digital avatars in reading assessment contexts may play a critical role in fostering student agency, engagement, and equity in assessment. Using a mixed methods approach, this study was conducted over four phases within an equity framework: Phase I consisted of a survey; in Phase II, an iOS software application and a reading test were developed; Phase III was an experimental study; and in Phase IV, post-experiment interviews were conducted. The research questions (1) explored student interests and experiences related to assessments and technology use, (2) investigated the effect of item choice and avatar use on student performance on a reading assessment, (3) examined the effect of item choice and avatar use on student engagement with a reading assessment, (4) investigated the alignment between student reading preferences and item selections, and (5) explored student experiences with the software application and reading test. The initial sample included students in grades 6-7 from a public middle school in Arkansas (N = 298). The three factors included in the main analyses were item choice, avatar use, and test block order. With respect to test performance, results indicated that students performed better on the ‘no choice’ section of the test than on the ‘choice’ section. Results related to test engagement showed that students had higher response time effort (RTE) values on the ‘choice’ section of the test when that section was presented first; similarly, students had higher RTE values on the ‘no choice’ section when it was presented first. Furthermore, results indicated that most students’ reading test preferences aligned with their choices on the reading assessment; however, this alignment did not influence students’ test performance.
Finally, students reported mostly positive experiences with and perspectives on the software application, the use of avatars, and the provision of choice on the assessment; however, these were tempered by some negative perspectives regarding their expectations around the choice options and the limited functionality within the reading test. The limitations of this study and future directions for educational assessment are also discussed.

DOI

https://doi.org/10.7275/31052440

Creative Commons License

Creative Commons Attribution-NoDerivatives 4.0 International License
This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.
