Measurements of student understanding on complex scientific reasoning problems

Alisa Sau-Lin Izumi, University of Massachusetts Amherst

Abstract

While there has been much discussion of the cognitive processes underlying effective science teaching, less is known about how response format affects assessments of scientific reasoning in biology. This study used multiple-choice (m-c) and short-answer essay responses to evaluate students' progress in higher-order reasoning skills. In a pilot investigation of student responses on a non-content-based test of scientific thinking, some students showed a pre-post gain on the m-c version of the test while showing no gain on a short-answer essay version of the same questions. This result motivated a subsequent study of the differences between alternate formats of scientific reasoning tests. Using m-c and written responses from biology tests targeting two skills, (1) reasoning with a model and (2) designing controlled experiments, test score frequencies, factor analyses, and regression models were examined to explore test format differences. Understanding these format differences is important for developing practical ways to identify student gains in scientific reasoning. The overall results suggested that test format affects measured performance. Factor analysis revealed three interpretable factors: m-c format, genetics content, and model-based reasoning. Frequency distributions on the m-c and open-explanation portions of the hybrid items showed that many students answered the m-c portion of an item correctly but gave inadequate explanations, while in other instances students answered the m-c portion incorrectly yet gave adequate explanations. When the test scores were used as predictors of external student measures (VSAT, MSAT, high school grade point average, and final course grade), they accounted for close to zero percent of the variance. Overall, these results point to the importance of using multiple methods of testing and of further research and development in the assessment of scientific reasoning.

Subject Area

Educational evaluation; Science education

Recommended Citation

Izumi, Alisa Sau-Lin, "Measurements of student understanding on complex scientific reasoning problems" (2004). Doctoral Dissertations Available from Proquest. AAI3118308.
https://scholarworks.umass.edu/dissertations/AAI3118308
