Co-evaluation of Expository Texts in Primary Education: Rubric vs Comments

DOI

https://doi.org/10.7275/9hgz-sz82

Abstract

This study compares the effects of two resources, a paper rubric (CR) and the comment bubbles of a word processor (CCB), on supporting peer co-evaluation of expository texts in primary education. A total of 57 students wrote a text that, after a peer co-evaluation process, was rewritten. To analyze the improvements in the texts, we used a rubric similar to the one in the first condition. The messages and suggestions for improvement were quantified and classified according to their scope, evaluative content, and rhetorical content. Lastly, we analyzed how these suggestions were incorporated into the final version of the expository text. The results showed that the evaluative comments focused mainly on pointing out, rating, or simply correcting errors; they hardly ever justified those corrections, and they offered no questions or improvement alternatives for other shortcomings or non-error content. The students who co-evaluated each other with a rubric wrote more comments and addressed the different rhetorical components in a balanced way, although these comments were phrased generically. This may be why many of them were not incorporated into the second version of the texts, in which a significant improvement was observed only in the conclusion section. In contrast, the comment bubbles recorded much more specific suggestions for correction. Although the second version of those texts showed a slightly higher percentage of modifications, this was not enough to indicate a significant improvement in quality over the first version.
