Abstract
This study investigates the use of technology to promote authentic and meaningful learning through a peer assessment rubric for a public speaking assessment at a higher education institution in Brunei Darussalam. Three hundred and six undergraduates from Universiti Teknologi Brunei's Schools of Business, Computing, and the Engineering Faculty conducted the assessments in real time using online rubrics accessible via their smartphones or laptops. The lecturers' and students' marks were compared for each rubric criterion, and a questionnaire was distributed after the assessment to investigate students' perceptions of peer assessment. The results indicated a variable discrepancy between the lecturers' and students' assessments across the rubric criteria: in some disciplines, peers overmarked relative to the lecturer by more than 15%, while in others the marks were similar. Comparison between peer and lecturer assessment indicated that the level of agreement was sensitive to the lecturer, but less so across student cohorts assessed by the same lecturer. Where differences were observed, there was no apparent discrepancy in agreement between rubric criteria evaluating content and those evaluating delivery. Students' feedback revealed a positive response towards peer assessment but highlighted issues surrounding the technological aspects of the implementation process.
Publisher
Association of Language Teachers in Southeast Asia - ALTSA