Authors:
ARHIN ATO KWAMINA, Essuman Jonathan, Arhin Ekua
Abstract
Adhering to the rules governing the writing of multiple-choice test items will ensure quality and validity. However, realizing this ideal can be challenging for non-native English language teachers and students. This is especially so for non-native English language teachers, because developing test items in a language that neither they nor their students use as their mother tongue raises a multitude of issues related to quality and validity. A descriptive study of this problem, focusing on item-writing flaws in a communication skills test, was conducted at a Technical University in Ghana. The use of multiple-choice tests in Ghanaian universities has increased over the last decade due to increasing student intake. A 20-item multiple-choice test in communication skills was administered to 110 students. The test items were analyzed using a framework informed by standard item-writing principles, based on the revised taxonomy of multiple-choice item-writing guides by Haladyna, Downing and Rodriguez (2002). The facility index and discrimination index (DI) were calculated for all items. In total, 60% of the items were flawed according to standard item-writing principles. The most frequently violated guideline was wording stems negatively. Pearson correlation analysis indicated a weak relationship between the difficulty and discrimination indices. The discrimination indices of the flawed items showed that 84.6% of them fell below the optimal level of 0.40. The lowest DI was recorded by an item that was worded negatively. The mean facility of the test was 45%. It was observed that the flawed items were more difficult than the non-flawed items. The study suggested that test items must be properly reviewed before they are used to assess students' knowledge.
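For readers unfamiliar with the item-analysis metrics named in the abstract, the sketch below illustrates the standard classical-test-theory formulas: the facility (difficulty) index as the proportion of correct answers, the discrimination index as the difference in proportion correct between upper and lower scoring groups, and the Pearson correlation between the two. This is not the authors' analysis code; the 27% group split, the simulated response matrix, and all variable names are illustrative assumptions.

```python
# Minimal sketch of standard item-analysis formulas (facility index, discrimination
# index, Pearson correlation). Data and the 27% group split are assumptions.
import numpy as np
from scipy.stats import pearsonr

def item_statistics(responses: np.ndarray, group_fraction: float = 0.27):
    """responses: (n_students, n_items) matrix of 0/1 scored answers."""
    n_students, _ = responses.shape
    totals = responses.sum(axis=1)                      # total score per student
    n_group = max(1, int(round(group_fraction * n_students)))
    order = np.argsort(totals)
    lower, upper = order[:n_group], order[-n_group:]    # lowest and highest scorers

    facility = responses.mean(axis=0)                   # proportion answering each item correctly
    discrimination = (responses[upper].mean(axis=0)
                      - responses[lower].mean(axis=0))  # upper-group minus lower-group proportion
    return facility, discrimination

# Illustrative run: 110 students x 20 items of simulated 0/1 responses.
rng = np.random.default_rng(0)
scores = (rng.random((110, 20)) < 0.45).astype(int)
facility, di = item_statistics(scores)
r, p = pearsonr(facility, di)                           # relationship between difficulty and DI
print(f"mean facility = {facility.mean():.2f}, items with DI < 0.40: {(di < 0.40).sum()}")
print(f"Pearson r between facility and DI = {r:.2f} (p = {p:.3f})")
```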
Cited by
1 article.