Inter-rater agreement in assigning levels of difficulty to examination questions in Life Sciences


Edith R. Dempster
Nicki F. Kirby

Abstract

Public perception of “declining standards” often accompanies increases in pass rates in school-leaving examinations. To the public, “declining standards” means easier examination papers. The present study evaluates a South African attempt to assign levels of difficulty, as distinct from cognitive demand, to exit-level examination papers in Life Sciences. A team of four expert raters assigned each item a level of difficulty ranging from 1 (easy) to 4 (very difficult); invalid items were assigned a difficulty level of 0. The reference point was “the ideal average South African learner.” Discussion and practice were conducted on 12 examination papers, followed by individual analysis of four examination papers. Inter-rater agreement for the final four papers was low. Raters assigned most items to difficulty levels 1 and 2, suggesting that the unreliability may stem from the instrument having too many levels. Raters’ predictions of levels of difficulty were consistent with the actual mark distribution for private school candidates, but not for public school candidates. The “ideal average South African learner” is an unsuitable reference point in the unequal educational landscape of the public school system. We recommend that the instrument be modified by reducing the number of levels of difficulty and removing the hypothetical reference point.
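
The abstract does not state which agreement statistic the study used. As an illustrative sketch only, the Python code below computes Fleiss’ kappa, one common chance-corrected measure of agreement among more than two raters, on hypothetical ratings from four raters using the 0–4 difficulty scale described above; the example data and the choice of statistic are assumptions for illustration, not the authors’ method.

    import numpy as np

    def fleiss_kappa(ratings: np.ndarray) -> float:
        """Fleiss' kappa for an (items x categories) count matrix.

        ratings[i, j] = number of raters who placed item i in category j.
        Every row must sum to the same number of raters.
        """
        n_items, n_cats = ratings.shape
        n_raters = ratings[0].sum()

        # Proportion of all assignments falling in each category.
        p_j = ratings.sum(axis=0) / (n_items * n_raters)

        # Per-item agreement: fraction of rater pairs that agree.
        p_i = ((ratings ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))

        p_bar = p_i.mean()        # mean observed agreement
        p_e = (p_j ** 2).sum()    # agreement expected by chance
        return (p_bar - p_e) / (1 - p_e)

    # Hypothetical data: 6 items rated by 4 raters on difficulty levels 0-4,
    # with most ratings at levels 1 and 2, as the abstract describes.
    counts = np.array([
        [0, 3, 1, 0, 0],
        [0, 2, 2, 0, 0],
        [0, 1, 2, 1, 0],
        [1, 2, 1, 0, 0],
        [0, 2, 1, 1, 0],
        [0, 1, 3, 0, 0],
    ])
    print(f"Fleiss' kappa: {fleiss_kappa(counts):.2f}")

A low kappa on such data reflects the pattern reported in the abstract: when raters spread items across adjacent levels of a scale with many categories, chance-corrected agreement drops even though the raters broadly concur that most items are easy.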

Keywords: comparability; difficulty; examinations; inter-rater agreement; reliability; standards

