Author(s): Gustafsson, Martin
Organisation(s): Global Education Monitoring Report Team
Pages: 58 p.
International assessment results increasingly inform education policy debates, yet little is known about floor effects in these assessments. To what extent do they fail to differentiate among the most disadvantaged students, and what are the implications, for instance for the comparability of national statistics across space and time? Microdata from TIMSS, SACMEQ and LLECE are analysed to answer this question, with reference to primary schools. In TIMSS, floor effects have been greatly reduced through the introduction, in 2015, of TIMSS Numeracy, which includes a greater number of easier items than regular TIMSS. SACMEQ and LLECE, despite being designed specifically for developing countries, often display large floor effects. As a result, many students scoring zero, after adjustment for random guessing, are classified as having passed proficiency thresholds. Though these floor effects do not substantially alter country rankings, they are large enough to undermine proper monitoring of progress over time. They can also undermine public trust in the programmes, and they leave information gaps regarding the students who require the most support. Designers of assessment programmes need to limit floor effects through the inclusion of more easy multiple-choice items and more constructed-response items. The former solution is the easier to implement.
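The adjustment for random guessing mentioned above can be illustrated with the classical correction-for-guessing formula; this is a standard textbook adjustment, offered here as a sketch of the idea rather than the paper's exact procedure, and the function name is a hypothetical label.

```python
# Sketch of the classical correction for random guessing on multiple-choice
# items (a standard formula; the paper's own adjustment may differ).
# With k answer options, blind guessing yields 1 correct answer per (k - 1)
# wrong ones on average, so the corrected score subtracts that expected gain.

def guessing_adjusted_score(correct: int, wrong: int, options: int) -> float:
    """Return the raw score adjusted for random guessing.

    correct: number of items answered correctly
    wrong:   number of items answered incorrectly (omitted items excluded)
    options: number of answer choices per item (k)
    """
    return correct - wrong / (options - 1)

# A student who answers 10 of 40 four-option items correctly and the other
# 30 incorrectly has an adjusted score of 10 - 30/3 = 0: pure-chance
# performance, i.e. a score at the floor of the scale.
print(guessing_adjusted_score(10, 30, 4))
```

Under this correction, a whole group of students at or near pure-chance performance collapses onto the same zero score, which is precisely why a floor effect prevents the assessment from differentiating among the weakest students.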