Five Issues about the Use of Learning Assessment Data in Education Planning: lessons from the Namibian Education System

Written on 08 Mar 21 by Dr Hertha Pomuti, Dr Charmaine Villet

There has been growing interest in introducing learning assessments in education systems around the world. Developing countries have designed and administered national and regional learning assessments with the technical support of development partners. Our recent study identified five issues that influence the use of learning assessment data in the education system in Namibia.

Namibia began conducting national and regional learning assessments soon after its independence in 1990. In 1995, Namibia became a founding member of the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SEACMEQ), a regional assessment carried out in Grade 6, and has participated in all four rounds of SEACMEQ administration (1995, 2000, 2007 and 2013). In 2009, a national assessment known as the National Standardised Achievement Test (NSAT), carried out in Grades 5 and 7, was implemented for the first time. The NSAT replaced a semi-external Grade 7 examination that had been used for monitoring education system performance. In 2012, an Early Grade Reading Assessment (EGRA) was conducted in Grades 2, 3 and 4. All three learning assessments focus on the primary education level. (You can find more information about the Namibian assessment system here.)

Namibia is one of the six sub-Saharan African countries that participated in the UNESCO International Institute for Educational Planning study that explored how learning assessment data were used in the education planning cycle and what influenced the use of learning data. Our study highlighted five issues that may influence the use of learning assessment data in education planning in Namibia.

1. A comprehensive national assessment policy is critical to enhance a common understanding of learning assessments

Namibia does not have a comprehensive national assessment policy that defines the purpose of learning assessments, the use of their data, the data dissemination strategy, and how the learning assessments link to other assessments. Because the use of learning assessment data is outlined across several different documents, it is difficult for users (e.g. education planners, inspectors of education, education officers, school principals and teachers) to reach a common understanding of the role of the learning assessments. Participants in our study sometimes found it difficult to distinguish between the purpose of national public examinations and that of learning assessments.

2. Linking learning assessment data to other data sets could provide rich information to the users

Data obtained from the NSAT were analysed without being linked to other data sets on student learning or to background characteristics. The NSAT data analysis is not linked to SEACMEQ data, examination data, or data contained in the Education Management Information System (EMIS). As a result, the analysis often lacked rigour and did not provide the rich information that could support effective planning and decision-making.

3. Learning assessment data should be linked to contextual information
The NSAT data analysis provides useful information on school-level performance and enables comparison of each school’s results with the mean score in the same region and at the national level. However, the analysis lacks linkages with variables that may influence student learning, such as socio-economic conditions, infrastructure, teachers, teacher training, teaching and learning materials, and sanitary conditions. Linking learning assessment data to such variables could inform planning decisions, such as allocating more financial, physical and human resources to certain groups or areas. A comprehensive analysis of related data could provide valuable information for the planning and decision-making process.
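To make the idea of linkage concrete, the sketch below joins school-level assessment results with contextual records from a management information system using a shared school identifier. All school IDs, field names and figures here are invented for illustration; they do not come from the NSAT or EMIS data sets themselves.

```python
# Illustrative sketch: linking hypothetical school-level assessment
# results to hypothetical contextual records via a shared school ID.
# All IDs, fields and figures are invented for illustration only.

nsat_scores = {
    "SCH-001": {"grade5_mean": 48.2, "grade7_mean": 51.7},
    "SCH-002": {"grade5_mean": 61.5, "grade7_mean": 58.9},
}

emis_records = {
    "SCH-001": {"region": "Region A", "learner_teacher_ratio": 38, "has_library": False},
    "SCH-002": {"region": "Region B", "learner_teacher_ratio": 27, "has_library": True},
}

def link_datasets(scores, context):
    """Merge the two data sets on school ID, keeping only schools present in both."""
    linked = {}
    for school_id, score in scores.items():
        if school_id in context:
            # Combine assessment results with contextual variables for this school.
            linked[school_id] = {**score, **context[school_id]}
    return linked

linked = link_datasets(nsat_scores, emis_records)
for school_id, record in linked.items():
    print(school_id, record["region"], record["grade5_mean"])
```

Once results and contextual variables sit in one record per school, planners can ask questions that neither data set answers alone, for example whether low mean scores cluster in schools with high learner-teacher ratios.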

4. Timely release of learning assessment data could make it more valuable to end-users
Our study established that assessments, especially SEACMEQ, were not administered regularly and that data analysis was delayed. When data are released late, users cannot access up-to-date results and, consequently, cannot use them for planning activities. Information from other assessments, such as public examinations and continuous assessment, becomes more valuable for planning because it is available frequently and rapidly.

5. Sufficient (financial and human) resources are needed to sustain learning assessments
The study established that it has become difficult for Namibia to sustain the implementation of the SEACMEQ, NSAT and EGRA assessments. The NSAT was not conducted in 2019 due to budgetary constraints, and its implementation suffers from financial difficulties affecting test administration, an insufficient number of test papers, and problems with the transportation of assessment materials. The EGRA was not conducted again after its initial implementation due to a lack of financial and human resources. SEACMEQ studies were financed by both national and external funding, and since the withdrawal of external funding the implementation of the SEACMEQ assessment has suffered: participants reported difficulties caused by an outdated software licence and insufficient funding for training staff, printing assessment items, and disseminating assessment results. Another issue that hinders the implementation of learning assessments in Namibia is a lack of human capacity. Although some capacity has been developed since the introduction of the learning assessments, the units responsible for administering them continue to rely on external consultants for complex technical tasks in data analysis.

To conclude, the decision to conduct learning assessments should be linked to how the resulting data will be used. It is clear from our study that the absence of a comprehensive regulatory assessment framework defining the purpose and modalities of the different learning assessments undermines a common understanding of how learning data should be used. In addition, the value of learning assessments would be better appreciated if learning data were relevant and comprehensive. In Namibia, each learning assessment was designed, developed and implemented in isolation from the others and from other data sets. For example, both the NSAT and SEACMEQ focus on primary education, but there is no linkage between the two assessments that would enable them to complement each other for better utilization of the learning assessment data. Linking the NSAT data analysis to EMIS data could also make NSAT data more relevant and comprehensive for national planning and decision-making.

You can find more information on the results of the study in Namibia here.