5. Building a system-level monitoring framework that can advance national education goals

Bulgaria has made several policy changes in recent years to raise the quality of its education system. The government has, for example, introduced a new competency-based curriculum, established a National Inspectorate of Education and developed a new school financing model (see Chapter 1). While some of the basic components needed to monitor and evaluate the impact of these changes already exist, many of Bulgaria’s national tools and processes for system evaluation remain nascent and communication among different actors is not adequate to build trust in the reforms. This chapter suggests several policy measures that Bulgaria can take to strengthen system evaluation. Specifically, the chapter will address the country’s new education management information system (EMIS) and the National External Assessment (NEA) of student learning, which serve as the two main sources of information about the education sector. These tools need to provide actors with timely and trustworthy data so that policy makers at both the central and local levels can take informed decisions. School-level actors and the public also need more evidence about the education system to develop a better understanding of where and why students are falling behind in their learning and what actions can help achieve national education goals. Such processes are especially important in Bulgaria, where broad decentralisation reforms have led to mixed results (OECD, 2021[1]).

System evaluation frameworks generate and use information to develop education policies and hold the government and other stakeholders responsible for achieving stated objectives (OECD, 2013[2]). Bulgaria already has many components of a system evaluation framework (Table 5.1). For example, several government agencies collect data and conduct research on issues relevant to their work, and the country regularly participates in international assessments and surveys that provide credible metrics for monitoring performance. Despite these positive features, there are several weaknesses in Bulgaria’s national tools and processes for system evaluation. The NEA, for example, has been measuring learning outcomes since 2007 but the scoring system does not generate trend data that are comparable across years. Moreover, current data collection processes are not co-ordinated across state administrative bodies and there is no regular analysis and reporting on the performance of the education system as a whole. While it is positive that the Bulgarian government is already addressing some of these issues, namely by developing an integrated EMIS, further strengthening national tools and processes for evaluation is crucial. Without such efforts, it will be difficult for Bulgaria to fully understand and address persistent educational challenges, such as high dropout rates and poor learning outcomes. Developing stronger national tools and using evidence to evaluate and shape education policies will enable Bulgaria to better monitor the impact of reforms, improve transparency in decision-making processes and help communicate progress towards achieving national education goals.

Policy goals provide a reference point against which actors can assess performance and are thus an important tool for accountability and driving system improvement. Bulgaria has a set of clear, long-term goals for the education sector, which are contained within the government’s Bulgaria 2030 development strategy and the sector-specific Strategic Framework for Education. Both documents articulate education goals from pre-primary to the upper secondary sector, as well as lifelong learning (Table 5.2). By defining long-term goals in this way, the Bulgarian government helps reinforce policy continuity, a considerable achievement since evidence suggests the country struggles with a high turnover of civil service personnel, in both senior and technical-level positions (EBRD, 2019[3]). To focus education reform efforts, countries often associate goals with specific, time-bound targets that enable the public, opposition parties and future administrations to get a sense of how policies have performed and where adjustments may be needed. Bulgaria publishes its national education goals alongside quantitative indicators, which help to steer different policy actors. However, some indicators do not measure progress in a way that can meaningfully support system evaluation and detailed action plans have not been developed to support the implementation of long-term education strategies.

Bulgaria has developed two high-level policy documents that set goals for its education sector. The first is Bulgaria 2030, which identifies the government’s general priorities over a ten-year period, including in the area of education. Positively, this strategy was formulated through an assessment of the country’s socio-economic development since EU accession and key challenges that remain in light of national development goals and international commitments. It was steered by Bulgaria’s Council of Ministers and formulated by the Ministry of Finance in consultation with national and international development partners. The second policy document, the Strategic Framework for the Development of Education, Training and Learning in the Republic of Bulgaria (2021-2030) (hereinafter, the Strategic Framework for Education), is more sector-specific and implementation-oriented. The Ministry of Education and Science (hereinafter, the Ministry) formulated this latter document, which covers the same time horizon as Bulgaria 2030 but establishes a more detailed set of policy goals and interventions required to achieve them. In particular, the framework expands the five education goals of Bulgaria 2030 into seven priority areas (Table 5.2) that were identified through a strengths, weaknesses, opportunities and threats (SWOT) analysis, a vision of schooling in Bulgaria by 2030 and consultation with education-specific stakeholders. Importantly, the Strategic Framework for Education takes into account other relevant government strategies and regulations affecting the education sector, as well as eligible financing facilities, which stand to support its implementation and in turn the implementation of Bulgaria 2030.

Most OECD countries associate their education goals with a set of outcome indicators. This practice provides a metric to guide and assess progress towards longer-term goals. Both the Bulgaria 2030 strategy and the Ministry’s Strategic Framework for Education contain a set of indicators with specific targets that enable the public to hold the government and other implementers to account. However, some indicators do not support performance measurement sufficiently. The indicator for measuring inclusive education, for instance, tracks overall school participation across Bulgaria at different key stages of schooling but does not measure participation among the specific vulnerable groups that figure disproportionately in the country’s out-of-school population. While Bulgaria does not compile data based on ethnicity, it does compile data on vulnerable students based on parents’ income level and other factors, which have been used to provide additional support to certain schools (see Chapter 4) and regions, for instance through the country’s new funding model (see Chapter 1) and the large-scale School for Success project. Without more detailed and disaggregated indicators that track education outcomes for vulnerable groups, Bulgaria may struggle to measure the inclusiveness of its education system as well as other national education goals. A number of OECD countries, such as Australia and the United States (US), compile data disaggregated by student background to reveal and address disparities present in their education systems.

The Ministry plans to prepare two mid-term implementation plans linked to its Strategic Framework for Education. While Bulgaria has begun to pursue activities linked to policy priorities set out under the framework, it does not yet have the first of these two implementation plans in place and this may slow and complicate the implementation of planned interventions. Deliberately planning for the implementation of policies can help keep actors on course, channel restricted resources to where they are needed most and reinforce accountability. In the wake of the COVID-19 crisis, implementation has become even more challenging and therefore mid-term action plans will need to have more flexibility built in, to allow for policy adjustments when needed. One way to achieve this is by establishing intermediate targets to ascertain whether policies are on track and whether changes are needed to achieve stated goals (for instance, by associating each major planned activity with a set of output indicators).

Most OECD countries compile education statistics using a variety of instruments, such as national assessments and school questionnaires. Importantly, these tools and processes should complement, not duplicate, each other to increase the efficiency and accuracy of data collection. Bulgaria has established some of the institutions and processes required to gather information and monitor the performance of the education system. Its NEA provides data on learning outcomes and statistical bodies compile most of the key education statistics collected through the United Nations Educational, Scientific and Cultural Organization (UNESCO) Institute for Statistics (UIS)/OECD/Eurostat joint data collection programme. However, until recently, the Ministry’s education database had not been harmonised with other parts of the state administrative system nor with international reporting requirements. Moreover, the NEA does not generate data that are comparable across cycles. Bulgaria needs to continue to upgrade and co-ordinate its evaluation tools to support system accountability and improvement.

The Ministry collects comprehensive data for administrative purposes. Schools record their data (on attendance, grades, etc.) in either paper or electronic format and upload them once a month to the regional education department’s (RED) electronic ledger. This information is then stored in the Ministry EMIS. The time lag between when schools collect data and when authorities aggregate them at the system level has meant that users cannot maintain up-to-date records on highly dynamic phenomena such as school dropout.

Another major challenge facing the collection of education data in Bulgaria has been the lack of co-ordination within the sector and across state agencies. In many OECD countries, national statistical agencies use data collected by their education ministries to fulfil international reporting requirements. This practice helps lower costs, reduces the reporting burden on schools and tends to improve overall data coverage and availability. In Bulgaria, however, the Ministry has used its own data protocols, statistical definitions and methodologies, which are not fully compatible with EU reporting requirements. Compatibility issues exist, for instance, around data on vocational education and training (VET) colleges, which the National Agency for Vocational Education and Training produces, and in the definition of International Standard Classification of Education (ISCED) levels and education personnel. As a result, the country’s National Statistical Institute (NSI) does not currently use data from the Ministry EMIS but compiles its own data directly from education institutions through an annual statistical survey.

Finally, the Ministry’s existing EMIS does not connect to data collected by other education agencies, such as the Center for Assessment, which stores results from the NEA and state matriculation examination. This arrangement limits the type of analysis researchers and policy makers can do to understand system performance because they have to connect different databases manually. To address some of these challenges, the Ministry is creating a new integrated EMIS, which represents a valuable opportunity to improve the efficiency of data collection and management in Bulgaria’s education system.

Since 2014, the Ministry has been working to establish a new EMIS with the aim of improving the collection and use of administrative data. Several modules for the new EMIS have already been developed and the entire system is expected to be operational during the 2021/22 school year. The new system has many positive features, including that new student and teacher data will be verified against the National Population Database and then submitted to the Ministry, which will create an anonymised personal profile for the student or teacher linked to the country’s unique citizenship number. This will enable the Ministry to track progress through the teaching and learning process and enable researchers to track outcomes against a range of background characteristics. Privacy protocols have been established, in line with the General Data Protection Regulation (GDPR). In addition, the EMIS will include a school-level module so that school actors can directly input their data into the central database, reducing the current time lag between the collection and aggregation of information. These changes will enable policy makers to directly access information across interoperable databases and reduce the data entry burden on schools. The new EMIS has been established as a single service platform, wherein teachers and students can access a range of tools, such as spaces for online learning and textbooks.

The new system is a considerable achievement and should significantly improve the quality of information available on the education system, as well as create opportunities for more complex analysis. However, there may be areas for further development. For instance, while the new EMIS was developed through strong and inclusive consultation within the Ministry (underpinned by a 40-person working group comprising staff from different directorates), there seems to have been limited consultation with other education agencies and the NSI. Including these actors in the development and implementation of the new EMIS can help reduce the risk that they will continue to collect their own data in parallel to the central process once the Ministry has implemented its new EMIS. Moreover, while students, teachers, schools and Ministry staff can access certain datasets within the system through a unique identifier, the Ministry may consider developing a public interface, where discrete datasets can be accessed by any interested party. It may also consider consulting with a wider range of stakeholders – for instance, other (relevant) administrative data producers and NGOs – to ensure that EMIS data are comprehensive and can be used for a variety of purposes.

Bulgaria participates in several international assessments and surveys that provide reliable and comparable measures of how the education system performs in comparison to other economies and over time. Bulgaria has participated in every round of both PIRLS and PISA since they first began in the early 2000s, except one round of PISA (in 2003), and in every round of the Teaching and Learning International Survey (TALIS) since its inception in 2008. In addition, Bulgaria has regularly participated in TIMSS since its inception. Initially, the country participated in the study at Grade 8 level (in 1995, 1999, 2003 and 2007) but, since 2015, has participated in the assessment at Grade 4 level. This decision was made since the study of science is not a core part of the curriculum at Grade 8.

The Center for Assessment manages Bulgaria’s participation in international assessments and surveys, and conducts analyses with these data to understand how the education system is performing over time. Positively, Bulgaria also uses data from international studies to set national benchmarks, such as the share of Grade 4 students scoring below intermediate benchmarks in mathematics (using TIMSS data). This practice provides a clear measure of learning outcomes that can help monitor system performance. At the same time, studies like PISA and TIMSS cannot help monitor implementation of the national curriculum.

National assessments can serve as an important tool for collecting reliable, recurrent data on learning outcomes with the goal of monitoring education systems, informing policy and supporting strategic planning. Results from such assessments can also have formative functions because they can serve as external references to strengthen teachers’ classroom-based assessments. Since 2007, Bulgaria has conducted annual, census-based NEAs at three grades of schooling. NEAs are run at Grades 4, 7 and 10, which, respectively, mark the completion of primary, lower secondary and the first stage of upper secondary education. The latter is the end of compulsory education in Bulgaria. The NEAs’ stated objectives (Table 5.3) are broadly positive and reflect two of the main purposes of national assessment systems in OECD countries: to monitor student learning to inform policy- and system-level interventions (i.e. a monitoring function) and to generate information that can help improve student learning at the school and student levels (i.e. a formative function). As a result, the NEA generates both system- and student-level data, which is an ambitious and positive policy decision with the potential to help address many of Bulgaria’s educational challenges, including concerns that students are not meeting national learning standards.

However, the design and implementation of Bulgaria’s NEA system as well as the country’s broader assessment culture (see Chapter 2) currently hinder the instrument’s ability to serve its monitoring and formative purposes effectively. This is mainly because, in practice, the NEA fulfils a different purpose: to help select students into schools. Specifically, the Grade 7 NEA, which takes place when most students leave their basic school (covering Grades 1-7), is used to identify which students will attend elite secondary schools that specialise in mathematics or foreign languages, among other areas. As a result, the Grade 7 NEA currently acts more as an examination. This is a unique feature of Bulgaria’s assessment system, as the vast majority of OECD countries do not use a national assessment for examination purposes but rather have separate instruments for system monitoring and selecting students. While NEA results also have a selective function in Grades 4 and 10, this is to a much lesser extent since only a minority of students change schools after these grades.

Another key issue with the NEA is that the system lacks some of the basic features that typically allow national assessments to measure learning outcomes against national standards. This partly involves Bulgaria’s lack of basic psychometric resources for the NEA, such as proficiency scales, processes for calibrating items and criterion-referenced scoring. Moreover, Bulgaria does not take steps to offset the potential risks associated with having a census-based assessment, which can easily accrue high stakes by forming judgements about individual students, teachers and schools. For example, administering the NEA at the end of the school year and of curriculum cycles, including scores on student certificates of completion, and not investing in formative measures (such as interpreting and communicating results for different audiences) all convey the message that the NEA’s dominant purpose is purely summative. These factors reinforce a strong traditional focus on competition and performance in assessments, rather than using assessment as a formative tool to support teaching and learning.

To make system evaluation a meaningful exercise, countries need to report on the performance of their education systems and use this information to support planning and accountability. Across the OECD, many countries produce annual reports on the state of their education systems and make these publicly available. A growing number are also starting to publish evaluations of major policies and programmes. These policy evaluations typically take place shortly after implementation or in the form of ex ante reviews to support future decision making (OECD, 2018[5]). In Bulgaria, regular reporting on the performance of the education system is limited. The country’s capacity to compile these reports has been hampered by issues related to the collection and management of data across the sector (e.g. lack of trend data from the NEA). Positively, the Ministry reports on thematic issues and disseminates core administrative data. However, much of the analysis and research conducted by the Ministry is not accessible to the public or even to other sectoral agencies.

Bulgaria compiles and publishes data on the education system through the NSI’s annual statistical yearbook. This yearbook provides descriptive information on the number of schools, students and teachers, at different levels of schooling, and disaggregates these figures by various characteristics (including gender, geographical location and legal status of the school). It also provides information on school attendance and other data to give education stakeholders a sense of system performance against key indicators such as teacher-student ratios, dropout rates and regional differences in the density of students and school resourcing. However, Bulgaria does not produce regular reports that explain how the education system is performing against national goals. Data that could inform this reporting, such as analysis from NEA results and school inspection findings, are not systematically collected and reviewed in one place but, instead, held by different technical education agencies (such as the Center for Assessment and the National Inspectorate of Education (hereinafter the Inspectorate)). As a result, it is difficult for stakeholders to get a sense of how the education system is performing as a whole.

The Ministry as well as other sectoral bodies conduct evaluations and ad hoc research on topics relevant to their areas of work. The Ministry’s Strategic Policies Development Division, which manages most of the Ministry’s research work, does not have a specific, fixed budget line for research but is allocated funds for specific projects. In recent years, the Ministry has prepared reports on the drivers of early school leaving, Roma participation in schooling and other issues around participation and inclusive education. It has also recently produced research on how the COVID-19 pandemic has affected learning processes and student performance in Bulgaria (Ministry of Education and Science, 2021[6]). This signals a commitment to improving policy through evidence and many of the thematic areas reflect Bulgaria’s commitment to improving educational equity. However, education research has been hampered by data limitations – specifically, a lack of reliable and timely student-level data. For instance, the Ministry cannot thoroughly evaluate its performance in addressing early school dropouts or regularly report on this issue to the public, since it cannot access data on early school leaving that are adjusted for overall population shifts (for instance through outward migration). Thorough policy evaluation and reporting will likely become more important in the wake of the COVID-19 pandemic, as administrators seek to rapidly offset learning losses and make difficult decisions about the allocation of limited funding.

System evaluation requires public sector resources and technical skills to collect and manage reliable, quality datasets and to exploit education information for evaluation and policy making. Many OECD countries have established evaluation institutions that sit outside or at arm’s length from education ministries, which can contribute to independent system evaluation. These institutions may produce evaluations of major policy programmes or annual reports on the education system, among other tasks. They typically receive public funding to ensure they have sufficient capacity but their statutes and operating rules ensure the independence and integrity of their work. While Bulgaria has technical education agencies that conduct evaluation activities, there is no dedicated body responsible for research and evaluation across the entire education system. Providing independent and periodic system evaluation is particularly important for Bulgaria, where political priorities risk interfering with research activities and public funding can be volatile.

Many different actors carry out system-level analysis in Bulgaria but these activities take place intermittently and are often under-resourced. There is no dedicated public body tasked with conducting periodic research and evaluation on system performance; however, the Strategic Policies Development Division within the Ministry typically co-ordinates system-level research. This division is responsible for developing education strategies, background papers and other policy inputs and frameworks, based on instruction from the Minister. In February 2020, the Strategic Policies Development Division merged with the Teacher Training and Qualification Division. While this merger highlights the critical role that teachers play in Bulgaria’s development plans for the sector, it may result in an overly narrow focus on a single system-level priority, at the expense of others. The Strategic Policies Development Division is also small, with only eight employees, and relies heavily on contracting external experts and researchers. While many countries rely on external support to conduct system evaluation activities, the Ministry’s capacity constraints limit its ability to evaluate major policies and programmes systematically and monitor how large-scale reforms are affecting students, teachers and schools.

A number of specialised government bodies in the field of education process their service data to inform operational planning. The National Center for Professional Development of Pedagogical Specialists, for instance, compiles an annual report to summarise the courses it delivered, the topics covered and the satisfaction of participants. The Inspectorate has also signalled plans to compile a summary report once it completes a full cycle of school evaluations. However, these strands of research activities are not compiled regularly for system-wide analysis.

Bulgaria has established many of the building blocks needed for a robust monitoring and evaluation system that can help inform education policy. The country has set clear goals and standards to guide the development of its education sector over the long term and regularly compiles administrative data and information on student learning outcomes through national and international assessments. Bulgaria also consults with stakeholders in formulating education policy and conducts research on systemic issues. In these respects, Bulgaria’s system evaluation practices are similar to those of OECD and other European Union (EU) member states. However, the country still faces major educational challenges, including high dropout rates, a large share of students who do not achieve baseline levels of proficiency in reading and mathematics, and significant disparities in outcomes based on student background and geographic location.

To address these challenges, Bulgaria introduced several reform goals through the Pre-school and School Education Act (2016), which sets out new approaches to teaching and learning and emphasises the importance of inclusion. However, there are major issues with available evidence to review performance at different levels of the system. Primarily, the NEA cannot support trend analysis, meaning that Bulgaria does not have a national instrument to monitor learning outcomes over time. Moreover, while there are ongoing efforts to streamline data collection and management procedures across the education sector, there are currently parallel processes for collecting data, which reduces efficiency and limits the capacity to carry out quality checks. This context also limits the amount of information that is disseminated in user-friendly ways. Without more reliable and accessible education data and clearer lines of accountability, it will likely remain difficult for system-level actors in Bulgaria to direct policy and educational resources as well as monitor progress towards achieving national education goals.

Bulgaria compiles most of the key education statistics collected internationally. The NSI also prepares an annual statistical yearbook on Education in the Republic of Bulgaria, which uses administrative data to provide periodic snapshots of the sector’s key features. Historically, there have been issues with the availability and collection of education data but the Ministry is upgrading its EMIS, which will introduce important developments. For example, the new system will use unique student identification numbers that link to Bulgaria’s unique citizenship number. This feature should provide the Ministry with new data on school participation and education outcomes, including by different demographic characteristics.

To optimise its investment in the new EMIS, Bulgaria should ensure that it adopts the approach of most OECD countries, which is to view an EMIS as “a system of people, technology, models, methods, processes, procedures, rules and regulations” (UNESCO, 2008[7]), rather than as a technology solution exclusively. In particular, the Ministry should continue to review the practices and standards it uses to compile and share education data, in partnership with other bodies that could use its data or provide new data to its system – namely the NSI. This will be important to ensure that new data are secure, accurate and can be used for a variety of purposes, making the new EMIS an accessible and insightful tool for policy makers and the public at large.

Bulgaria’s new EMIS represents an important opportunity to modernise and integrate the collection, management and use of education data. Nevertheless, planning gaps remain in terms of the protocols for defining and collecting data and verifying their quality. These are new concerns for Bulgaria, as the former EMIS required principals to check data during the process of manually aggregating information into the central database and there were no links to other state databases through civil identity numbers or detailed student files. While these changes will make school reporting more efficient through digitisation and allow for complex analysis, the Ministry will need to prepare for the implementation of this new tool and ensure that it will serve as the official go-to source of information for all education stakeholders. This will require working with other state agencies to ensure the new EMIS uses data definitions that align with national and international reporting standards. Bulgaria will also need to redefine staff roles and provide adequate support to manage the new EMIS. Successfully implementing this tool is key to providing the quantitative information needed to improve system evaluation in Bulgaria.

The launch of Bulgaria’s new EMIS represents an important development that will integrate various databases and make the process of data collection more efficient. The Ministry can maximise the benefits of this new system by establishing common data definitions and collection methods so that new users have a shared understanding of what information to report. These data definitions and protocols should align with national and international reporting standards. Without such efforts, Bulgaria risks different public agencies continuing to collect education data for their own purposes. This is currently the case with the NSI, which collects some of its own data from schools because the Ministry’s definitions of ISCED levels and education personnel do not align with international definitions.

To establish the new EMIS as Bulgaria’s central source of official education data, the Ministry should map the data requests that schools currently receive, as well as their mandatory reporting requirements, to identify and eliminate any redundancies. Some indicator mapping has already been done as part of the development of the new EMIS; however, the Ministry should involve other public agencies in this process to produce a comprehensive set of data definitions and rules about who can request information from schools. Such procedures help restrict outside access to school information, funnel data retrieval to the education database and reduce the reporting burden on schools by limiting outside data collection to information that is not available in the EMIS (e.g. interviews with teachers).

As the Ministry implements its new EMIS, it should continue planned efforts to increase the efficiency of school reporting. At present, schools and classroom teachers collect data in a paper-based or electronic format, depending on the school. Regardless of the collection method, principals must check their school’s data before administrators can upload it to the Ministry’s central database, which adds to the reporting burden. Moreover, entering data manually can result in missing or incomplete data, even with the principal’s review. This may contribute to inconsistencies around rates of enrolment and grade repetition. For instance, surveys suggest that a significant share of students who drop out of Bulgarian schools have emigrated abroad, but this trend is not reflected in the Ministry’s database. As a result, Bulgaria’s dropout rate may reflect not only students who have left schooling altogether but also those who are continuing their education elsewhere. To address these issues and generate more timely education data, the Ministry should scale up the digitisation of school reporting by ensuring that all schools are able to upload data directly to the appropriate modules of the new EMIS and receive training on how to use this tool. Similar opportunities should be provided for relevant RED staff so they can support schools in using the new EMIS. As the system matures, ICT literacy should be included in job descriptions for teachers, school principals and relevant RED staff. In addition, the Ministry should consider equipping the new EMIS with a feedback form that would help users raise queries about calculation or definitional standards, or flag any technical issues.

The Ministry should ensure that the new EMIS produces high-quality, policy-relevant data. While establishing common definitions and progressively digitising school reporting should make data collection in Bulgaria more efficient, it may also increase the risk of data errors as more actors upload their data directly. Errors can distort insights on important issues, such as the extent of, and reasons for, student dropout. Accurate data are essential to ensuring that policy makers correctly understand what is happening in the education system and to providing accountability information.

Quality assurance systems are particularly useful in countries like Bulgaria, where sharp capacity disparities exist at the school and local government levels (UNESCO, 2020[8]). Many OECD countries apply strict data validation and auditing procedures to systematically check data and flag inconsistencies. Quality assurance measures are typically built directly into the EMIS and/or countries conduct regular quality checks, such as visiting a sample of schools to confirm that the data collected align with school records. In Bulgaria, a central body – for instance the National Audit Office – could take responsibility for conducting quality checks on Ministry EMIS data to ensure that they align with standards used by the EU and other international partners. Moreover, if the new Inspectorate finds disparities between the data reported in the EMIS and the information it encounters when evaluating a school, it should report these disparities to Ministry EMIS staff.
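To illustrate, automated validation rules of the kind that can be built directly into an EMIS might be sketched as follows. This is a minimal, hypothetical example: the record structure, field names and thresholds are invented for illustration and do not reflect the Ministry's actual schema.

```python
# Illustrative sketch of automated EMIS quality checks on a school's
# annual data return (hypothetical record structure and rule thresholds).

def validate_school_record(record):
    """Return a list of human-readable flags for one school's annual return."""
    flags = []
    # Rule 1: enrolment components must sum to the reported total.
    if sum(record["enrolled_by_grade"].values()) != record["enrolled_total"]:
        flags.append("enrolment by grade does not sum to reported total")
    # Rule 2: dropout counts cannot exceed enrolment.
    if record["dropouts"] > record["enrolled_total"]:
        flags.append("more dropouts than enrolled students")
    # Rule 3: an implausible year-on-year change (>30%) triggers manual review.
    prev = record.get("enrolled_total_prev_year")
    if prev and abs(record["enrolled_total"] - prev) / prev > 0.30:
        flags.append("enrolment changed by more than 30% in one year")
    return flags

record = {
    "enrolled_by_grade": {"1": 40, "2": 38, "3": 35},
    "enrolled_total": 120,          # inconsistent: components sum to 113
    "dropouts": 5,
    "enrolled_total_prev_year": 115,
}
print(validate_school_record(record))
```

Flagged records could then be routed back to the school or RED for correction before entering the central database, rather than being caught only after publication.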

The Ministry should ensure that the new EMIS is adequately staffed and that staff are sufficiently trained to develop and manage the new system. At present, the Ministry plans to assign around six technicians to operate the new EMIS, but the precise number of staff and their roles have not been confirmed. EMIS staff will also require training and ongoing support. While the Ministry already has programmers who can code education data based on specific requests (e.g. to determine how many textbooks are needed for a given class and school year), it has only a limited number of statisticians who can analyse available data to inform policy. In many OECD countries, staff with different areas of expertise (for instance, in data management and analysis, each with different data access rights) manage the EMIS and have access to professional development opportunities. To build staff capacity to implement Bulgaria’s new EMIS, the Ministry should identify the roles and skills needed to manage the system (Abdul-Hamid, 2014[9]). The actions outlined above (mapping data definitions, regularly identifying new indicators and revising data protection protocols) offer examples of the types of tasks that EMIS staff could complete. Other tasks might include responding to glitches in the system, developing feedback loops and training for users, and seeking partnerships with other data producers. Depending on the tasks identified, the roles within this team could include not only technicians but also statisticians and/or personnel with legal training.

While the finalisation and implementation of Bulgaria’s new EMIS should be one of the Ministry’s main priorities, it is important that this new system evaluation tool is easily accessible and can support accountability and policy making. This warrants reflection on how to ensure that the new EMIS supports robust monitoring of progress against national goals, that its data are accessible to the public and that they can be used in a variety of ways to support system evaluation efforts. This is important to ensure that the government has the information it needs to conduct evaluations and inform policies for system improvement and that the public has the information it needs to participate actively in efforts to improve system performance.

Many OECD countries now provide open access to their education data, meaning that external users (i.e. those outside government agencies) can access data through a public web portal. Open access to government data can strengthen trust and transparency in the education system and is an effective way to generate new insights on system performance. At the same time, providing access to student- and teacher-level data makes the anonymisation and protection of this information even more critical (see Recommendation 5.1.1). Bulgaria already publishes some education data through its annual statistical yearbook Education in the Republic of Bulgaria. However, the yearbook provides a limited snapshot of the available data and presents them in Portable Document Format (PDF), which prevents cross-source analysis and offers no user-friendly way to manipulate the data or conduct original analysis. To access other official education data, external parties must submit a written request to the Ministry, which then shares the request with its Center for Information Services (CIS), which provides the data in table form. While it is positive that Bulgaria has routes for researchers to access a variety of education datasets, there are currently no tools that make data a more accessible and functional resource for education stakeholders and the public.

Bulgaria’s investment in the new EMIS will accrue the greatest gains if a wide range of users can easily access and use the data. A public interface with a range of analytical functions is an increasingly common feature of EMIS platforms in OECD countries and can help generate demand for system evaluation. While it is positive that the Bulgarian government contracts external experts to conduct education research for specific projects or issues, there are no tools for public actors to easily review and analyse education data. Creating a public interface with a sophisticated range of functions for the new EMIS would allow users to analyse data, visualise the findings and export information in a variety of formats. The Ministry should prioritise sharing a balanced set of indicators, possibly through a digital dashboard, covering not only administrative data (e.g. the number of schools) but also inputs (e.g. levels of funding) and outcomes (e.g. external assessment data). To contextualise these data, the platform should allow users to disaggregate anonymised data, for example by student socio-economic background, gender or geographic region. Bulgaria’s existing EMIS cannot generate these types of graphs or other data visualisations, and adding these functionalities to the new EMIS would help convince stakeholders, such as school principals, of the utility of accurate reporting.
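As a minimal illustration, the kind of disaggregation such a dashboard could offer might look like the following sketch. All records, field names and values are invented; in practice the inputs would be anonymised extracts from the EMIS.

```python
# Sketch of dashboard-style disaggregation of an outcome indicator
# across one contextual dimension (illustrative data only).
from collections import defaultdict
from statistics import mean

records = [
    {"region": "Sofia", "gender": "F", "nea_score": 62},
    {"region": "Sofia", "gender": "M", "nea_score": 58},
    {"region": "Vidin", "gender": "F", "nea_score": 47},
    {"region": "Vidin", "gender": "M", "nea_score": 45},
]

def average_by(records, dimension, value_key="nea_score"):
    """Group anonymised records by one dimension and average an outcome."""
    groups = defaultdict(list)
    for r in records:
        groups[r[dimension]].append(r[value_key])
    return {k: round(float(mean(v)), 1) for k, v in groups.items()}

print(average_by(records, "region"))   # e.g. {'Sofia': 60.0, 'Vidin': 46.0}
```

The same function applied to other dimensions (gender, socio-economic background) would populate the different disaggregation views a public dashboard could expose.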

Without more deliberate data collection, the Bulgarian government will continue to lack the information it needs to conduct system evaluation and inform education policies. Bulgaria has defined a set of indicators to review its long-term education strategies. However, in a number of cases, this indicator framework does not provide adequate information to measure the desired objectives. For instance, to support the national goal of improving inclusive education, Bulgaria should regularly collect data to track learning outcomes among students from different ethnic groups; such information was last available in the 2013 cycle of the NEA. Mapping the national indicator framework against available sources of information can help the Ministry identify information gaps and signal where the new EMIS and other data collection tools, like the NEA, need to improve in order to better measure progress. This can also help improve accountability for system performance and co-ordinate policy efforts. The Ministry should consider carefully which indicators it would like to retain and which it would like to replace, while ensuring consistency for the duration of the strategic planning cycle (i.e. until 2030). New indicators could be included as part of the development of the first mid-term implementation plan under the Strategic Framework for Education.
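The mapping exercise described above can be sketched very simply: each indicator is listed against its available data sources, and indicators without a source are flagged as gaps. The indicator names and sources below are purely illustrative, not Bulgaria's actual framework.

```python
# Hypothetical mapping of strategy indicators to data sources,
# flagging indicators that currently have no source (a data gap).

indicator_sources = {
    "dropout rate": ["EMIS"],
    "share of early school leavers": ["NSI labour force survey"],
    "learning outcomes by ethnic group": [],   # no current source
    "student well-being": [],                  # no current source
}

gaps = [name for name, sources in indicator_sources.items() if not sources]
print(gaps)
```

Even this simple inventory makes explicit which national goals cannot currently be monitored, and therefore where new EMIS modules or NEA background questionnaires would add the most value.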

In particular, the Ministry might construct indicators around student engagement and wellbeing. Research suggests that low engagement could be an important driver of low educational attainment in Bulgaria, with studies reporting that Roma students are less likely to see the benefits of education and that truancy levels are significantly higher in Bulgaria than in other PISA-participating economies. Student engagement and wellbeing are likely to become still more important in the future: the government’s decision to add an additional compulsory pre-primary year will be costly unless it is accompanied by changes in attitudes towards learning that see more students motivated to learn, rather than obliged to attend.

The stated objectives of Bulgaria’s NEAs (Table 5.3) are broadly positive and reflect the main purposes of national assessment systems commonly found in OECD countries: national assessments as a tool to help monitor student progress against learning standards, inform policy making at the system level and support teaching and learning at the school level. Bulgaria established its NEA in 2007 and has gradually expanded coverage to collect system- and student-level data for three grades of schooling. However, the design and implementation of the NEAs hinder their capacity to serve as an effective system-monitoring and formative tool. Bulgaria’s broader assessment culture, which emphasises assessment as a validation exercise rather than an integrated part of the learning process, further reinforces the perception that NEAs are summative and have high stakes for students. The selection function attributed to the Grade 7 NEA is of particular concern, as the assessment functions in reality as an examination, undermining its intended monitoring and formative purposes.

If Bulgaria wants to rely on these assessments to produce data that can guide system improvement and support learning, the NEA should be decoupled from its selection function. Bulgaria should also prioritise investments in essential psychometric resources to strengthen its national assessment system, as the NEA currently lacks proficiency scales that link to national learning standards, calibrated test items and a criterion-referenced scoring system. Without these elements, the NEA can help rank students in a particular cohort by their achievement levels but cannot meaningfully support learning or inform system evaluation by generating reliable trend data. There is a general awareness of these problems within Bulgaria’s central government and education agencies but reforming the NEA will require political will as well as financial resources and technical capacity.

Bulgaria’s national assessment system currently has multiple purposes, including monitoring system performance and curriculum implementation, measuring individual student progress and selecting students into secondary school. While national assessments can serve a variety of purposes, fulfilling different goals requires different design decisions (Newton, 2007[10]). In Bulgaria, the conflation of purposes attributed to the national assessment system makes it difficult for policy makers and the Center for Assessment to determine which design options would best ensure the NEA fulfils its stated goals. For example, over the last decade the Grade 7 NEA has switched between census and sample administration several times with limited consultation, leading to confusion among stakeholders about the main role of the assessment (i.e. whether it is for system monitoring or for selecting students). This combination of purposes also creates a gap between the assessment’s intent and how it is implemented in practice.

It is therefore crucial that the Ministry, with the support of other stakeholders, such as the Center for Assessment, narrow down the primary purposes of the NEA to focus on monitoring system performance and providing formative information to support teaching and learning. Specifically, the Bulgarian government should remove the selective function of the NEA in all grades. Now is an opportune moment to consider such a major change to the national assessment system, since it would give Bulgaria a chance to align NEA instruments with the new competency-based curriculum that was recently rolled out. However, these changes will need to be communicated to the stakeholders who will be most affected, namely educators, parents and students. Outreach efforts will be crucial to helping build a more comprehensive understanding of student assessment in Bulgaria.

While examinations, like assessments, provide data on student knowledge, there are important differences in the main purpose of these testing instruments. Examinations typically help make decisions about student progression, by certifying achievement and/or selecting students into the next level. At present, Bulgaria’s Grade 7 NEA is a high-stakes examination for students since it determines what secondary school students attend (see Chapter 2). The NEAs in Grades 4 and 10 also carry some consequences for students, although to a much lesser extent, since only a minority of students change schools at these levels. The use of the NEA as a selective tool leads to confusion about its role in the Bulgarian education system and has adverse consequences, such as encouraging private tutoring and putting pressure on young students. If the NEA is to serve as a tool for monitoring and improving the education system, it cannot have any direct consequences for students. To make this distinction, the Grade 7 examination should be replaced with a new low-stakes assessment (see below). A separate selective examination could be administered for students who wish to compete for places in elite schools (see Chapter 2) or earn specific qualifications (e.g. in foreign languages).

One of the main barriers to successfully implementing education policy is the lack of recognition that “…the core of change processes requires engaging people” (Viennet and Pont, 2017[11]). When introducing any changes to the NEA system, the Bulgarian government should actively engage with a range of stakeholders to communicate clearly the objectives, rationale and processes of reform. These efforts will help ensure that changes to the NEA system are well understood and that the new national assessment is considered an integral part of Bulgaria’s shared vision for student assessment (see Policy issue 2.1. in Chapter 2). The most important change to the NEA system recommended by this review would be to discontinue its selective function, especially in Grade 7. Since the Grade 7 NEA has historically been perceived as a selection tool, changing this understanding will likely be a long and complex process. It is therefore important that the Ministry explain to policy makers, school principals and teachers, as well as parents and students, why this change is being made. Specifically, the Ministry should emphasise that a more formative, low-stakes national assessment will not only help identify levels of student performance but also inform pedagogy and help measure progress towards national education goals. For example, the Ministry could organise virtual workshops with principals and teachers to discuss how such a tool would be particularly important to help assess and address learning losses following the COVID-19 pandemic.

The main purposes of national assessment systems in most EU and OECD countries are to support system monitoring, provide formative information about learning and serve as an accountability tool (OECD, 2013[2]). Bulgaria’s NEA has similar stated purposes, in addition to a student selection function. However, the design of the NEA instruments does not provide data to monitor progress over time, nor does it support accountability. For example, the NEA is not included as an indicator in the national goals set out in the Strategic Framework for Education, and the lack of reliable trend data prevents the NEA from monitoring curriculum implementation. Moreover, considering the NEA is a census assessment, the absence of comprehensive reporting of results, and of support to help schools and teachers use this information, represents a missed opportunity for the NEA to inform pedagogy. As Bulgaria works to strengthen its national assessment system, the government will need to reflect on several key decisions about the NEA’s design, as outlined in Table 5.4. The following section provides recommendations on how Bulgaria could reinforce the assessment’s system monitoring function and maximise its formative potential as a tool for driving system improvement.

Comparability is key for an assessment whose main purpose is to monitor educational progress. Currently, NEA results cannot be compared across years because test items are developed using classical test theory, without psychometric calibration, and scores are reported as non-transformed raw points (Danchev et al., 2015[14]). Center for Assessment officials acknowledge that this scoring system prevents the NEA from generating trend data to monitor the education system over time. However, there have been no changes to the design of NEA instruments, due to capacity constraints within the centre, as well as the fact that the NEA continues to be used for selective purposes, i.e. school admission. Moving to a criterion-referenced scoring method for the NEA should be a top priority, as this would allow the assessment to fulfil its stated purposes of measuring the progress and implementation of education reforms, as well as the extent to which students are achieving national learning goals.

To develop and implement the NEA as a criterion-referenced assessment, the Center for Assessment must define performance levels and align these with Bulgaria’s national learning standards. Such details should be described in technical documents alongside proficiency scales, rules on developing items and other test specifications. This practice will promote greater transparency in the assessment system and allow researchers and experts to critically evaluate and provide feedback that can lead to further improvements in the NEA’s instruments. The Bulgarian government should ensure the Center for Assessment has the adequate financial capacity and assessment expertise to implement this important change and develop the associated technical documents.
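As a simplified illustration of criterion-referenced reporting, a scaled score can be mapped onto proficiency levels defined against learning standards. The cut scores and level names below are invented for illustration; in practice they would come from a formal standard-setting exercise aligned with Bulgaria's curriculum.

```python
# Sketch of criterion-referenced reporting: map a scaled score onto
# proficiency levels (hypothetical cut scores and level names).
import bisect

CUT_SCORES = [400, 480, 560]   # level boundaries on a scaled metric
LEVELS = ["Below basic", "Basic", "Proficient", "Advanced"]

def proficiency_level(scaled_score):
    """Return the proficiency level a scaled score falls into.

    A score exactly on a cut score counts towards the higher level."""
    return LEVELS[bisect.bisect_right(CUT_SCORES, scaled_score)]

print(proficiency_level(455))   # "Basic"
```

Unlike raw-point ranking, this kind of reporting states what students at each level can do relative to the standards, and, once items are calibrated onto a common scale, the share of students at each level becomes comparable across years.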

To better support the formative and monitoring purposes of its national assessment system, Bulgaria should consider changes to the NEA’s target population and administration timeline. Such changes would help distance the assessment from its previous role as a selection instrument and send a strong signal about the refined purposes of the NEA system. This review team recommends the following configuration:

  • Move the census-based primary school NEA to Grade 2. At present, Grade 4 marks the end of the initial stage of primary education in Bulgaria and the first academic year that students take an external standardised assessment. While results from international assessments (PIRLS and TIMSS) and the Grade 4 NEA can support system monitoring, NEA results at this level also select a minority of students into elite schools. This review recommends that Bulgaria eliminate the Grade 4 NEA and replace it with a full cohort assessment in Grade 2. Importantly, the new Grade 2 NEA should not serve as a selection instrument but rather a formative tool to support system monitoring and student learning. The Grade 2 assessment also needs to be appropriate for young learners (age 8) and have a faster turnaround of results so that teachers can use them to support student learning (see Recommendation 5.2.3).

    Many OECD countries already administer national assessments in at least one grade of primary education and having student-level results earlier will give teachers in Bulgaria more time to identify and address learning gaps before they become problematic. Since most students in Bulgaria do not change schools during the initial stage of primary education, eliminating the Grade 4 NEA will also help delay student tracking until the beginning of secondary school. This change may require elite primary schools to develop new admission systems, which could be based on the number of places available, grade point average or other pre-established criteria. Keeping the Grade 2 NEA as a census will not only complement diagnostic and classroom assessments but also provide reliable information about system performance at a different level of primary education because policy makers will already have system-level data for Grade 4 students through international assessments.

  • Consider administering the NEA during lower secondary education. Once the Grade 2 NEA is well established, Bulgaria could consider re-introducing a national assessment during lower secondary education. While the Grade 7 NEA marks the end of lower secondary education (ISCED 2), it is mainly used to select students into the most competitive upper secondary schools in the country. This review recommends several changes to the way that Bulgaria allocates students into secondary schools (see Chapter 2). However, keeping the NEA in Grade 7 will make it difficult to signal that the new national assessment is a purely formative and system-monitoring tool. Moving the lower secondary NEA to a different grade (e.g. Grade 6) and making it a sample would help reinforce the understanding that this assessment does not carry any stakes for students while still providing valuable information about learning outcomes close to the end of a curriculum cycle.

  • Consider having an NEA during upper secondary education. Bulgaria should reconsider the design of the Grade 10 NEA. This is an important transition point from a policy perspective and having achievement data at this level can provide information about the extent to which students have mastered national learning standards at the end of compulsory schooling. At the same time, Bulgaria already implements its Grade 12 Matriculation examination, which provides learning data at the end of upper secondary education (see Chapter 2). In the long term, if Bulgaria would like to have reliable data in the first years of secondary school (PISA could play this role but does not provide information on the extent to which students are mastering the national curriculum), it could continue implementing its Grade 10 census assessment. The Grade 10 assessment would not only inform policy at the system level but its results could also feed into Bulgaria’s external school evaluation framework (see Chapter 4). However, in the context of limited financial resources and to avoid testing fatigue, this assessment should take place every three years rather than annually.

Since teachers administer the school-based diagnostic tests at the start of each academic year, Bulgaria should consider moving the NEA’s administration to the middle of the school year to ensure the assessments do not overlap and create a testing burden on schools and students. Positioning the administration of the NEA in the middle of the school year would also further distinguish the national assessment as a system monitoring tool with low stakes for students. Table 5.5 provides a summary of the proposed changes to Bulgaria’s national assessment system.

It is common for national assessments to cover literacy and numeracy, as these skills provide a foundation for learning. Among OECD countries with national assessments at the lower secondary level, around 64% test students in literacy and 60% test students in mathematics on an annual basis (Maghnouj et al., 2019[15]; 2015[16]). This review recommends that Bulgaria continues, over the medium term, to cover mathematics and Bulgarian language and literature in all grades assessed by the NEA. This arrangement will ensure consistency across testing instruments and allow researchers to conduct longitudinal analysis.

If Bulgaria administers the NEA as a sample to older cohorts, the optional foreign language examination should be removed from the suite of national assessment subjects. At present, foreign language tests in English, French, German, Italian, Russian and Spanish are an optional part of the Grade 7 and 10 NEAs and lead to a qualification in accordance with the Common European Framework of Reference for Languages (Ministry of Education and Science, 2021[4]). However, the inclusion of these tests as part of the NEA creates stakes for students and seems to reflect Bulgaria’s historical emphasis on elite foreign language schools that prepare students for study and work abroad. Considering the costs associated with developing and implementing tests in various foreign languages, these subjects should be distinct optional examinations that are separate from the NEA system to avoid detracting investments from core subject tests that align more closely with national priorities.

Instead of focusing on foreign languages, Bulgaria could consider broadening the knowledge areas covered in later grades of the NEA to increase the assessment’s validity in terms of curriculum coverage. For example, if Bulgaria chooses to have a national assessment during upper secondary education, this assessment could cover digital literacy, which is considered a fundamental 21st-century competency, or civic, ecological and intercultural education, which are among the core competency areas identified in Bulgaria’s Pre-school and School Education Act (2016). The government already has legislation that enables such subjects to be covered within the NEA. Bulgaria could administer these subjects on a rotating basis. Australia uses a similar approach, measuring different subjects each year, which helps to reduce the cost of developing and administering multiple tests at the same time.

The current NEA system already includes a mix of multiple-choice, open-ended and (in Grades 7 and 10) essay writing tasks. Having a range of item types can help measure a wider range of skills, including the higher-order thinking skills reflected in Bulgaria’s new curriculum. This is positive considering the country’s Strategic Framework for Education identifies strengthening the competency-based approach to education as a national priority. However, while this review did not examine sample questions from the existing NEA, the Center for Assessment informed the OECD review team that the content of NEA questions does not currently align with Bulgaria’s competency-based curriculum and tends to focus on memorising and recalling knowledge rather than applying critical thinking skills. Bulgaria also acknowledged the need to develop more complex open-ended questions in the State Educational Standard for the Evaluation of the Results of Student Learning (Ordinance 11).

These changes will require revising the NEA’s framework to ensure that test items do not encourage memorisation and that proper item-writing conventions are followed, such as reviewing tests and items for potential bias and varying the placement of distractor choices (i.e. incorrect options in a multiple-choice test) (Anderson and Morgan, 2008[17]). Distractor choices should also represent common mistakes made by students. These changes will enable the NEA to monitor the implementation of the national curriculum and learning goals, which are among its stated purposes. Moreover, although Bulgaria has in theory already finalised the implementation of the new curriculum, teachers still struggle to integrate new educational approaches into their classroom practice (see Chapter 2). Aligning the NEA to Bulgaria’s competency-based curriculum can therefore provide a helpful model for how teachers can draft test questions that assess transversal and higher-order competencies.

Many factors influence the learning process, from the classroom environment and teacher quality to students’ socio-economic background and school location. As a result, most national assessments include background questionnaires to collect information about students, teachers and schools, which can be analysed to help contextualise results. By identifying where interventions could help improve performance and the overall learning experience, this information can inform policy, which is one of the primary purposes of Bulgaria’s NEA. Currently, the NEA does not collect comprehensive background information on factors that may influence learning outcomes. Some contextual information (e.g. level of parental education) is available from other administrative databases; however, these are not easily linked to the NEA database. As Bulgaria develops and implements a new NEA system, the Center for Assessment could identify what kind of contextual information is already (or will be) collected through Bulgaria’s EMIS and make sure it is used when presenting NEA results. For areas or topics that might not be covered by the EMIS, the Center for Assessment could create background questionnaires to address topics of interest. For example, having questions on student wellbeing in the post-COVID-19 context could reveal insights on how students have coped with disruptions to schooling. Targeted background questionnaires (for instance, questions to help classify the student’s socio-economic background) could also provide more regular and timely data to monitor Bulgaria’s national education goals, thereby reducing the country’s reliance on discrete indicators from international assessments and surveys.
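As a minimal sketch, linking assessment results to EMIS contextual data could work through an anonymised student identifier shared by both databases. All field names and values below are hypothetical.

```python
# Illustrative join of NEA results with EMIS contextual data via an
# anonymised student identifier (hypothetical field names and values).

nea_results = {"s001": 61, "s002": 44}          # scaled score by pseudonym
emis_context = {"s001": {"ses": "high"}, "s002": {"ses": "low"}}

linked = [
    {"id": sid, "score": score, **emis_context.get(sid, {})}
    for sid, score in nea_results.items()
]
print(linked)
```

With such a link in place, results can be contextualised (e.g. average scores by socio-economic background) without asking students to re-report information the EMIS already holds, which keeps background questionnaires short and focused on topics the EMIS does not cover.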

Although most countries still administer their national assessments via paper and pencil, a growing number are introducing computer-based assessments (CBAs) (Clarke and Luna-Bazaldua, 2021[18]). The advantages of moving to CBAs are significant, especially in terms of increasing test reliability, since a CBA is less likely to be affected by human error and integrity breaches. CBAs are also considerably cheaper to administer in the long term and deliver results more quickly. Bulgaria has recently moved to mark all NEA submissions electronically, and marking will now be done by a randomly selected regional commission, rather than a commission from the same region as the school, increasing the reliability of the national assessment system. In the future, Bulgaria could conduct a feasibility study to evaluate the system’s readiness for administering a computer-based assessment. Once any technical and financial concerns have been addressed, Bulgaria should transition the NEA from paper to computer delivery.

National assessments are only as valuable as the extent to which different stakeholders understand and use their results. While revising its national external assessment system, Bulgaria should consider how to disseminate NEA results so they inform policy and support school improvement efforts. While it is positive that the Ministry commissions ad hoc analysis of NEA data, there is no regular report that summarises results and provides relevant insights for policy making. At present, the Center for Assessment shares school-level NEA results in a digital format that compares a particular school with regional and national averages. Since 2018, schools have also been required to publish their average NEA score on their website, which aims to promote transparency and accountability. However, schools do not receive detailed information about how their students performed and stakeholders cannot make comparisons based on similar characteristics like socio-economic background. This also leads to media outlets ranking school performance, which perpetuates a narrow understanding of school quality and undermines the formative potential of the national assessment system.

For individual students, NEA scores are presented via a private online portal as well as in the students’ relevant certificate of completion at the end of each education phase (i.e. initial primary; basic education; initial high school). Providing a raw score on student certificates contributes to the perception that NEA results have consequences, which in reality is only true for students who wish to compete for places in elite schools. Moreover, while teachers have access to their students’ NEA scores, the results do not inform teaching and learning processes. For example, teachers receive no analysis of how students perform on particular test items to identify common errors. Such materials can serve as a basis for developing strategies to address areas of low performance. Moreover, the lack of information about the NEA’s proficiency levels and other technical details prevents the assessment from helping develop teacher assessment literacy. To make the most of its investments in strengthening the NEA, Bulgaria will need to develop a comprehensive strategy to disseminate results in ways that support the assessment’s potential as a formative tool for system evaluation.

Census-based testing generates data that allow schools to compare their average performance with other schools. While this level of comparison allows for greater transparency and can support broader school accountability measures, it often results in schools with the greatest concentration of students from more advantaged backgrounds continually being considered the most effective. It also undermines the potential formative function of these assessments. Using Bulgaria’s census-based NEA results, media outlets sometimes rank regions and schools without any contextual information. Although it is hard for the Ministry and the Center for Assessment to impede such actions, it is important that they monitor how NEA data are presented and make it easier for actors to report more contextual information, for example by making comprehensive information on the NEAs available online (see previous section). It would be more appropriate, for instance, to compare the NEA results of schools in the same geographic location, with similar student populations (i.e. students with similar socio-economic backgrounds) and comparable structures (i.e. comparing multi-grade schools with each other) (Box 5.1). This would encourage more meaningful benchmarks of performance. The Ministry should also take steps to reduce the association of NEA results with high stakes for schools and teachers. Anecdotal evidence suggests that it is common for school principals to use student NEA results primarily as a way to assess the quality of teachers, which can encourage practices that are detrimental to student learning, such as teaching to the test (see Chapter 3). A more appropriate use of census-based NEA data would be to inform the Inspectorate’s risk assessment formula to identify which schools it should prioritise for external evaluation (see Chapter 4).
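The peer-group comparisons described above can be illustrated with a minimal sketch. The example below groups hypothetical schools by region and socio-economic band and benchmarks each school’s average NEA score against its peer-group average; all school names, bands and scores are invented for illustration, not actual NEA data.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical school records: every name, band and score below is
# illustrative only, not actual NEA data.
schools = [
    {"name": "School A", "region": "Sofia", "ses_band": "low",  "nea_avg": 42.0},
    {"name": "School B", "region": "Sofia", "ses_band": "low",  "nea_avg": 55.5},
    {"name": "School C", "region": "Sofia", "ses_band": "high", "nea_avg": 71.0},
    {"name": "School D", "region": "Vidin", "ses_band": "low",  "nea_avg": 47.5},
]

def peer_benchmarks(records):
    """Average NEA score within each (region, socio-economic band) peer group."""
    groups = defaultdict(list)
    for r in records:
        groups[(r["region"], r["ses_band"])].append(r["nea_avg"])
    return {key: mean(scores) for key, scores in groups.items()}

# Compare each school with its peer-group average rather than a raw ranking.
benchmarks = peer_benchmarks(schools)
for r in schools:
    peer_avg = benchmarks[(r["region"], r["ses_band"])]
    print(f'{r["name"]}: {r["nea_avg"]:.1f} vs peer average {peer_avg:.1f}')
```

The point of the design is that a school is only ever compared with schools facing similar circumstances, which yields more meaningful benchmarks than a single national league table.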

Reporting results to the public is an important part of using the NEA to promote transparency and inform education policies. While it is positive that Bulgaria publishes NEA results online, the information available to the public is very limited, with no in-depth analysis or user-friendly platforms that allow users to explore the data. Moreover, Bulgaria does not have a national report that assembles NEA results and other education data to inform policy making. For the NEA to fulfil its purpose as a system monitoring tool, its data should be part of a comprehensive state of education report (see Recommendation 5.3.1). The NEA results should be presented at the national and regional level but also disaggregated by characteristics relevant to education policy, such as gender, school type and student socio-economic status (see Recommendation 5.2.2). This type of analysis and reporting can help identify learning disparities and provide evidence for targeted policy interventions. Once the NEA scoring system has become criterion-referenced, trend analysis of the NEA data should also be included in the state of education report to inform stakeholders about progress over time. Such efforts would help generate greater public accountability for system performance.

In addition to the state of education report, Bulgaria should also consider other ways to make data from the new national assessment system more accessible to stakeholders. As the Ministry develops and implements its new EMIS (see Policy issue 5.1), there are plans to link data from the NEA and other information on student performance with the administrative database. NEA results could then be linked to figures such as student-teacher ratios, teacher qualification levels and school funding. This kind of information should be made available in the suggested EMIS public interface with built-in analytical functions (see Recommendation 5.1.2), allowing for more comprehensive dissemination of NEA results. Finally, in order to make NEA information even more accessible and visible, the Ministry could dedicate a page on its official website to the national assessment system. This webpage would allow the public to find technical documents regarding the NEAs and a link to the new EMIS, where NEA results, along with other indicators, will be available.

In addition to monitoring system progress, national assessments can also inform pedagogy by providing timely and reliable feedback on student learning. At present, Bulgaria’s NEA system provides limited information to support improvement in teaching and learning processes. For example, teachers struggle to understand how the NEA can support their teaching practices. This is partly caused by the lack of targeted dissemination mechanisms that would allow teachers to better understand and use NEA results to support learning in their classrooms. There is also an absence of support to help schools use average results to inform improvement efforts. Census-based testing like the NEAs (for Grades 2 and 10) could generate reports for a more formative use at different levels of the education system (OECD, 2013[2]) (see Box 5.2 for different report level examples). Bulgaria could consider having the following reports:

  • School-level reports, presenting the performance of individual schools with benchmarks for contextualised comparisons such as schools from the same region, same district and socio-economic level. These reports can also contain analysis of individual questions, topics or skills so that teachers and school principals can identify in what areas and with what competencies students struggle the most. To ensure that detailed school information is not used for narrow accountability purposes, detailed school-level reports should only be accessible to school-level officials, while the Inspectorate can continue to use school averages to inform its risk assessment formula (see Chapter 4).

  • Report for teachers, at the classroom level, containing data on the extent to which each student in a class has achieved national learning standards. The report should contain information on how students perform on each item of the NEA (i.e. item-level analysis), emphasising areas in need of improvement. This information should be presented alongside contextualised comparison groups, such as gender, linguistic minorities, socio-economic background, etc. Providing such data to teachers would be crucial to help them engage with the results in a formative manner. Teachers can use this report to identify and support the individual learning needs of students and benchmark their in-classroom assessments. As this report contains confidential information about individual students, it should only be available to school-level officials.

  • Report for students, for the Grade 2 assessment. This report should provide information about the extent to which an individual student has achieved expected learning goals, as set out in the curriculum. Results could be compared to national, regional and other relevant benchmarks. Care should be taken to avoid the perception that results carry stakes, and results should no longer be included in the students’ relevant certificate of completion at the end of each education phase. Instead, results could be discussed as part of regular parent-teacher meetings. Teachers could be given guidance on how to discuss the results within broad categories of the student meeting or not meeting national expectations, rather than focusing solely on specific scores. These reports should only be accessible to individual students (or their parents) and relevant teachers.
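The item-level analysis proposed for the teacher reports above can be sketched in a few lines. The example below compares a class’s share of correct answers on each test item with a national “facility” value (the share of students answering correctly nationwide) and flags items where the class falls well below it; the data, item names and threshold are hypothetical.

```python
from statistics import mean

# Hypothetical item-level results for one class: 1 = correct, 0 = incorrect.
# Item names and the national facility values are illustrative only.
class_responses = {
    "item_1": [1, 1, 0, 1, 1],
    "item_2": [0, 1, 0, 0, 1],
    "item_3": [1, 1, 1, 1, 0],
}
national_facility = {"item_1": 0.75, "item_2": 0.60, "item_3": 0.80}

def flag_weak_items(responses, national, gap=0.15):
    """Flag items where the class's share of correct answers falls well
    below the national share (a simple facility-index comparison)."""
    flagged = []
    for item, answers in responses.items():
        class_facility = mean(answers)
        if national[item] - class_facility > gap:
            flagged.append((item, round(class_facility, 2), national[item]))
    return flagged

print(flag_weak_items(class_responses, national_facility))
```

A report built this way points teachers directly at the competencies their class struggles with, rather than leaving them with a single average score.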

Bulgaria has undertaken fundamental education reforms. At the same time, financial resources are limited and the Ministry will need to make hard choices about where money is spent. These decisions are further complicated by the COVID-19 pandemic, which has worsened existing education challenges and introduced new ones. A rapid assessment conducted by the United Nations Children's Fund (UNICEF), for example, suggested that at least 50 000 school-age children in Bulgaria experienced significant learning disruption during the crisis and a fifth of surveyed students reported performing worse (UNICEF, 2020[21]). The closure of schools is likely to have had the biggest impact on certain vulnerable groups, such as children from lower-income households or with disabilities who may not have had access to parental support, resources for learning or customised pedagogical support. These students will likely need additional support to overcome learning gaps. Moreover, new funding streams have emerged to support recovery, such as the Next Generation EU COVID recovery package, and the Bulgarian government will need to decide how to allocate this funding outside pre-existing planning cycles. Moving forward, these policy choices may be contentious, making trust and clarity in decision making ever more important.

Positively, Bulgaria already disseminates basic information on the features of the education system and carries out ad hoc policy evaluations and research. These efforts demonstrate a commitment to transparency and improving policy through evidence. However, Bulgaria’s education data are not regularly analysed, disseminated and used to guide system improvement strategically. To achieve its 2030 vision and the objectives of the Strategic Framework for Education, Bulgaria needs to keep sight of its stated goals and more clearly communicate its reform agenda to education stakeholders across the country, not least the REDs. These efforts will be critical to building trust with the wider public, avoiding major roadblocks and helping the Ministry crowdsource new solutions – especially since the country’s education reforms are likely to affect different school environments in different ways.

The Bulgarian government does not produce regular reports on how the education system is performing. This is problematic because it means that different education actors – not least, the Ministry and its REDs – do not have timely and comprehensive analysis to flag issues, track progress and support evidence-based policy debates. Performance monitoring has been hindered by data constraints but also by capacity constraints. Currently, system monitoring and evaluation is the responsibility of the Ministry’s Strategic Planning Unit (SPU) but this unit does not have sufficient resources to monitor system performance regularly. The unit’s location within the Ministry also makes it vulnerable to political influence, which would undermine confidence in its analysis and reporting. Many OECD countries have established independent bodies to ensure regular, objective monitoring of the education system and to commission research on major policies and issues. The independence afforded to these bodies enables them to conduct rigorous, objective analyses of data and present messages that may challenge education authorities. Their autonomy strengthens trust in their findings and the likelihood that their research will be used to inform constructive debates between different stakeholder groups – particularly among those that are distrustful of the government or with a specific interest to defend.

The body should be mandated to produce regular reports on the performance of the education system. It would also be well placed to produce evaluations of major education policies and carry out research on progress towards strategic goals. Its reporting should cover both general education and VET, in order to track how students perform and move across different pathways. Since resource constraints may preclude the creation of a new stand-alone agency, the Bulgarian government should consider establishing a high-level board or commission that would set a research agenda, commission the work and ensure that this work is robust. This body should comprise representatives of different education agencies (such as the Ministry, the National Agency for Vocational Education and Training and the Inspectorate) so that its work gets buy-in from the government, as well as independent, non-state actors to safeguard objectivity. It should have a clear legal footing that supports its independence.

The research itself should be conducted by independent experts or by existing bodies such as the Inspectorate. The Inspectorate should be considered, in particular, to compile regular reports on system performance. This has been the approach in the Netherlands, where the Dutch Inspectorate for Education compiles the country’s periodic state of education report and other system evaluations and research. The benefit of this second approach is that the Inspectorate can capture a more qualitative picture behind headline quantitative metrics and thus translate education data into an analytical narrative. Tasking the Inspectorate with conducting system-level evaluation and research could also help the body to plan its annual activities and place inspections in a broader context of how the system is performing.

The high-level board or commission should establish a multi-year research agenda. Such an agenda would strengthen the legitimacy of the body, particularly if it is linked to Bulgaria’s national development strategy and national budgeting processes. Setting a multi-year research agenda could also help to provide a new, independent organisation with a sense of mission from the start. The research agenda should include a regular analytical report on the education system as well as a set of discrete tasks, such as ex ante or ex post impact assessments of major government policies and/or reports on important thematic issues.

The body may choose to focus initially on ex post impact assessments. The Bulgarian government has recently implemented two policies that entail significant costs – namely, a policy to increase teacher salaries by a significant margin and the introduction of an additional (compulsory) pre-school year. The research body could conduct evaluations on these policies a couple of years after they were first implemented, to ascertain whether they are achieving their aims and are an efficient use of taxpayer money or if they should be adjusted. In determining how to increase teacher salaries, for instance, a set of complex factors need to be taken into account (Li et al., 2019[24]), which include:

  • Projections on retirement.

  • The long-term fiscal impact of more teachers moving up the salary scale.

  • How the new salary structure can best incentivise improvement and reinforce other initiatives.

  • The trade-offs between salary increases and other investments that could support system goals.

Over time, the body could also develop reports on important thematic issues. These could focus on topics associated with significant public funding or on recurrent issues highlighted in the annual report. For instance, Bulgaria has recently split its primary and secondary education into two additional stages and there are concerns that this decision will track students too early and lead to the closure of schools in rural areas. A thematic study on the causes and consequences of student dropout, using longitudinal data to gauge the impact of different policy measures, could be highly informative for system planning.

One of the priority items in this research agenda could be to produce an annual or biannual analytical report on the education system. This report would provide insights on education system trends, including NEA results (see Recommendation 5.2.3) and could demonstrate how the system is performing in relation to national strategic goals. Most OECD countries regularly publish these reports and they provide important information to a variety of stakeholders and researchers. The report could also provide qualitative information, for instance insights from school evaluations and feedback from the sector’s main stakeholders. In particular, Bulgaria might consider a section on student engagement and wellbeing. Research has shown that issues around engagement could be an important driver of low educational attainment in Bulgaria – with studies reporting that Roma are less likely to see the benefits of education and that truancy levels are significantly higher in Bulgaria than in other PISA-participating economies. Student engagement and wellbeing are likely to become still more important in the future – the government’s decision to add an additional (compulsory) pre-primary year will be costly unless it is accompanied by changes in attitudes to learning that see more students feeling encouraged to learn, rather than obliged to. In Norway, the Education Mirror (the country’s annual sectoral report) uses the national Pupil Survey and PISA data to monitor information about student-teacher relationships, student motivation, the levels of home support that students receive and student wellbeing (Norwegian Directorate for Education and Training, 2014[25]). Another model for this report could be the state of education report compiled by the Dutch Inspectorate of Education (Box 5.3).

The research body’s work should be a valuable resource for different system-level actors in Bulgaria and help to inform national policy debate around education. This resource, in turn, could generate more demand for system-level evaluation and encourage more information sharing. OECD countries have thus worked to increase stakeholder engagement with their national evaluation institutions – for instance by inviting them to propose research topics, provide comments on research and evaluation findings and/or co-finance research projects.

The Ministry plays a paramount role in education policy in Bulgaria. It carries out most policy planning, regulates the sector, and territorial education authorities and schools rely heavily on central funding. At the same time, the Ministry must design education policies for implementation in very different regional contexts, which bring particular challenges and opportunities. To ensure that policies centrally planned by the Ministry meet their goals, Bulgaria must be able to track how different parts of the system are performing on a regular basis. Stronger performance monitoring would enable the Ministry to see how its policies are playing out in different contexts, identify emerging issues and recalibrate its policies and/or extend additional support. This is also important for accountability purposes – sending a signal to stakeholder groups and local communities that the government is committed to accountability, which should help to build trust and co-operation towards reform. Tools to monitor system performance against national goals would not only be helpful for the Ministry. It should also consider developing customised tools for regional and municipal authorities, which would help to guide support for improvement in school sub-systems.

The Ministry needs more information from the REDs on how its reforms are affecting teaching and learning in schools and the specific challenges that REDs are facing. While the Ministry reports very positive communication with the REDs, the review team encountered some uncertainty within the REDs around the implementation of Pre-school and School Education Act (2016) reforms, capacity constraints and uneven quality of engagement between the REDs and the schools in their jurisdiction. To ensure that the Ministry gets the information it needs, it should regularly consult with the REDs in a structured and open discussion around how each region is performing against national development goals and additional support that could be provided. These consultations could be used to gather information for the analytical state of education report (see Recommendation 5.3.1), providing qualitative information on the state of education – for instance, critical challenges faced in certain regions and noteworthy activities to address them. Finally, they could provide space for the Ministry to request additional data required to monitor progress against the Strategic Framework for Education (OECD, 2020[26]).

Bulgaria’s REDs would benefit from having more access to information on how their region is performing relative to others, to guide their work with local schools. This resource would help regional authorities to target support, signal needs to the central government and identify other regions for peer learning. Regional authorities in poorer regions may find it harder to identify peers that have successfully tackled major policy issues such as student dropouts or teacher shortages. This resource could enable REDs to provide more strategic support to schools and connect to good examples for peer learning.

A regional report on key outcome indicators could take the form of a regional “state of education” profile, based on input provided by the REDs and analysis conducted by an independent evaluation institution (see Recommendation 5.3.2). It should include national assessment data if changes are made to the NEA, as outlined in Recommendation 5.3.2. NEA results should be reported against a national average and against the average of a group of regions with similar characteristics. This report would reinforce accountability and transparency further if it is made public. The REDs could be provided with additional information that is not included in a regional profile, such as NEA results disaggregated by sub-groups within the region (e.g. municipalities).

Bulgaria has taken significant steps to improve its education data and this chapter proposes further measures to strengthen the NEA (see Policy issue 5.2), which would produce valuable data on learning outcomes. Bulgaria should optimise this investment by ensuring that the results of system evaluation are used to inform policy. Implementation planning based on evidence should be strengthened and the use of evidence to inform policy should be made visible to the public. These two measures will help the Ministry to target limited resources and build trust in its reforms among the public.

Regular implementation planning can help policy makers to sequence and adjust policy interventions, keep the implementation process on schedule and facilitate co-ordination. This is particularly important when resources are limited, reforms are new and policies may be contentious. Regular implementation planning can also help to prevent sudden policy changes, which may lead to the abandonment of planned reforms.

The Ministry should consider establishing annual or biannual action plans, linked to its mid-term strategy. This practice could help the Ministry to adjust its policies in response to new challenges and opportunities, while still keeping sight of its long-term objectives and planned reforms. This approach has been taken in Ireland, for instance, where the government produces an Annual Action Plan on Education. The action plan is formulated in consultation with important stakeholders through a variety of means – including an online call for submissions, input from other departments, regional fora, thematic workshops and other meetings with key stakeholders (Department of Education, 2017[27]).

Bulgaria could consult with the public and make the case for policy choices clearer, by dedicating a parliamentary session to discuss the findings of a state of education report (outlined in Recommendation 5.3.1). This is a practice used in many OECD countries to hold the government accountable and embed the use of evidence in the policy-making process. The report could be discussed in a meeting of the Parliamentary Committee on Education and Science, and critical stakeholders should be invited to attend. Transparency would be increased by publishing a video of proceedings on the Ministry’s website. Orienting the session around the findings of a state of education report would help the government to collect constructive feedback. The session should be attended by a high-level representative of the Ministry, who would give insights on how the results of system evaluation are being used to inform policy.

References

[9] Abdul-Hamid, H. (2014), “What matters most for education management information systems: A framework paper”, SABER Working Paper Series, No. 7, World Bank, Washington, DC, https://openknowledge.worldbank.org/handle/10986/21586 (accessed on 16 July 2018).

[17] Anderson, P. and G. Morgan (2008), Developing Tests and Questionnaires for a National Assessment of Educational Achievement, World Bank, Washington, DC, https://elibrary.worldbank.org/doi/abs/10.1596/978-0-8213-7497-9.

[18] Clarke, M. and D. Luna-Bazaldua (2021), Primer on Large-Scale Assessments of Educational Achievement, World Bank, Washington, DC, https://openknowledge.worldbank.org/handle/10986/35494 (accessed on 29 June 2021).

[14] Danchev, P. et al. (2015), Bulgaria: Piloting Statistical Models for Estimation of Schools’ Value-Added Using the Results from the National Assessments, World Bank, Washington, DC, http://hdl.handle.net/10986/24456 (accessed on 15 July 2021).

[27] Department of Education (2017), Action Plan for Education 2017, Government of Ireland, https://www.education.ie/en/Publications/Corporate-Reports/Strategy-Statement/Action-Plan-for-Education-2017.pdf.

[12] DFID (2011), “National and international assessment of student achievement: A DFID practice paper”, Department for International Development, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/67619/nat-int-assess-stdnt-ach.pdf (accessed on 13 July 2018).

[3] EBRD (2019), Bulgaria Country Diagnostic: December 2019, European Bank for Reconstruction and Development, https://www.ebrd.com/documents/comms-and-bis/country-diagnostic-paper-bulgaria.pdf?blobnocache=true.

[24] Li, R. et al. (2019), OECD Reviews of Evaluation and Assessment in Education: Georgia, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/94dc370e-en.

[15] Maghnouj, S. et al. (2019), OECD Reviews of Evaluation and Assessment in Education: Serbia, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/225350d9-en.

[6] Ministry of Education and Science (2021), Distance Learning in the Electronic Environment 2020-2021: Implications and Looking Ahead, Ministry of Education and Science of Bulgaria, https://www.mon.bg/bg/144?__cf_chl_jschl_tk__=pmd_oV9LS5oGXcNLJoldf9T9IAl528prZY51d7ljZiv1Fgw-1631627194-0-gqNtZGzNAhCjcnBszQcR (accessed on 14 September 2021).

[4] Ministry of Education and Science (2021), OECD Review of Evaluation and Assessment: Country Background Report for Bulgaria, Ministry of Education and Science of Bulgaria.

[10] Newton, P. (2007), “Clarifying the purposes of educational assessment”, Assessment in Education: Principles, Policy & Practice, Vol. 14/2, pp. 149-170, https://doi.org/10.1080/09695940701478321.

[25] Norwegian Directorate for Education and Training (2014), The Education Mirror 2014: Facts and Analysis of Kindergarten, Primary and Secondary Education in Norway, Norwegian Directorate for Education and Training.

[22] NRO (2021), Netherlands Initiative for Education Research, Nationaal Regieorgaan Onderwijsonderzoek, http://www.nro.nl/en/ (accessed on 23 September 2021).

[19] Nusche, D. et al. (2011), OECD Reviews of Evaluation and Assessment in Education: Sweden 2011, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264116610-en.

[20] NWEA (2021), MAP Suite, https://www.nwea.org/the-map-suite/ (accessed on 9 June 2021).

[1] OECD (2021), Decentralisation and Regionalisation in Bulgaria: Towards Balanced Regional Development, OECD Multi-level Governance Studies, OECD Publishing, Paris, https://dx.doi.org/10.1787/b5ab8109-en.

[26] OECD (2020), Education Policy Outlook: Czech Republic, OECD, Paris, https://www.oecd.org/education/policy-outlook/country-profile-Czech-Republic-2020.pdf.

[5] OECD (2018), Education Policy Outlook 2018: Putting Student Learning at the Centre, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264301528-en.

[16] OECD (2015), Education at a Glance 2015: OECD Indicators, OECD Publishing, Paris, https://dx.doi.org/10.1787/eag-2015-en.

[2] OECD (2013), Synergies for Better Learning: An International Perspective on Evaluation and Assessment, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264190658-en.

[13] OECD (2011), Education at a Glance 2011: OECD Indicators, OECD Publishing, Paris, https://dx.doi.org/10.1787/eag-2011-en.

[23] Santiago, P. et al. (2012), OECD Reviews of Evaluation and Assessment in Education: Portugal 2012, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264117020-en.

[8] UNESCO (2020), Efficiency and Effectiveness in Choosing and Using an EMIS: Guidelines for Data Management and Functionality in Education Management Information Systems (EMIS), http://tcg.uis.unesco.org/wp-content/uploads/sites/5/2020/09/EMIS-Buyers-Guide-EN-fin-WEB.pdf.

[7] UNESCO (2008), Education for All by 2015: Will We Make It? EFA Global Monitoring Report.

[21] UNICEF (2020), Rapid Assessment of COVID-19 impact on education in Bulgaria: Deepening learning loss and increasing inequalities, https://www.unicef.org/eca/rapid-assessment-covid-19-impact-education-bulgaria (accessed on 6 October 2021).

[11] Viennet, R. and B. Pont (2017), “Education policy implementation: A literature review and proposed framework”, OECD Education Working Papers, No. 162, OECD Publishing, Paris, https://dx.doi.org/10.1787/fc467a64-en.

Metadata, Legal and Rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

© OECD 2022

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at https://www.oecd.org/termsandconditions.