3. Guidance on improving Latvia’s indicator system and selecting EDG indicators

A robust indicator system is one that provides accurate, reliable and timely information on all aspects of the education and skills system. Such a system is necessary for monitoring and evaluating whether reforms are having the desired impact (OECD, 2014[1]). The information gathered through an indicator system will allow Latvia to monitor and adapt the implementation of the policy actions in the EDG and therefore support progress towards achieving EDG policy objectives (see Chapter 2). The benefits of a robust indicator system include informing decisions made by all relevant actors, enabling smart investments and effective resource allocation, and promoting the accountability of all stakeholder groups to improve learning outcomes for all.

This chapter provides guidance on improving Latvia’s indicator system and selecting relevant EDG indicators. It is organised as follows:

  • Section 2 describes the elements of an effective process for selecting indicators for Latvia’s EDG. It also features examples of key indicators found in effective indicator systems.

  • Section 3 assesses Latvia’s indicator system, provides an overview of the main indicator data sources and highlights important indicators that need to be developed.

  • Section 4 presents a list of potential indicators for the EDG and an overview of further considerations for each indicator.

  • Section 5 makes suggestions for how Latvia could strengthen its indicator system.

  • Section 6 provides a summary of the chapter and its recommendations.

An indicator system for an education and skills system allows a country to assess whether it is achieving its objectives through information on the human and financial resources invested in skills, how the skills system operates and evolves, and the returns on investments in skills (OECD, 2018[2]). Such an indicator system is used to monitor, evaluate and guide the implementation of education and skills policies and strategies, such as Latvia’s EDG.

This section identifies the key steps for selecting indicators:

  1. Consider a comprehensive set of indicators.

  2. Choose indicators that are based on high-quality data.

  3. Choose indicators based on their fitness for use.

  4. Prioritise and document indicators.

Having a comprehensive set of indicators is important, as different indicators measure different parts of the education and skills system. A diverse set of indicators allows policy makers to obtain a comprehensive picture of skills outcomes. Based on the information from the indicators, policy makers can assess the adequacy, effectiveness and efficiency of resources invested in education; the quality and equity of education opportunities and outcomes; and the effectiveness of education policy measures (OECD, 2018[2]).

Indicators are critical for strategies such as Latvia’s EDG, as they allow decision makers to steer education and skills policies based on the information provided by the indicator system. As shown in Figure 3.1, indicators can be broadly categorised into four groups within the following framework:

  1. Indicators of the inputs into the education and skills system.

  2. Indicators of participation and progression within educational institutions.

  3. Indicators of the outputs, outcomes and impacts of the education and skills system.

  4. Indicators of the contextual factors that influence education and skills policy.

While this section presents descriptions, considerations and limitations of potential indicators, Section 4 discusses in greater detail which indicators could be relevant for Latvia’s EDG, and in what way.

Input indicators provide information on the policy levers that shape participation, progression, outputs and outcomes at each education level. Policy levers here refer to the resources invested in education, whether financial, human (such as teachers and other school staff) or physical (such as buildings and infrastructure). Policy levers also include policy choices regarding the instructional setting of classrooms, pedagogical content and delivery of the curriculum. Indicators analyse the organisation of schools and education systems, including governance, autonomy and specific policies to regulate student participation in certain programmes. Table 3.1 provides an overview of key indicators with their descriptions, considerations and limitations. These represent the OECD’s list of the key indicators most commonly used in education and skills systems across the OECD. In selecting indicators for its EDG, Latvia should ensure that it consults the considerations for and limitations of each one.

Indicators of participation and progression within educational institutions assess the likelihood of students accessing, enrolling in and completing different levels of education, as well as the various pathways followed between types of programme and across education levels. Table 3.2 provides an overview of a selection of key indicators with their descriptions, considerations and limitations. These represent the OECD’s list of the key indicators most commonly used in education and skills systems across the OECD. In choosing indicators for its EDG, Latvia should ensure that it consults the considerations for and limitations of each one.

Indicators of the outputs, outcomes and impacts of the skills system analyse the characteristics of individuals exiting the system, such as their educational attainment. Outcome indicators examine the direct effect of the output of education systems, such as the employment and earning benefits of pursuing higher education. Impact indicators analyse the long-term indirect effect of the outcomes, such as knowledge and skills acquired, contributions to economic growth and societal well-being, and social cohesion and equity. Table 3.3 provides an overview of a selection of key indicators with their descriptions, considerations and limitations. These represent the OECD’s list of the key indicators most commonly used in education and skills systems across the OECD. In choosing indicators for its EDG, Latvia should ensure that it consults the considerations for and limitations of each one.

Policy levers typically have antecedents, which are external factors that define or constrain policy but that are not directly connected to the policy topic at hand. Demographic, socio-economic and political factors are all important characteristics to consider when interpreting indicators. The recent financial crisis, for example, had a significant impact on the level of public funds available for education. COVID-19 is likely to have a similar impact, given its secondary effects on economies.

The characteristics of the students themselves, such as their gender, age, socio-economic status or cultural background, are important contextual factors that influence the outcomes of education policy.

Analysis of the contextual factors, and of the interplay between them and the indicators on input, participation, progression, outputs, outcomes and impacts, contributes to understanding a variety of policy perspectives. These include the level of quality and equity of skills outcomes and education opportunities; the adequacy, effectiveness and efficiency of resources invested in education; and the relevance of skills policy measures to improve skills outcomes. Table 3.4 provides an overview of a selection of key indicators with their descriptions, considerations and limitations. These represent the OECD’s list of the key indicators most commonly used in education and skills systems across the OECD. In choosing indicators for its EDG, Latvia should ensure that it consults the considerations for and the limitations of each one.

Choosing the right indicators to be included in a strategic document such as the EDG is critical to the success of a strategy. In addition to ensuring the relevance of the indicators to the selected policy objectives and policy actions, it is important to ensure that they are quality indicators. The quality of an indicator is determined by the data on which it is based and by its characteristics. To facilitate the prioritisation process for selecting indicators, a quality framework can be used.

The OECD has developed a framework and guidelines for OECD statistical activities that provide useful guidance on the main dimensions of quality indicator data. Within this framework, quality itself is defined as “fitness for use” in terms of user needs. Table 3.5 lists the OECD’s seven dimensions of indicator data quality, with the addition of cost-efficiency. Although cost-efficiency itself is not a dimension of indicator data quality, it should be considered throughout the assessment of any indicator data source and indicator development.

The quality dimensions listed in Table 3.5 are relevant for assessing indicator data sources. However, some additional dimensions must be taken into account when specifically assessing the quality of indicators. One of the common frameworks used to assess the quality of indicators is the S.M.A.R.T. framework, which stands for Specific, Measurable, Attributable/Actionable, Relevant and Timely.

The last two characteristics (relevant and timely) are also listed in Table 3.5 and are common to assessing both the quality of the raw data and the indicators. The other three characteristics (specific, measurable and attributable/actionable) are particularly important for the development of indicators, and are described in more detail below:

  • Specific: All of the terms that make up an indicator must be carefully defined. Even seemingly clear concepts such as “schools” and “students” can be interpreted differently, which has an impact on data collection. For example, an indicator such as “share of higher education students enrolled in a mobility or exchange programme” needs to specify what is meant by “enrolled” (formal programmes only? is there a minimum programme duration?). It is also important to clarify what is meant by “mobility or exchange programmes” (degree mobility or credit mobility?). A good starting point for defining concepts could be to examine internationally agreed definitions and adapt them as necessary (see OECD (2017[6])). Moreover, the indicator should be specific in terms of the most appropriate level of disaggregation.

  • Measurable: The indicator should have the capacity to be counted, measured, analysed or tested.

  • Attributable/actionable: The indicator should allow targeted stakeholder groups to act on their results. This means that, when an indicator is designed or selected, it should be kept in mind how the relevant stakeholders might be able to act on it.

For the EDG, Latvia should ensure that all selected indicators are based on data that fulfil the quality criteria outlined above and that are also specific, measurable and attributable/actionable. Having such indicators will ensure that progress in implementing the policy actions and achieving the policy objectives can be sufficiently measured, monitored and evaluated.

Besides considering the quality aspect of an indicator, it is also necessary to consider other aspects of indicators that have a bearing on their fitness for use, including the possibility of disaggregation (e.g. by different subgroups), international comparability, level of analysis (e.g. student, school, municipality, national), whether they are a single or composite indicator, and whether they are based on quantitative or qualitative data.

Indicator disaggregation can provide important information on different subpopulations. The most appropriate disaggregation depends on the context, but some of the most common subpopulations that should be explored are gender, location (urban vs. rural), immigrant background and socio-economic background. Indicator disaggregation is the main channel through which policies and strategies can evaluate the issue of equity.

Disaggregation can also help hold stakeholders to greater account. If a given outcome is being measured at the school level, school principals may feel more engaged in the process and the community may find it easier to hold them accountable for the results.

The ability to disaggregate an indicator is closely linked to the data source. Administrative data tend to contain fewer disaggregation opportunities than sample surveys or assessments. Disaggregation of sample surveys, however, may run into issues of small sample size, representativeness of the subpopulation and reliability. Surveys and assessments may also be costly.

Although the international comparability of indicators is not essential, and is certainly not necessary or feasible for all indicators, it may be worthwhile to consider comparability for at least a subset of indicators. Being able to compare education systems across borders can bring new perspectives and aid in the identification of good practices. The added benefit of an international perspective may be worth minor adjustments to some indicators to facilitate comparability.

Not every national indicator lends itself to international comparison, and international comparability often comes at the cost of some precision relative to purely national indicators. Moreover, not every topic has well-defined, internationally accepted definitions of concepts. For example, it is very challenging to collect internationally comparable data on special needs education because of differing national definitions of “disability”. Yet this important area must still be monitored nationally.

In addition to international comparability, it may be relevant to assess indicators’ coherence with already established and approved indicators at the international level, such as the Sustainable Development Goals (SDG) and EU-level indicators and targets.

The level of aggregation of indicators must take into account both data availability and the policy relevance of measuring the indicator at that level. There are no particular advantages or disadvantages to a specific level of analysis, but it is important to ensure that the indicator is being measured and reported at the appropriate level. Many features of education systems have varying impacts at different levels of the system.

For example, at the level of students within a classroom, the relationship between student achievement and class size may be negative if students in small classes benefit from improved interactions with teachers. At the class or school level, however, weaker or disadvantaged students are often intentionally grouped and placed in smaller classes so that they receive more individual attention. At the school level, therefore, the observed relationship between class size and student achievement is often positive, suggesting that students in larger classes perform better than students in smaller classes. At higher levels of aggregation, the relationship between student achievement and class size is further confounded by the socio-economic intake of individual schools, or by factors relating to the learning culture in different regions.
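The reversal described above (a form of Simpson’s paradox) can be illustrated with a toy calculation; the schools, class sizes and achievement scores below are entirely hypothetical:

```python
# Hypothetical data: the disadvantaged school deliberately uses smaller
# classes, while the advantaged school has larger classes.
school_a = [(14, 62), (16, 60), (18, 58)]   # small classes, weaker intake
school_b = [(28, 82), (30, 80), (32, 78)]   # large classes, stronger intake

def covariance(pairs):
    """Covariance between class size and achievement score."""
    n = len(pairs)
    mean_size = sum(size for size, _ in pairs) / n
    mean_score = sum(score for _, score in pairs) / n
    return sum((size - mean_size) * (score - mean_score)
               for size, score in pairs) / n

# Within each school, larger classes go with lower scores (negative)...
print(covariance(school_a), covariance(school_b))
# ...but pooled across schools the relationship reverses (positive),
# because the stronger school also happens to have the larger classes.
print(covariance(school_a + school_b))
```

The same mechanism means that an indicator reported only at an aggregate level can suggest the opposite of what is happening within schools, which is why the reporting level must match the policy question.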

Comparisons across time are at the core of education monitoring exercises. However, ensuring education statistics are comparable over time is often a challenge. Changes in the coverage of the data collection or in the methodology adopted might compromise the interpretability of results (OECD, 2017[6]).

The following are some of the important steps to ensure that trend data are comparable and reliable:

  • Each data collection exercise should be accompanied by detailed metadata that describes the concepts, definitions and methods used. This will ensure that all future data collection exercises will follow the same methods and will allow for the detection of any changes.

  • Trend data should be revised and re-collected whenever there has been a change in coverage or methodology. It is advisable that trend data be revised yearly to ensure that any adjustments to previous data have been considered in the most current data collection.

  • If a change in methodology or coverage is detected and there is no possibility of recollecting/recalculating past data, there must be clear documentation of breaks in the time series to avoid comparisons between the two periods (before and after the change).

A composite indicator is formed when individual indicators are compiled into a single index, based on an underlying model of the multi-dimensional concept being measured. A composite indicator is meant to measure multi-dimensional concepts that cannot be captured by a single indicator. Ideally, a composite indicator should be based on a theoretical framework/definition that allows individual indicators/variables to be selected, combined and weighted in a manner that reflects the dimensions or structure of the phenomena being measured (OECD, 2008[7]).
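As an illustration only, the construction just described (selecting, normalising and weighting sub-indicators) might be sketched as follows; the sub-indicator names, bounds and weights are hypothetical, not taken from any actual index:

```python
# Hypothetical composite "learning environment index" built from three
# sub-indicators; all names, values and weights are illustrative only.
sub_indicators = {
    "qualified_teacher_share": 0.82,    # already a 0-1 share
    "avg_class_size": 24.0,             # lower is considered better here
    "ict_devices_per_student": 0.6,
}
# Min-max bounds used for normalisation (in practice, from the data).
bounds = {
    "qualified_teacher_share": (0.0, 1.0),
    "avg_class_size": (10.0, 35.0),
    "ict_devices_per_student": (0.0, 1.0),
}
weights = {
    "qualified_teacher_share": 0.5,
    "avg_class_size": 0.3,
    "ict_devices_per_student": 0.2,
}
invert = {"avg_class_size"}  # scales where lower values are better

def composite(values):
    index = 0.0
    for name, value in values.items():
        lo, hi = bounds[name]
        norm = (value - lo) / (hi - lo)   # min-max normalisation to 0-1
        if name in invert:
            norm = 1.0 - norm             # flip "lower is better" scales
        index += weights[name] * norm
    return index

print(round(composite(sub_indicators), 3))  # → 0.662
```

Note that the single figure hides which sub-indicator drove a change, which is exactly the “actionability” limitation discussed below.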

The main advantage of composite indicators is that they are able to summarise complex or multi-dimensional issues and provide an easier way to communicate with the general audience. Their communication power makes them particularly useful for advocacy purposes, giving policy makers one figure/target on which to focus.

However, composite indicators have some shortcomings. For example, they may invite stakeholders to draw simplistic conclusions, and they provide less “actionable” information, since a change in a composite indicator could have been caused by a change in any one of the sub-indicators, or by a combination of changes across sub-indicators. Composite indicators may even disguise failings and/or successes in some parts of the system. In more complex and overarching composite indicators, there is also a risk of ignoring dimensions of performance that are not measurable.

Although composite indicators may be useful for advocacy purposes, for the reasons above they have a limited use in monitoring a country’s education strategy or priorities.

Although most indicators used for monitoring purposes will be quantitative, qualitative data can provide useful information to help policy makers better understand and contextualise findings.

This is especially true when the goal is to monitor the existence or application of a policy. For example, one of the SDG 4 indicators tries to measure the “extent to which explicit formula-based policies reallocate education resources to disadvantaged populations”. Given the complexity of this topic there is an important question regarding how it can best be measured, i.e. using a quantitative index (see above on composite indicators) or a qualitative rating (from “no policy” to “fully developed mechanism”).

Some indicators may also be best served by a combination of quantitative and qualitative data. In the school-funding example presented above, it may be useful to assess both the existence of such policies and their main characteristics (e.g. what is the programme’s reach? How is the targeting done? Does the policy involve direct funding or resource provision?).

Another qualitative indicator in the SDG agenda measures the “extent to which (i) global citizenship education and (ii) education for sustainable development, including gender equality and human rights, are mainstreamed at all levels in: (a) national education policies (b) curricula (c) teacher education and (d) student assessments”. The current aim is for countries to monitor this indicator through self-reporting under UNESCO’s 1974 Recommendation concerning Education for International Understanding, Co-operation and Peace and Education relating to Human Rights and Fundamental Freedoms, which occurs every four years. However, there has been a push to require the inclusion of evidence (laws, regulations) to increase the reliability of this qualitative data collection.

Once a potential list of indicators has been selected based on the quality of the data and their fitness for use, indicators should be finalised based on their ability to measure progress in implementing the policy actions and achieving the policy objectives. For this purpose it can be helpful to use a logical framework (see Chapter 2) that identifies the link between all policy objectives and policy actions, and can be further extended to include the relevant indicators. In this way it is possible to see clearly whether all policy objectives and policy actions are covered by indicators. Each policy objective should have one to three indicators, typically output or outcome indicators. Each policy action should have at least one output indicator (Vági and Rimkute, 2018[8]). Impact-level indicators may also be used to regularly measure the wider impact of the EDG on the skills system and the broader context it aims to affect. Impact indicators are best measured through impact assessment during evaluation.
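The coverage check described above could, for instance, be run over a machine-readable logical framework; the objective, action and indicator labels below are placeholders, not actual EDG content:

```python
# Hypothetical logical-framework fragment: map each policy action to the
# indicators that cover it, then flag any actions left without coverage.
framework = {
    "Objective 1": {
        "Action 1.1": ["output: teachers completing CPD courses"],
        "Action 1.2": [],                       # gap: no indicator yet
    },
    "Objective 2": {
        "Action 2.1": ["output: new VET places created",
                       "outcome: VET graduate employment rate"],
    },
}

gaps = [(objective, action)
        for objective, actions in framework.items()
        for action, indicators in actions.items()
        if not indicators]
print(gaps)  # actions still needing an indicator
```

Listing the gaps explicitly makes it straightforward to budget for any new data collection those actions require.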

After selecting relevant indicators there may still be indicator gaps for certain policy actions. Some potential indicator gaps for each level of education are presented in Section 3. Although existing indicators could suffice for most policy actions, there may be cases where no existing relevant indicator can be used. In such cases, sufficient budget should be set aside to cover the cost of designing a methodology and/or collecting the necessary data to create such indicators.

Once the final list of indicators has been confirmed, indicators should be well documented. This can be done by developing indicator profiles, also referred to as an indicator passport or indicator technical notes, which provide detailed information about each indicator to ensure that they are robust and reliable (Vági and Rimkute, 2018[8]). The purpose is to clarify the definition, interpretation, scope and methodology for calculating each indicator. This fosters agreement among all involved actors about what is measured, how it is measured, and by whom. In general, the following information can be useful to include in indicator profiles:

  • Title of indicator.

  • Link to policy objectives and policy actions.

  • Brief definition of indicator.

  • Data source, collection method and collection frequency.

  • Name of institution(s) in charge of collecting the required data.

  • Methodology of calculation of indicator values (as a formula where necessary).

  • Indicator baseline, mid-term and final target values.

  • Anticipated difficulty of data collection and possible solutions.

  • Performance trend information for previous years.

Once the indicator profiles have been created, this information should be made publicly available to help increase the credibility and transparency of the EDG. The OECD/EU SIGMA initiative has produced a template of what this could look like in practice.1 The indicator profiles support the EDG by helping relevant actors and the general public to understand the indicators and the performance they are measuring. For monitoring and evaluation purposes, the indicator profiles clarify performance information and allow for an assessment of performance against specific targets.
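As a sketch of what a machine-readable indicator profile might look like (every field name and value below is hypothetical, not an actual EDG indicator or the SIGMA template):

```python
# Hypothetical indicator profile ("indicator passport") with a simple
# completeness check; all values are illustrative only.
REQUIRED_FIELDS = {
    "title", "linked_objective", "definition", "data_source",
    "collection_frequency", "responsible_institution", "calculation",
    "baseline", "midterm_target", "final_target",
}

profile = {
    "title": "Share of adults (25-64) participating in learning",
    "linked_objective": "Objective X: lifelong learning",  # placeholder
    "definition": "Participation in formal or non-formal learning "
                  "in the last 4 weeks",
    "data_source": "Labour Force Survey",
    "collection_frequency": "annual",
    "responsible_institution": "Central Statistical Bureau",
    "calculation": "participants / population aged 25-64 * 100",
    "baseline": 7.4,        # illustrative values, not real statistics
    "midterm_target": 9.0,
    "final_target": 12.0,
}

missing = REQUIRED_FIELDS - profile.keys()
assert not missing, f"profile incomplete: {missing}"
print("profile OK")
```

Validating each profile against a required-field list in this way helps ensure that no indicator enters the monitoring system without a definition, a data source and targets.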

As Latvia prioritises indicators for its EDG it is important to ensure that there are not too many indicators. If one indicator is sufficient to measure a particular policy action, then there is no need to have a second indicator for the same purpose. Indicators are costly to measure in terms of time and resources, and having too many can make it more difficult to report on them clearly. At the same time, it is important to have a sufficient number of indicators to measure progress towards achieving the EDG’s policy objectives.

Based on the lessons learned from Latvia’s previous EDG 2014-2020 and other international best practices, Latvia should consider improving and adapting its indicator system for the new EDG 2021-2027.

This section provides an overview of the indicator data sources that are already available for Latvia to use for its EDG. It also assesses Latvia’s current indicator system and highlights specific missing indicators that Latvia should consider for its EDG. Where available, relevant country examples have also been included.

The main data sources for indicators in Latvia are the State Education Information System (SEIS), the State Education Quality Service (SEQS), the State Examinations System (SES), international surveys, and a graduate tracking system. The responsible authority and coverage across levels of education are presented in Table 3.6.

The State Education Information System (SEIS), established in 2009, provides information on educational institutions, licensed and accredited educational programmes, students, teachers, education documents, and national statistics. The SEIS is composed of the Educational Institution Register, the Teacher Register, the Educational Programmes Register, the State Unified Database of Children of Mandatory Education Age, and the Academic Staff Register. The system provides users with comprehensive information about students, including children in early childhood education and care (ECEC), teaching staff, and the performance rating of teachers (OECD, 2016[9]). In addition to administrative data, the database system contains information from organisational self-assessment reports. Data on the accreditation and licensing of educational institutions are also available. As part of the educational quality monitoring and the SEIS improvement project, work is underway to improve the usability of this data.

The State Education Quality Service (SEQS) collects information on compulsory school age children who are out of school. The Office of Citizenship and Migration Affairs reports to the SEQS four times a year on data relating to children of compulsory school age (i.e. 5 to 18 years of age). The SEQS monitors school enrolment by comparing this data with the information in SEIS, which is provided by school principals. It is a municipal responsibility in Latvia to ensure that all school age children are attending school and, with the participation of relevant municipal services, to identify why compulsory age children are out of school. Based on this data, SEQS provides an annual review on the number of out-of-school children and the underlying reasons.

The State Examinations System (SES) is operated by the National Centre for Education (NCE) under the Ministry of Education and Science. The SES contains information about the state exams in general education programmes, including information about centralised exams in foreign languages which are substituted with international foreign language tests, such as the Test of English as a Foreign Language. The SES also includes information about state exams in professional education programmes. It contains data on individuals who need to take state exams, educational institutions where the state exams take place, teachers who supervise and evaluate the exams, the results of state exams, and certificates issued for general secondary and basic education. Data in the SES system are entered by educational institutions and the NCE. Data on individuals who need to take the state exams, and teachers engaged in this process, are fed into the SES from Latvia’s SEIS. Once the state exams are evaluated, the results are transferred to the SEIS and the state services portal Latvia.lv, through which individuals can apply for admission to tertiary education institutions in Latvia.

International surveys, such as the Labour Force Survey (LFS), provide annual information on adult learning. The LFS has several benefits, including a large sample size and regular implementation across the calendar year. Questions on adult learning in this survey cover the area of education, the purpose of education, the duration of educational activities, and whether the educational activities took place as part of paid employment. Although the LFS provides rich information, the analysis performed with these data is limited. Other survey data, such as the Adult Education Survey and the Continuing Vocational Training Survey, provide more detailed information on adult learning. However, these data are available only every five years. Latvia is participating in the second cycle of the OECD’s Survey of Adult Skills, a product of the Programme for the International Assessment of Adult Competencies (PIAAC), which measures the key cognitive and workplace skills of individuals (aged between 16 and 65 years) and is expected to publish results in 2023.

Latvia is introducing a graduate tracking system that covers vocational education and training (VET) and higher education. Under a European Structural Fund project called “Establishment of a system for monitoring education quality” there is ongoing work to develop a centralised VET graduate tracking system by the end of 2020. Until this system comes into effect, VET institutions will continue to collect graduate data via annual surveys conducted within three months of graduation. These surveys include information on whether graduates find employment in line with their specialisation, continue education within the same specialisation, continue education within a different specialisation, work in a sector different from the specialisation undertaken (without information about the specific sector), work abroad, etc. VET institutions submit these data to the Ministry of Education and Science.

The Register of Students and Graduates of Higher Education, introduced in 2017, tracks the employment of higher education degree holders aggregated by study programme and higher education institution. Information from the databases of the Central Statistical Bureau (CSB), the State Revenue Service and the State Employment Agency (SEA) feed into the register. It is planned that aggregated data from the Register of Students and Graduates will become publicly available with information on each cohort of graduates remaining available for a period of 10 years. When operational, the graduate registers for VET and higher education will contain individual-level data about graduates’ employment status; field of work and salary; education institution, study programme and degree-related information; and demographic characteristics. The register will be administered by the SEIS, with individual education institutions importing data on their graduates. The SEIS will share these data with the CSB, which will process and prepare statistical reports.

The missing indicators are presented below based on whether they relate to: 1) inputs; 2) participation and progression; 3) outputs, outcomes and impacts; and 4) contextual factors. An assessment of the missing indicators in Latvia, implications for Latvia’s EDG and relevant country examples are also presented. These specific missing indicators were emphasised during consultations that the OECD had in Latvia with government officials and stakeholders.

Funding is one of the most critical inputs into an education and skills system; however, the Latvian government currently lacks sufficiently detailed information on funding for lifelong learning by municipalities, employers and individuals to track this with relevant indicators. There is no centralised system for monitoring municipal expenditure on education and training beyond state transfers. This is partly due to municipalities’ reservations about sharing, and at times outright reluctance to share, detailed educational expenditure data at the school level. Firms, which typically record expenditure on in-house or external training for employees in their own accounting systems, are not required to report this particular expenditure to the State Revenue Service as a separate item. Many individuals report their education and training expenditure to the State Revenue Service in their annual tax returns to receive a personal income tax deduction; however, reported expenditure on education and training is currently conflated with other expenditure, such as health and childcare. Thus, the current accounting and tax reporting standards do not support the aggregation of skills expenditure data from municipalities, employers and individuals.

Detailed information on lifelong learning expenditure from various sources would enable Latvia to develop relevant indicators and identify how municipalities, employers and individuals are investing in lifelong learning, and whether there are any significant differences by socio-economic criteria. In cases where there is a significant difference, the national government may step in and target funding to the municipalities, employers or individuals at a socio-economic disadvantage.

If Latvia wants to develop detailed indicators on lifelong learning expenditure for the EDG it could consider introducing some changes that would make it easier to collect expenditure data from municipalities, employers and individuals. For municipalities, Latvia could consider legal changes that make it mandatory for municipalities to make this expenditure information available. A similar requirement was introduced in Chile, which made data collection on expenditure easier and promoted transparency (Box 3.1). Similarly, firms and individuals could be requested to report their education and training expenditure to the State Revenue Service as a separate item. This would give Latvia a comprehensive view of lifelong learning expenditure and, based on that, help it identify where there may be greater funding gaps and make strategic funding decisions accordingly.

While a wide range of indicators on participation and progression exist, there are some missing indicators that Latvia could address for its EDG.

First, in terms of participation in general education, one important concern in Latvia is how to measure out-of-school children and interpret the data collected on them. Currently, data on children of compulsory education age who are not registered in any educational institution are collected by the SEQS as an indicator and published every year. In the previous 2014-2020 EDG, the baseline value for out-of-school children was 5.4%, while the aspirational mid-term value for 2017 was 4.4% and the aspirational final value for 2020 was 3%. However, during the mid-term evaluations of the 2014-2020 EDG the actual values for this indicator were 6.6% for 2017 and 6% for 2018. Thus, instead of the out-of-school children rate going down, which was the aim of the previous EDG, it went up. The interpretation has been that the rate increased due to families emigrating with their children and the current data system not being able to differentiate between children who are out of school due to emigration or due to drop-out.

If Latvia wants to continue to monitor this indicator in the new EDG it will therefore be important to clearly identify why children are out of school. The SEQS updates information in the SEIS about children not registered at any educational institution in Latvia four times a year. Municipalities are required to report in the SEIS the reasons why children are not enrolled in school. According to these data, reasons were identified for 93% of out-of-school children, thereby enabling differentiation between children out of school due to drop-out or due to emigration. However, as there remain around 7% of out-of-school children for whom no data are available on their reasons for leaving school, the SEQS has asked municipalities to increase their efforts in identifying these reasons, which would help to improve the indicator currently used for out-of-school children in Latvia.

As a complementary measure, Latvia could consider creating an academic index instrument that combines data on academic factors impacting school success, such as absence, discipline and assessment scores, to create a new indicator. Such an academic index could allow Latvia to identify at an early stage which students are at risk of dropping out. In Maryland, United States, such an academic index has been developed and used. The index puts students into high-risk, medium-risk and low-risk groups based on their absence, discipline and assessment score patterns (see Box 3.2). Having such a tool would allow Latvia to introduce targeted support measures for at-risk students early on and prevent drop-out. The information on students at risk of dropping out could also be used in combination with the data on out-of-school children to cross-reference and check whether any of the out-of-school children had previously been at risk of dropping out. In such cases, these out-of-school children are more likely to have left school due to drop-out rather than for other reasons, such as emigration.
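To illustrate the idea, the sketch below shows how such an index might combine the three factors into a single risk tier. The thresholds, weights and field names are illustrative assumptions for this chapter, not Maryland’s actual formula.

```python
# Hypothetical early-warning academic index: combine absence rate,
# discipline incidents and assessment scores into a risk tier.
# All thresholds and weights below are illustrative assumptions.

def risk_score(absence_rate, discipline_incidents, assessment_score):
    """Return a 0-100 risk score; higher means higher drop-out risk."""
    absence_component = min(absence_rate / 0.20, 1.0)          # cap at 20% absence
    discipline_component = min(discipline_incidents / 5, 1.0)  # cap at 5 incidents
    achievement_component = 1.0 - min(assessment_score / 100, 1.0)
    # Equal weighting of the three factors (an assumption for illustration)
    return round(100 * (absence_component + discipline_component + achievement_component) / 3, 1)

def risk_tier(score):
    """Map a risk score to the high/medium/low risk groups used in the text."""
    if score >= 60:
        return "high risk"
    if score >= 30:
        return "medium risk"
    return "low risk"

student = {"absence_rate": 0.15, "discipline_incidents": 2, "assessment_score": 45}
score = risk_score(**student)
print(score, risk_tier(score))  # → 56.7 medium risk
```

In practice the weights and cut-offs would need to be calibrated against historical drop-out data, but even a simple composite like this can flag students for early, targeted support.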

Second, in terms of progression, further improvements could be made to monitor students’ progression throughout the education system. Monitoring of student progression could be enhanced by using a number of different data sources. Besides basic student background information and student reports, Latvia could also consider using national assessments to track student progression. Currently, national assessments play an important role at the end of secondary education when students take the upper secondary school graduation exams. The assessment results are then used by the National Centre for Education to analyse the distribution of students and their results by gender, school location and type of school. However, beyond the assessment results at the end of compulsory education, the various assessment results accumulated over a student’s entire education trajectory could be useful as an indicator for monitoring that student’s progression. This would provide complementary information that would allow for early detection of potential learning difficulties and targeted support for those in need (OECD, 2016[9]).

For its EDG, Latvia could consider using student assessment data for an indicator to track students throughout their entire education trajectory, as has been done in Maryland, United States, where students’ statewide assessment data have been used to track their progression. Maryland has also used a tool called SchoolNet to cultivate a data-driven culture and to encourage teachers and principals to use data to improve their practices. This tool helps teachers create, share and use formative assessments to track students’ progress on a continuous basis. The results of these formative assessments are used by teachers to learn from one another, to improve their teaching, to identify students in need and to provide those students with targeted support (Box 3.2). Formative assessments, which are used for monitoring student learning on an ongoing basis and provide more immediate feedback, complement summative assessments such as national student assessments, which typically take place at the end of the school year and are used for determining whether a student progresses to the next grade. For the EDG, Latvia could consider both using the national student assessment results (summative assessments) and encouraging regular assessments (formative assessments) for an indicator to track and support students’ progression through the education system.

Availability of indicators on the outputs, outcomes and impacts of the education system varies across levels of education. For example, at the ECEC level, indicators on outcomes could be developed to measure the quality of ECEC. The previous EDG 2014-2020 had only one ECEC indicator, which was on participation and was defined as the “share of children between the age of four and the age for starting compulsory primary education participating in pre-school education”. The baseline value for this indicator was 92.7% in 2011 and the target value was 95% by 2020. Based on the mid-term evaluations, the target was already achieved in 2016 (95.5%) and even exceeded in 2017 (96.3%) (Latvian Ministry of Education and Science, 2019[12]). This is encouraging progress. However, there remains uncertainty about the quality of ECEC provision, which is essential to ensure that the benefits of ECEC participation materialise (OECD, 2011[13]). Currently, there are no indicators that track the quality of ECEC. This is due to a lack of a national assessment instrument and external evaluations to monitor child development (OECD, 2019[14]).

For the new EDG 2021-2027, Latvia could consider developing an ECEC quality indicator based on a national assessment instrument. A national agency such as the SEQS could be given responsibility for regularly evaluating the quality of ECEC institutions. This would support efforts to maintain quality standards in ECEC institutions once they have received their license for operation (OECD, 2019[14]). An example of such an assessment instrument is the Early Development Index used in Ontario, Canada to measure the quality of ECEC (Box 3.3). Such an instrument enables the assessment of how well children develop relative to other children based on their physical health, well-being, social competence, emotional maturity, language and cognitive skills, communication skills, and general knowledge. Information like this could allow Latvia to identify which ECEC institutions would need additional support if a disproportionally large number of children in these institutions are not doing well relative to other children. Efforts like these would help Latvia to promote equity and give all children the opportunity for a quality education early on.

Information on contextual factors is available to some extent across levels of education in Latvia. For example, data on ECEC students from the CSB allow for the identification of the number of preschools and children enrolled in preschools in the nine largest cities and in five regions covering rural areas and smaller towns. Information is also provided on the age of preschool children and the language of instruction in preschools.

In general education, national assessments provide information on the average achievement and the degree of achievement variability by subject and gender of students, school location by level of urbanisation, and the type of school such as state secondary and mainstream. In higher education, data are available on the prior education of students, gender, full-time or part-time enrolment, place of residence, and age.

While this is useful contextual information, the challenge is that the contextual indicators used at each education level are not always defined and applied consistently. Consistent definitions and methodologies for measuring contextual information would make it possible to analyse how education access and outcomes may vary across the education trajectory of students.

Specific contextual information would allow Latvia to develop indicators to monitor equity issues for specific groups, including students with culturally diverse backgrounds and students with special needs.

For students with culturally diverse backgrounds, language spoken at home is a common indicator used across OECD countries to identify students in need of additional instructional support (Schleicher, 2019[15]). Information about home language would allow Latvia to identify students who may need additional support to cope with an education provided in a language different from their home language. As Latvia seeks to transition gradually to a position where education in all general subjects at the upper secondary level is taught in Latvian by 2022/23, students whose home language is not Latvian may face challenges (OECD, 2019[14]). This affects children from ethnically diverse families, many of whom are attending ethnic minority language schools at the primary to lower secondary level. However, it may also include returning Latvian families from the diaspora who may speak another language at home due to their prolonged stay abroad, or because one of the parents has a non-Latvian background. There is currently no indicator to identify students whose home language is not Latvian, which makes it difficult to monitor their progression.

Moving forward with the EDG 2021-2027, Latvia may consider collecting information on the language spoken by students at home. In the Flemish Community (Belgium), the language spoken at home is used as one of the indicators to determine the socio-economic disadvantages of students. Information from this indicator is used to guide compensatory policies that target additional grants and allocate teaching staff to schools with a relatively larger share of those students (Box 3.4).

More contextual information could also be collected on students with special needs. These students can be divided into two groups: those who require additional instructional support, such as from speech therapists, psychologists and social pedagogues; and those who have been diagnosed with mental and physical disabilities. In Latvia, students with special needs from both groups can attend special schools (which specialise in certain types of disabilities), special classes in a mainstream school or mainstream classes (OECD, 2016[9]). Although data on students with additional instructional needs have been available, one of the challenges in the previous EDG was the accessibility of data to track the number of students with disabilities. Due to regulatory barriers it was not possible to collect information in the SEIS on students with disabilities who are attending general education, vocational education and higher education institutions (Latvian Ministry of Education and Science, 2019[12]).

If Latvia wants to monitor more closely students with special needs in the new EDG, regulatory changes are needed to ensure that parents cannot hide their children’s diagnosis and to authorise the SEIS to collect data on students with disabilities. The doctor’s diagnosis should be directly transmitted to schools and the SEIS. This would also require authorisation of the National Health and Work Capacity Review Medical Board (Latvian Ministry of Education and Science, 2019[12]). The ability to track students with disabilities in general education, vocational education and higher education would allow Latvia to target specific support measures for them, as well as monitor their progress and their outcomes (e.g. well-being). Latvia could consider the case in the United States, where legal measures have made the collection of data on students with disabilities mandatory for the Department of Education, and annual reports on students with disabilities are made available (Box 3.4).

This section describes the process through which indicators were identified for Latvia’s EDG and presents potential indicators for inclusion in the EDG. The discussions about which indicators to use were informed by the steps for selecting indicators presented in Section 2, and the available indicator data sources and gaps presented in Section 3.

A Strategy Development Workshop to discuss potential indicators was held in Riga in February 2020. The workshop was held over two days and convened indicator experts from various Latvian ministries and government agencies on the first day, as well as stakeholders from schools, municipalities, business, academia, and civil society on the second day. The aim of the workshop was to identify together a set of indicators that could be relevant for Latvia’s EDG.

Discussions were held in five working groups covering five levels of education: ECEC, general education, VET, higher education and adult learning. The division of working groups by level of education allowed for a more technical discussion of indicators specific to these levels, and helped to avoid repetition and redundancies. Furthermore, participants were often experts in a specific level of education and so the discussion was able to benefit from their education-level specific expertise.

In preparation for the workshop, the OECD examined extensively the data sources for indicators available to the Latvian government from national and international sources and compiled a list of 181 possible indicators (Box 3.5). This list was presented on the first day to the indicator experts, who reviewed the indicators and identified those they thought most relevant for Latvia’s EDG. Participants were also encouraged to propose new indicators, where necessary. The OECD facilitated discussions and asked participants to use the SMART quality framework (see Section 2) when considering any potential indicator. Participants also had to identify which of the four policy objectives of the EDG the indicator would link to. The four policy objectives were: 1) teaching and academic excellence; 2) accessible and quality education for everyone; 3) future skills for future society; and 4) sustainable education systems and effective resource management. At the end of the day, each of the five working groups prioritised and discussed in-depth between 10 and 12 potential indicators for each level of education, ending up with a total of 54 indicators.

The results of the first day were presented on the second day to a larger group of stakeholders to collect feedback on the extent to which they agreed with the potential indicators, and whether they had any concerns or would suggest any modifications or new indicators. Based on their feedback, the potential list of indicators was further revised. Many of the potential indicators were adopted in Latvia’s EDG.

Potential benchmark values for the indicators were also briefly discussed during the workshop. However, due to time limitations it was not possible to cover these extensively, and so specific benchmark numbers are not featured in this chapter. However, Section 5 makes some practical suggestions for setting benchmarks for indicators.

The list of indicators discussed by workshop participants is presented by level of education below. A summary of the discussion with participants and additional guidance from the OECD is provided for each indicator. The indicators presented in the tables should not be taken as a final list, but rather as a work in progress, and in most cases in need of further refinement, as shown in the participants’ comments and OECD reflections. For some indicators, participants or the OECD have made suggestions for revision or for using alternative indicators, which Latvia may wish to consider. The tables only provide a high-level summary of the discussed indicators. Readers wanting more detail should consult the OECD Strategy Development Workshop Summary Note (OECD, 2020[21]).

For each indicator the source (Box 3.5) is indicated with: (P) previous EDG 2014-2020; (M) monitoring project; (I) International; or (N) New. All indicators are also mapped to one or more of the four policy objectives: 1) teaching and academic excellence; 2) accessible and quality education for everyone; 3) future skills for future society; and 4) sustainable education systems and effective resource management.

This section presents five actions that Latvia could take to improve the indicator system for its EDG: 1) link indicator databases; 2) improve the quality of indicator data; 3) benchmark indicators; 4) raise capacity to make use of indicator data; and 5) improve dissemination of indicator data. Each of these opportunities is discussed with relevant information on Latvia, practical suggestions of what could be done, relevant country examples and specific recommendations.

The indicator data system should be strengthened by linking various databases. The relevant information for an indicator system is often dispersed across various databases without direct links. This applies, for example, to databases from different ministries and institutions, such as the State Education Information System (SEIS), the Unemployment Accounting and Registered Vacancy Information System, and the databases of specific EU funded projects (e.g. SO 8.4.1 "Improvement of professional competence of employed persons”). The reason for not being able to link them is that the databases have been set up for different purposes and are administered and overseen by different ministries or institutions. By linking these databases it would be possible to get a comprehensive picture of lifelong learning in Latvia.

Without such a link there will continue to be significant challenges when trying to request access to specific data from another database. Considerable administrative effort is required both from those requesting and those providing data. While this can sometimes amount to a simple additional enquiry, it often involves an extensive data request process. For example, there is a lengthy administrative process for employees at the Ministry of Education and Science to receive any data from the SEIS that is not provided through their standard template. Data requests between different levels of government can stall because of the perceived administrative burden. The Ministry of Education and Science cannot require municipalities to engage in data collection on certain aspects of education which are municipal responsibilities and funded from municipal resources. Municipalities can decline a data request if they argue that it imposes an additional administrative burden. Under the current educational data governance system, even if this municipal-level data is a relevant indicator to measure educational development, access to this data is not guaranteed. The issue of data access, especially involving individual-level data, needs to be viewed in light of the implementation of the General Data Protection Regulation (EU) 2016/679 (Latvijas Vēstnesis, 2018[22]). There may even be additional administrative issues to overcome regarding data access due to an incomplete understanding in public administration and the wider public of how this regulation should be applied, particularly in cases of research into education and social sciences for an academic or public policy purpose. This underscores the importance of linking databases, which would considerably facilitate the monitoring and evaluation of the new EDG.

Some efforts to link databases are already underway. For example, the SEIS is currently being updated by strengthening its system alignment with other information systems and intensifying data exchange, for example with the State Revenue Service, to improve the tracking of students. There were also linking efforts at the higher education level during the previous EDG with the creation of a single higher education information system that gathers data from registers of academic and scientific staff, student diplomas and accreditation needs.

The further linking of databases in Latvia would be useful. Databases such as the SEIS, the Unemployment Accounting and Registered Vacancy Information System, and the information system of EU funded projects (e.g. SO 8.4.1) should be linked using data matching techniques or unique identifiers at all stages of lifelong learning.

A common approach when linking databases has been the introduction of a unique identification (UID) system that uses a unique ID for each individual and allows their data to be linked across various databases. The unique ID could be based on birth registers, biometrics, or other forms of identification (e.g. chip-based ID card with photograph). Such a UID system allows policy makers to track students’ progression throughout and beyond the skills system, making tracer studies possible and providing insights into the policy outcomes. This approach provides information on the relationship between the different actors and services, as well as offering efficiency gains and simplifying administrative management.
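Conceptually, what a UID enables is a join of per-individual records held in separate systems. The sketch below illustrates this with in-memory dictionaries; the database names, ID format and fields are hypothetical, not Latvia’s actual schemas.

```python
# Illustrative sketch of linking records across separate databases via a
# unique identifier. "LV-001"-style IDs, database contents and field names
# are invented for illustration only.

education_db = {
    "LV-001": {"programme": "VET construction", "graduated": 2019},
    "LV-002": {"programme": "Higher education IT", "graduated": 2020},
}
employment_db = {
    "LV-001": {"employed": True, "sector": "construction"},
    "LV-002": {"employed": True, "sector": "ICT"},
}

def link_records(uid):
    """Join an individual's education and employment records on the unique ID."""
    record = {"uid": uid}
    record.update(education_db.get(uid, {}))   # empty dict if the UID is absent
    record.update(employment_db.get(uid, {}))
    return record

print(link_records("LV-001"))
# {'uid': 'LV-001', 'programme': 'VET construction', 'graduated': 2019,
#  'employed': True, 'sector': 'construction'}
```

In a production system the same join would run across secure databases through application programming interfaces rather than in-memory dictionaries, but the principle is identical: a shared key makes tracer studies and cross-system indicators possible without duplicating data.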

Enabling factors to support the linking of databases through a unique identification system include:

  • Legal and regulatory framework: This involves enabling the implementation of a UID system, determining the types of information that can be tracked and the uses of the UID system, ensuring privacy and data rights, and guarding against data abuse, discrimination and surveillance.

  • Assessment of existing UID systems: This involves reviewing whether other existing UID systems (e.g. birth registers) could be expanded to link with education and leveraging existing technological and infrastructure capacities.

  • Technological capabilities and compatibility: This involves saving data in a secure database; using electronic, digital or biometric data; and using application programming interfaces to link various databases.

  • Finances: This involves supporting the identification system infrastructure and streamlining the process.

  • Data protection: This involves ensuring that the responsible use of data is safeguarded without infringing on individual rights for privacy.

If Latvia wants to link various databases for its EDG it should consider developing and introducing a UID system, which has been implemented successfully in various countries across the world. For example, in Florida, United States, a unique student ID allows the linking of school data from kindergarten to high school in the K12 data system, the Florida college system and the workforce development information system. This makes it possible to track students as they progress through the education system and transition into the workforce, and allows policy makers to evaluate the effectiveness of various initiatives and adapt their approach accordingly. The system increases accountability and simplifies reporting across the whole system. Similarly, in Estonia, the Estonian Education Information System uses a civil registration system combined with a digital system that issues a chip-based ID card. This applies to every citizen, including students. The system collects comprehensive academic data including grades and assessment scores, but also provides an overview of the teaching plan for individual lessons and homework assignments. It also incorporates information on individual teachers and provides detailed attendance records. Parents have access to their child’s records and can be notified by text message if their child misses a class. All aspects of a student’s academic life are captured through the system, and students are tracked throughout their academic career (Box 3.6). Introducing such a unique identification system in Latvia would require the resources to build the infrastructure and set up the system. However, this may be a worthwhile investment to improve the indicator system.

Facilitate data exchanges between indicator databases through a unique identification number for each individual that allows data on this individual to be linked across various databases. Such a unique identification number system allows policy makers to track students’ progression throughout and beyond the education and skills system, making tracer studies possible and providing insights into EDG-relevant policy outcomes. The unique ID could be based on birth registers, biometrics, or other forms of identification (e.g. chip-based ID card with photograph). Consideration should be given to linking Latvia’s various administrative databases where information relevant to education and skills policy can be found. These include the State Education Information System, the Unemployment Accounting and Registered Vacancy Information System, and databases of EU funded projects (e.g. SO 8.4.1 “Improvement of professional competence of employed persons”). Such efforts might require a legal and regulatory framework that makes the implementation of a unique identification number system possible, determines the types of information that can be tracked, ensures privacy and data rights, and guards against data abuse. There are also requirements for technological capabilities and compatibilities so that personal data can be saved in a secure database and application programming interfaces are available to link the various databases. The financial cost of setting up the infrastructure and streamlining the process must also be considered. While there are substantial initial resource requirements to introduce such a system, it could be a long-term investment for Latvia and support the implementation of the EDG with comprehensive information.

One of the key elements of a robust indicator system is high-quality data for the indicators to ensure that policy makers can make informed decisions. The quality is determined by how the data are collected, saved, produced and used. Data should be accurate, secure and timely (Husein, 2017[11]), and data gaps should be identified and addressed.

There are currently some concerns about data quality in Latvia. For example, the exact number of school leaders, teachers and other educators (e.g. teachers’ assistants, speech therapists, psychologists, methodologists) is unknown and there is limited information about them (E-Klase, 2015[23]; OECD, 2017[24]). Latvia’s SEIS collects, generates and stores information on education institutions, programmes and staff, from ECEC to higher education, but there are concerns regarding the accuracy and reliability of the data (OECD, 2016[9]). On occasion, data drawn from the system have been found to be outdated, conflicting with other data sources or simply flawed. It is difficult to determine the number of teachers because the same teacher can work in several schools, and the reported data only record information about a teacher’s workload. Part of the reason for this situation may be a lack of clarity in definitions and the scope of data collections.

One of the underlying issues that affects the quality of indicators is the lack of a shared understanding of what should be measured and how it should be measured. This issue is less apparent for compulsory education, where data reporting is mandatory and there are clearly defined data categories. However, where data reporting is not a mandatory requirement, data quality often becomes problematic. This problem is most acute in adult learning. Most data used nationally are acquired through surveys, and there is no system-wide approach for regular reporting on adult learning and no nationally applied definition of what should be viewed as adult learning for data reporting. Given that adult learning can occur in formal education, non-formal education and informal learning, data reporting on adult learning is complex, and it is not clear what type of adult learning the relevant institutions should report on. Schools offering adult education programmes provide some data, as do municipalities; however, there is no agreement on how to collect and aggregate these data. Thus, the SEIS currently does not provide data on participation in non-formal adult education. A more comprehensive view of adult learning is usually obtained through specific studies commissioned at the national level, but these are only periodic.

Data quality issues are important not only for monitoring national policies, but also for international monitoring purposes, such as in the context of the EU or the SDG 4 agenda. The quality of data needs to be in line with international quality standards to allow for meaningful benchmarking and promote peer learning.

As the policy context for education changes continuously (see Chapter 2), the indicator system for monitoring the education and skills system also needs to be adapted over time so that it does not become outdated or irrelevant. An adaptable indicator system should periodically review the available data and identify whether they are still relevant for emerging data requirements. This may require the data to be aggregated or disaggregated in new ways and for additional functionalities (e.g. new reported data) or new categories (e.g. particular groups of students) to be added (Husein, 2017[11]).

Factors that can help improve the quality of data include:

  • Clear concepts and definitions: These inform a commonly applied methodology for data collection, as well as the use of statistical techniques and interpretation of data results.

  • Validation process: This establishes feedback loops into the data management process (whether paper based or digital) to improve data quality.

  • Integrity: This promotes the transparency of the data management process and ethical standards.

  • Resources: These provide sufficient financial, technological, institutional and human resources to the statistical agency responsible for the data collection and interpretation to carry out its task.

International good practice for improving data quality is to adopt digital technologies that improve the consistency, reliability and timeliness of data being collected, managed and used. The advantage of using such digital technologies is that they simplify the data collection process, allow for various verification feedback loops, and can be easily adapted as the policy context changes and new data needs arise. Digital technologies can be applied in various parts of the indicator system and cover data collection software, school information systems, database management systems and data analytics applications (Table 3.12). When digital solutions are adopted at the national level and implemented across levels of government, as well as by stakeholders, they can strengthen the overall governance of the indicator system.

To address indicator data quality challenges, Latvia is developing a monitoring system for educational quality assessment, with an education monitoring project expected to be completed by the end of 2023. One of the deliverables of this project is the development of clear definitions of key terms such as “educational quality” and explanations of how these will be measured with benchmarks for 2024 and 2027. These measures and benchmarks will also be included in the EDG 2021-2027.

With the development of the monitoring system, Latvia should also consider improving the quality of its indicator system by strengthening validation processes to ensure quality. For example, in North Carolina (United States), the Department of Instruction validates data and conducts annual data auditing of the data collection system (Box 3.7). Latvia could also use various digital technologies in parts of the indicator system to support the regular monitoring process and ensure higher levels of accuracy, reliability and timeliness of data.

Improve data validation processes by conducting regular quality checks of the data collection system and adopting digital technologies. Regular data collection quality checks based on transparent and clear standards ensure that consistent concepts, definitions and methodologies are applied in the data collection so that data from various sources are compatible and can be aggregated. The quality checks can provide a regular feedback loop between data collection and data management processes so that any discrepancies and inconsistencies are quickly identified and addressed. The statistical agency responsible for data collection and management should be provided with sufficient financial and human resources to carry out these tasks. Latvia should explore adopting various digital technologies to improve the consistency, reliability and timeliness of data being collected, managed and used. Such technologies simplify the data collection process, facilitate the validation feedback loops, and can be easily adapted as the policy context changes and new data needs arise. Digital technologies can be applied in various parts of the indicator system and cover data collection software, school information systems, database management systems and data analytics applications. By improving the quality of indicator data, measuring progress in the implementation of the EDG becomes more reliable, and EDG-related policy decisions are enhanced.

Once the indicators have been selected it is important to set the relevant benchmarks against which performance in the implementation of the policy actions and the achievement of the policy objectives can be monitored and evaluated. Benchmarks help hold all relevant actors accountable and provide a clear goal to strive for by quantifying what is expected.

Latvia’s previous EDG 2014-2020 included baseline values, mid-term benchmarks and final year benchmarks. The mid-term evaluation of the previous EDG 2014-2020 was released in 2019. Comparing the evaluated mid-term values with the target values of the mid-term benchmarks reveals Latvia’s performance so far (Figure 3.2). Among the 99 indicators, the target was exceeded for 43 indicators, met for 17, and not met for 25. For the remaining 14 indicators, no suitable data were available.

In cases where the mid-term target value has been exceeded, this could mean that the target was set too low or that more resources than necessary were invested. For example, for the indicator “educational institutions use the eTwinning platform for co-operation with other European schools as a percentage of the total number of educational institutions”, the target mid-term value was 16%, while the evaluated mid-term value was 67%. When the mid-term value is far below the target, this could mean that the target was too ambitious and/or that insufficient resources and efforts were invested, or that other external circumstances made it difficult to achieve. For example, for the indicator “proportion of teachers involved in professional development activities as a percentage of the total number of teachers”, the target mid-term value was 50%, while the evaluated mid-term value was 30%. Many of the indicators that were on target were framed as yes/no options, particularly those related to certain regulatory changes (e.g. state examinations) or the establishment of institutions (e.g. National Agency for Higher Education Quality Evaluation). For the indicators that lacked suitable data for the mid-term evaluation, the main reasons included a lack of updated data, no available data source and regulatory barriers (e.g. inhibiting the collection of information on students with disabilities).

There were also a number of indicators with benchmark values that were difficult to understand and interpret. For example, the indicator “the number of teachers working with adults who have received support to learn Latvian as a second and foreign language” had a baseline value of 116 (2013), a mid-term value of 70 and a target value of 90. The evaluated mid-term value was 104. Similarly, for the indicator “public expenditure on education in the year as a percentage of GDP”, the baseline value was 4.9% (2013), the mid-term value 3.7% and the target value 5%. The evaluated mid-term value was 5.3%. For both indicators, the mid-term value dips below the baseline before the target value rises above it again. Without further context, it is difficult to understand why these values were set as they were and how to interpret the mid-term value.

For selecting benchmark values for the new EDG 2021-2027, Latvia should consider using the SMART framework introduced in Section 2, which applies to benchmarks as well as indicators. A quality benchmark is specific, measurable, achievable, realistic and timely. In particular, if Latvia wants to reuse any of the 14 indicators for which no suitable data source was available in the mid-term evaluation, a careful review should first determine whether a new data source can be found.

Moreover, Latvia should consider the following aspects when setting education benchmarks:

Level or rate of progress

  • Benchmarks can either correspond to a specific level (e.g. a 90% secondary education completion rate) or to a rate of progress (e.g. doubling the completion rate in secondary education). This choice may be conceptual in nature (e.g. the principle that all children should be in school). However, there is also the more practical issue that different stakeholders begin from different starting points.

Differentiation by target group

  • Differentiating benchmarks by target group is important for countries to assess performance towards achieving equity objectives. If different subpopulations begin from very different starting points, is it relevant or reasonable to set the same target for everyone? Using the same target for everyone can make it unrealistic for some and not sufficiently ambitious for others. At the same time, it may be politically desirable to hold similar expectations for everyone, regardless of their background. This question is relevant for both level and progress indicators, as advantaged groups are more likely to reach level targets, but less likely to reach progress targets.

  • A compromise may be to set a final overall target for the whole population, but with different contributions from each subpopulation. For example, the country may set a national target, but expect that advantaged and disadvantaged regions contribute differently to this national target, according to their means and starting points.

Quantitative or qualitative measures

  • Quantitative benchmarks can often be seen as “dry” and almost antithetical to the concepts of holistic education and well-rounded students. It is important, therefore, to collect additional qualitative and descriptive information that can help obtain a fuller picture.

The process of setting benchmarks requires finding a balance between the desirable and the feasible. If the benchmarks are unbalanced, with a significant share of indicators ending up with values significantly exceeding or falling below the target, this could mean that resources are not effectively allocated. For example, some portion of the resources used to achieve exceeded targets might have been better spent achieving missed targets. At the same time, since an exceeded target may also mean that the target was set too low in the first place, it should be carefully reviewed and adjusted as needed.

There are a number of criteria that can be considered when setting benchmarks (Table 3.13). These include the extent to which the benchmark is a priority for the government, the peer average for the indicator, the available resources for achieving the benchmark, relevant international performance standards and past trends.

Once the criteria for selecting the benchmark have been decided, it is necessary to identify the base value. Ideally, the current value would be available; however, where no data exist because the indicator is new, it may be necessary to collect new data and calculate the base value from them. If the indicator is a yes/no measure, such as whether a skills council has been established, the base value would be 0.

In order to set the target for the indicators, Latvia should consider the five criteria presented in Table 3.13. An indicator target should be ambitious enough to be inspirational and mobilise action, but not so unrealistic as to demotivate actors. Trend data, when available, are useful for estimating a realistic target. Setting a mid-term target value involves identifying the base and target values and calculating the mid-term value between the two. If historical trend data show that the rate of improvement has not been linear, the mid-term value can be adjusted accordingly. In some cases, perhaps based on available resources or for political reasons, the mid-term value could be set more ambitiously or more cautiously.
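As an illustration of the calculation described above, the following sketch linearly interpolates a mid-term benchmark between a base value and a final target. The figures are hypothetical and are not actual EDG benchmarks:

```python
def midterm_value(base, target, base_year, target_year, mid_year):
    """Linearly interpolate a benchmark value for an intermediate year."""
    fraction = (mid_year - base_year) / (target_year - base_year)
    return base + fraction * (target - base)

# Hypothetical example: a completion rate of 82% in 2021,
# a final target of 90% in 2027, and a mid-term review in 2024.
print(midterm_value(82.0, 90.0, 2021, 2027, 2024))  # 86.0
```

If past progress has been non-linear, the interpolated value can then be adjusted up or down, as discussed above.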

It is also important to determine the frequency of the benchmarks. The indicators for the EDG should usually have a mid-term value and a final year target value. This applies in particular to the outcome indicators, which can then be used during monitoring and evaluation to determine whether the EDG has achieved the overall policy objectives. However, since these two values are a number of years apart, and the results of the mid-term evaluation are usually only available towards the end of the EDG (i.e. mid-term evaluation results of the EDG 2014-2020 were available in 2019), it may be useful to also have annual targets for important indicators. This would provide more frequent feedback and enable the implementation of policy actions to be adapted before it is too late to correct. In uncertain times, with COVID-19 making it difficult to predict changes in the policy context for the foreseeable future, more frequent feedback may be needed to facilitate making adjustments to policy actions in the EDG. At the same time, more frequent data collection is labour intensive and comes at a cost. The potential benefits and costs should thus be carefully weighed before making a decision about the frequency of targets.

Set the target value to be sufficiently ambitious to inspire and mobilise action, but not so unrealistic as to demotivate actors. Target values should be chosen based on criteria such as government priorities, peer average, available resources, international performance standards and past trends. If any indicators from the previous EDG are being used for the new EDG, their benchmark values should be reviewed in relation to the evaluated mid-term values in order to determine a realistic benchmark target in the new EDG.

Consider adopting annual targets for some indicators. Complementary to the mid-term and final year target values, Latvia could consider annual targets for some important indicators. This would provide more frequent feedback on progress towards the achievement of objectives and, by extension, highlight where corrective action may need to be taken to achieve those targets. In uncertain times, with COVID-19 making it difficult to predict the policy context for the foreseeable future, more frequent feedback may facilitate making adjustments to the EDG. At the same time, more frequent data collection is labour intensive and comes at a cost. The potential benefits and costs should thus be weighed carefully.

Once indicator data have been collected, users need to have sufficient capacity to utilise these data. The sheer number of indicators that can be tracked, and the amount of data they represent, can be overwhelming for data users.

Interpreting indicators to inform public policy decisions requires a nuanced understanding of what the indicators measure and the limitations. Without this understanding, indicators can be misinterpreted or go unused.

The misinterpretation of an indicator occurs when the goal of the indicator is not properly understood. For example, an indicator might measure the number of courses in which more than 10% of classes have been cancelled. It would be a misinterpretation of the goal of this indicator to treat every course below the 10% threshold as a success, since the aim is to minimise cancellations overall.

The non-use of an indicator occurs when an indicator produces data that are vague and difficult to act upon, such as an indicator that tracks the average test scores of students. If the indicator was also available in disaggregated form and showed variations by demographic group it could inform a policy targeted at specific student groups. However, without these details the indicator may not be useful for policy making.
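A minimal sketch of why disaggregation matters: an overall average can mask large differences between groups. All figures and group names below are hypothetical:

```python
# Hypothetical test scores for two student groups
scores = {"group_a": [78, 82, 80], "group_b": [60, 62, 64]}

all_scores = [s for group in scores.values() for s in group]
overall_avg = sum(all_scores) / len(all_scores)
by_group = {g: sum(v) / len(v) for g, v in scores.items()}

print(overall_avg)  # 71.0
print(by_group)     # {'group_a': 80.0, 'group_b': 62.0}
```

The aggregate indicator alone (71.0) would not reveal the 18-point gap between groups that a targeted policy would need to address.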

In order for indicator data to be used in policy development, some basic statistical knowledge is required. For example, when analysing relationships between two variables, causality must not be inferred from correlation: two variables may move together (correlation) without one necessarily causing the other (causation). This has important implications for how data can be interpreted and what the limitations are when applying data analysis to policy development.
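To make the distinction concrete, the following sketch computes a Pearson correlation coefficient by hand for two hypothetical variables. A coefficient close to 1 shows that the variables move together, but by itself says nothing about whether one causes the other:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    std_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    std_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (std_x * std_y)

# Hypothetical data: tutoring hours and test scores for six students
hours = [0, 1, 2, 3, 4, 5]
scores = [55, 60, 63, 70, 72, 80]
print(round(pearson_r(hours, scores), 2))  # a strong correlation (~0.99), yet
                                           # this alone does not show that
                                           # tutoring caused the higher scores
```

In practice, a high coefficient like this could also reflect a third factor (e.g. family support) driving both variables.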

Factors that impact the capacity to use indicators effectively include:

  • Culture in government of using data for decision making: This involves prioritising data in decision making and policy making, and encouraging transparency and openness in data sharing and data usage.

  • Prioritisation of indicators: This involves identifying the key indicators that are most relevant for the medium- and long-term goals and vision of the education system, which makes them easier to follow and use.

  • Capacity of users to interpret data: This involves understanding the nuances and limitations of indicators and identifying which are most appropriate for the policies considered.

A number of national and international indicators are available but not fully used. For example, the National Centre for Education gathers and analyses the results of the centralised exam at the end of upper secondary education each year. It prepares annual statistical reports that describe students’ achievements and results, as well as trends and correlations among the variables. Some information, such as results for specific subjects like English, mathematics and Latvian language, is also provided at the school level. Stakeholder organisations pay a lot of attention to the annual results. However, despite increased interest in the findings from the general public, the exam results were not included as indicators in the previous EDG 2014-2020. For the new EDG, Latvia could consider using this indicator, which would allow it to monitor education outcomes on an annual basis.

Certain international indicators should also be used more. Latvia participates in a number of different international assessment surveys, such as PISA, the Trends in International Mathematics and Science Study (TIMSS), the Progress in International Reading Literacy Study (PIRLS), and soon the Survey of Adult Skills (PIAAC), all of which can be used to create indicators. Some of the data from these surveys have already been used in indicators in the previous EDG 2014-2020, for example students with low/high learning outcomes in literacy, mathematics and natural sciences (PISA); 15-year-old students who suffered from any type of violence several times a month (PISA); and 4-year-old pre-school children who suffered from any type of violence once a month (PIRLS). These surveys can be used as the basis for many other potentially relevant indicators in Latvia’s new EDG 2021-2027. Some examples of these are presented in Section 4.

Latvia has some initiatives that seek to improve the capacity of government officials to use indicators, but these have limitations. Some municipalities have implemented projects, supported by European Structural Funds, to develop their administrative capacity to collect and use indicator data. However, due to the limited amount of available funding, this type of support is not available for all municipalities. One approach adopted by government to overcome capacity constraints has been to seek external expert advice, such as inviting academics to join working groups. However, the government lacks the financial capacity to regularly commission input from the academic community. Consequently, expert engagement is typically given voluntarily without remuneration (Anda Terauda, Auers and Jahn, 2018[26]), which means that it is not always easy for governments to obtain the expert advice they need. More broadly, existing research activities have been highly reliant on European Structural Funds, which is not financially sustainable in the long term as the European Commission’s priorities could change. Educational research is undertaken by a small number of individuals and institutes, which limits the overall capacity for Latvia to analyse and interpret indicator information.

For the new EDG, Latvia should consider supporting independent research institutions to expand their research and evaluation capacity to interpret indicator data for decision making. These could be national bodies with specialist expertise in the area, or Latvia’s universities. Latvia could consider the example of the Chilean Center for Research in the Ministry of Education (Box 3.8), which has been designated to lead and co-ordinate efforts in using indicator information in education.

Support research institutions to provide capacity in fully using the available national and international indicators. These research institutions should support the implementation of the EDG by tracking relevant indicators and analysing progress in implementing the policy actions and achieving the policy objectives. They should analyse the information generated by indicators and regularly publish reports that explain how it can inform and guide the implementation of the EDG policy actions. They should have multidisciplinary teams of experts with statistical and evaluation backgrounds, as well as expertise across education levels. These teams could train other government officials in the Ministry of Education and Science, related agencies and municipalities in how to use information generated by indicators, so that the information is used with the nuances and limitations of indicators in mind and the most appropriate indicators are chosen for the policies considered. These research institutions should also promote a culture that prioritises data in decision making and policy making, and encourages transparency and openness in data sharing and data usage.

The regular dissemination of indicator information can help increase accountability and the visibility of policy actions. When disseminating indicator information it is important to identify the audience and adjust the messaging and presentation (including visualisations) accordingly. In Latvia, audiences include the Ministry of Education and Science, other ministries and government organisations, local government, schools, teachers, parents, students, research institutes, national non-governmental organisations, and international organisations. Due to the vast differences in the needs, interests and ability to interpret data of users, it may even be necessary to develop different products for different audiences.

Enabling factors for the effective dissemination of indicator information include:

  • Dissemination strategy: This involves co-ordinating the efforts of sharing indicator information with various relevant actors.

  • Multiple dissemination channels: This involves making indicator information available through multiple channels, such as websites, mobile apps, reports, brochures and newsletters.

  • Availability of disaggregated data: This involves allowing users to access the information most relevant to them (e.g. a particular school for parents).

  • Frequency of dissemination: This involves making the information available according to the needs of the respective audience.

In Latvia, the dissemination of indicator information occurs through a variety of platforms. A large number of websites provide information to users based on indicator information. For example, information on the demand for different professions in the labour market is available through the State Employment Agency’s website, where individuals can access information on short-term trends in the demand for different professions, by region and in the country overall. Another development is a project implemented by the Ministry of Economics, in collaboration with the State Employment Agency, which aims to create a user-friendly platform to communicate medium-term and long-term labour market forecasts. It is planned that this platform will also eventually incorporate information on short-term labour market demand (OECD, 2019[14]).

In order for the forecast information generated by indicators to inform the educational decisions of a lay person, this information could be better disseminated and tailored to the specific needs of the end user. Currently, results are primarily distributed in the form of reports with technical descriptions, which is unlikely to meet the needs of the lay person, who may find it more useful to access the information in an interactive online format that uses easy-to-understand language. Limited dissemination channels have led to a lack of awareness about changes in the labour market and a lack of discussion about labour market trends and future skills needs. The forecasts have also not been used to develop policy at the sectoral level. A two-year study on improving Latvia’s labour market forecasting system found a lack of co-operation between government and stakeholders on interpreting the results of existing forecasts (AC Konsultacijas, 2019[27]). Latvia lacks a user-friendly online platform through which different user groups can access the results to inform decision making or conduct research and analysis. The results of these exercises are also not integrated with information on related education and training programmes.

For its EDG, Latvia could consider improving its dissemination infrastructure so that the information being collected also gets effectively disseminated to the end users. A relevant country example is the Job Bank platform in Canada, run by Employment and Social Development Canada, and Denmark’s Education Guide (Box 3.9). These platforms are user-friendly as they are easily accessible in their presentation and language. They are being used by students, parents, guidance counsellors, employers and other government officials.

Improve the dissemination of information generated by indicators through an accessible and user-friendly platform that serves a wide audience of users. In order for information, such as that generated by the forecast indicators and other indicators of the EDG, to inform decisions, it should be well disseminated via a platform accessible through a variety of channels (e.g. website, mobile). The information should be up-to-date, user-friendly and easily accessible in its presentation and language. The platform should centralise information on skills needs and available learning opportunities, as well as career guidance services and funding support. The information should be available in disaggregated format so that it can be tailored to the specific needs of various users, such as students, parents, guidance counsellors, employers and other government officials. The platform information should be part of a larger dissemination strategy that seeks to foster a continuous discussion about future skills needs and progress in the implementation of the EDG between the government and stakeholders.

Latvia’s EDG needs to be accompanied by a robust indicator system to monitor implementation progress. Such a system for education and skills policies provides reliable, accurate and timely information on the human and financial resources invested in skills, how education and skills systems operate and evolve, and the returns on investments in skills.

An effective process for selecting EDG indicators should facilitate the consideration of a comprehensive set of high-quality indicators and help to prioritise indicators on the basis of their ability to assess progress towards the achievement of the objectives and policy actions of the EDG. It is important to find the right number of indicators, as too many can be costly and administratively burdensome, while too few may not allow for a comprehensive assessment of progress towards achieving the policy objectives.

An assessment of Latvia’s current indicator system reveals gaps in Latvia’s ability to measure progress towards the achievement of its objectives. For example, indicators could be developed to track funding for lifelong learning, distinguish between drop-outs due to emigration and for other reasons, monitor student progression through education, measure the quality of ECEC, and provide additional background information on students, such as their home language and disability status. Developing these indicators would allow Latvia to identify whether all students are sufficiently supported and have the opportunity to develop their skills.

This chapter has presented a list of potential indicators for the EDG and an overview of further considerations taken during the development of Latvia’s EDG. The OECD, together with government and stakeholder representatives, reviewed a total of 181 possible indicators and then prioritised and discussed in depth 10-12 potential indicators for each of the five levels of education,3 resulting in a total of 54 potential indicators for Latvia’s EDG.

Latvia should consider the suggestions for how to improve its indicator system for the EDG. These improvements include linking the various databases with a unique identification number, implementing a strong data validation process, setting ambitious yet realistic benchmark targets, designating a research institution to fully use the information generated by indicators, and disseminating information generated by the indicators through a user-friendly platform serving a wide audience of users. Improving the indicator system in these ways would allow Latvia to make more effective use of the information generated by the indicators to guide the EDG implementation process.

References

[27] AC Konsultacijas (2019), Darba tirgus apsteidzošo pārkārtojumu sistēmas izveides iespējas un vidēja un ilgtermiņa darba tirgus prognožu sasaiste ar rīcībpolitiku [Labor market pre-restructuring system and linking medium and long-term labor market forecasts to policies], https://em.gov.lv/files/attachments/DarbaTirgus_Gala%20zinojums.pdf.

[26] Anda Terauda, V., D. Auers and D. Jahn (2018), Latvia Report: Sustainable Governance Indicators 2018, Bertelsmann Stiftung, http://www.sgi-network.org/docs/2018/country/SGI2018_Latvia.pdf (accessed on 25 May 2019).

[18] Civitta (forthcoming), Report on education quality monitoring system and tools.

[23] E-Klase (2015), “Kā patiesībā mainījies pedagogu skaits šajā mācību gadā?” [Actually, has the number of teachers changed this year?], https://www.e-klase.lv/aktualitates/zinas/ka-patiesiba-mainijies-pedagogu-skaits-saja-macibu-gada?id=10860.

[5] Fletcher, T. (2012), Statistics Directorate: Quality Framework and Guidelines for OECD Statistical Activities, OECD, Paris, http://www.oecd.org/sdd/qualityframeworkforoecdstatisticalactivities.htm (accessed on 21 January 2020).

[28] Government of Canada (2020), Job Bank, https://www.jobbank.gc.ca/aboutus.

[11] Husein, A. (2017), Data for Learning: Building a Smart Education Data System, World Bank Group, https://openknowledge.worldbank.org/handle/10986/28336.

[12] Latvian Ministry of Education and Science (2019), Mid-term Evaluation.

[22] Latvijas Vēstnesis (2018), Personal Data Processing Law, https://likumi.lv/ta/en/en/id/300099-personal-data-processing-law.

[10] Library of Congress (2020), Chile: Law and Transparency, https://www.loc.gov/law/foreign-news/article/chile-law-on-transparency/.

[21] OECD (2020), OECD Skills Strategy Latvia Implementation Guidance: Strategy Development Workshop Summary Note, OECD, Paris, http://www.oecd.org/skills/centre-for-skills/Strategy_Development_Workshop_Summary_Note.pdf.

[3] OECD (2019), Education at a Glance 2019: OECD indicators, OECD Publishing, Paris, https://doi.org/10.1787/f8d7880d-en.

[14] OECD (2019), OECD Skills Strategy Latvia: Assessment and Recommendations, OECD Publishing, Paris, https://doi.org/10.1787/74fe3bf8-en.

[19] OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/5f07c754-en.

[20] OECD (2019), TALIS 2018 Results (Volume I): Teachers and School Leaders as Lifelong Learners, TALIS, OECD Publishing, Paris, https://dx.doi.org/10.1787/1d0bc92a-en.

[2] OECD (2018), OECD Handbook for Internationally Comparative Education Statistics 2018: Concepts, Standards, Definitions and Classifications, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264304444-en.

[29] OECD (2018), Skills Strategy Implementation Guidance for Slovenia: Improving the Governance of Adult Learning, OECD Publishing, Paris, https://doi.org/10.1787/9789264308459-en.

[24] OECD (2017), Education Policy Outlook: Latvia, OECD, Paris, http://www.oecd.org/education/Education-Policy-Outlook-Country-Profile-Latvia.pdf (accessed on 26 May 2019).

[6] OECD (2017), OECD Handbook for Internationally Comparative Education Statistics: Concepts, Standards, Definitions and Classifications, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264279889-en.

[9] OECD (2016), Reviews of National Policies for Education: Education in Latvia, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264250628-en.

[16] OECD (2015), “Country Background Report of the Flemish Community of Belgium”, OECD Review of School Resources, OECD, Paris, https://www.oecd.org/education/Country%20Background%20Report_Belgium_Flemish%20Community_FINAL%20with%20stat_for%20OECD%20website_REV%20DN.pdf.

[1] OECD (2014), OECD Framework for Regulatory Policy Evaluation, OECD Publishing, Paris, https://doi.org/10.1787/9789264214453-en.

[13] OECD (2011), Starting Strong III: A Quality Toolbox for Early Childhood Education and Care, OECD Publishing, Paris, https://doi.org/10.1787/9789264123564-en.

[7] OECD (2008), The OECD-JRC Handbook on Practices for Developing Composite Indicators, https://www.oecd.org/sdd/42495745.pdf.

[15] Schleicher, A. (2019), PISA 2018: Insights and Interpretations, OECD, Paris, https://www.oecd.org/pisa/PISA%202018%20Insights%20and%20Interpretations%20FINAL%20PDF.pdf.

[25] Sorrells, A. (2019), EdExplainer: Education data systems in North Carolina, EducationNC, https://www.ednc.org/edexplainer-education-data-systems-in-north-carolina/ (accessed on 25 August 2020).

[4] UIS (2020), Education and Disability: Analysis of Data from 49 Countries, UNESCO Institute for Statistics, http://uis.unesco.org/sites/default/files/documents/ip49-education-disability-2018-en.pdf.

[17] US Department of Education (2020), Individuals with Disabilities Education Act, https://sites.ed.gov/idea/data/.

[8] Vági, P. and E. Rimkute (2018), “Toolkit for the preparation, implementation, monitoring, reporting and evaluation of public administration reform and sector strategies: Guidance for SIGMA partners”, SIGMA Papers, No. 57, OECD Publishing, Paris, https://doi.org/10.1787/37e212e6-en.

Notes

← 1. www.sigmaweb.org/publications/SIGMA-Strategy-Toolkit-Annex-2-Indicators.docx.

← 2. The four policy objectives discussed during the Strategy Development Workshop were draft versions. They have since been further developed, as reflected in Chapter 2.

← 3. The five levels of education are: 1) early childhood education and care; 2) general education; 3) vocational education and training; 4) higher education; and 5) adult learning.

Metadata, Legal and Rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

© OECD 2020

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at http://www.oecd.org/termsandconditions.