Chapter 2. Raising learning outcomes through student assessment

This chapter looks at how the assessment system of the Republic of North Macedonia measures and shapes student learning. Classroom assessments are not based on established, national learning standards, and therefore do not convey reliable and meaningful information on student achievement. Teachers predominantly rely on summative assessment practices, which are limited to a narrow range of lower-order tasks, thereby providing students with little quality feedback. This chapter suggests that North Macedonia develop national learning standards to provide students with more consistent and accurate information on their attainment. It will also be critical to support and encourage formative assessment practices that help teachers monitor student learning.


The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.

Introduction

The primary purpose of student assessment is to determine what students know and are capable of doing, to help them advance in their learning and take an informed decision on the next step in their education. In the Republic of North Macedonia (referred to hereafter as “North Macedonia”), using assessment in this way is difficult because teachers’ assessment judgements are not based on established, national learning standards, and therefore do not convey reliable information on student achievement. While the OECD Programme for International Student Assessment (PISA) reveals that the majority of 15-year-olds are unable to perform basic cognitive tasks, in their schools those same students are receiving outstanding classroom marks and are scoring highly on the state matura.

A further challenge is that teachers’ classroom assessment practices are predominantly summative and limited to a narrow range of lower-order tasks. Despite recent policy efforts to strengthen formative practices, students receive little quality feedback and few opportunities to demonstrate important dimensions of the curriculum, in particular more applied skills and complex, transversal competencies such as problem-solving and critical thinking.

This chapter suggests how North Macedonia can develop a student assessment system with more educational value. It recommends the creation of national learning standards as a means to help teachers form assessment judgements that are more consistent and meaningful. Stronger understanding of national expectations will help teachers to confidently and accurately monitor student learning. Alongside greater support and encouragement for formative assessment, this will help teachers to identify and address learning gaps early on. Finally, while the matura is renowned across the region for its innovative design and integrity, a decade after its implementation, the model should be reviewed to keep pace with changes in the education system. This review suggests revisions to ensure that the matura is helping to foster higher-order skills and better prepare students to progress to higher levels of education and to enter the labour market.

Key features of an effective student assessment system

Student assessment refers to the processes and instruments that are used to evaluate student learning (see Figure 2.1). These include assessment by teachers, as part of school-based, classroom activities like daily observations and periodic quizzes, and through standardised examinations and assessments that are designed and graded outside schools.

Overall objectives and policy framework

At the centre of an effective policy framework for student assessment is the expectation that assessment supports student learning (OECD, 2013[1]). This expectation requires that national learning objectives be clear and widely understood. Regulations concerning assessment must orient teachers, schools and assessment developers on how to use assessment to support learning goals.

To these ends, effective assessment policy frameworks encourage a balanced use of summative and formative assessments, as well as a variety of assessment types (e.g. teacher observations, written classroom tests and standardised instruments). These measures help to monitor a range of student competencies and provide an appropriate balance of support, feedback and recognition to encourage students to improve their learning. Finally, effective assessment frameworks also include assurance mechanisms to regulate the quality of assessment instruments, in particular central, standardised assessments.

The curriculum and learning standards communicate what students are expected to know and be able to do

It is important to have common expected learning outcomes against which students are assessed to determine their level of learning and how improvement can be made (OECD, 2013[1]). Expectations for student learning can be documented and explained in several ways. Many countries define them as part of national learning standards. Others integrate them into their national curriculum frameworks (OECD, 2013[1]).

While most reference standards are organised according to student grade level, some countries are beginning to organise them according to competency levels (e.g. beginner and advanced), each of which can span several grades (New Zealand Ministry of Education, 2007[2]). This configuration allows for more individualised student instruction, but requires more training for teachers to properly understand and use the standards when assessing students.

Types and purposes of assessment

Assessments can generally be categorised into classroom assessments, national examinations and national assessments. Assessment has traditionally held a summative purpose, which aims to explain and document learning that has already occurred. Many countries are now also emphasising the importance of formative assessment, which aims to understand learning as it occurs in order to inform and improve subsequent instruction and learning (see Box 2.1) (OECD, 2013[1]). Formative assessment is now recognised to be a key part of the teaching and learning process and has been shown to have one of the most significant positive impacts on student achievement among all educational policy interventions (Black and Wiliam, 1998[3]).

Box 2.1. Purposes of assessment
  • Summative assessment – assessment of learning, summarises learning that has taken place, in order to record, mark or certify achievements.

  • Formative assessment – assessment for learning, identifies aspects of learning as they are still developing in order to shape instruction and improve subsequent learning. Formative assessment frequently takes place in the absence of marking.

For example, a teacher might ask students questions at the end of a lesson to collect information on how far students have understood the content, and use the information to plan future teaching.

Source: (OECD, 2013[1]), Synergies for Better Learning: An International Perspective on Evaluation and Assessment, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264190658-en.

Figure 2.1. Student assessment and learning

Classroom assessment

Among all types of assessment, classroom assessment has the greatest impact on student learning (Absolum et al., 2009[4]). Classroom assessment supports learning by regularly monitoring learning and progress; providing teachers with information to understand students’ learning needs and guide instruction; and helping students understand the next steps in their learning through the feedback their teachers provide.

Classroom assessments are administered by teachers in classrooms and can have both summative and formative purposes. They can be delivered through various formats, including closed multiple-choice questions, semi-constructed short answer questions and open-ended responses like essays or projects. Different assessment formats are needed for assessing different types of skills and subjects. In general, however, assessing complex competencies and higher-order skills requires the use of more open-ended assessment tasks.

In recent decades, as most OECD countries have adopted more competency-based curricula, there has been a growing interest in performance-based assessments like experiments or projects. These types of assessments require students to mobilise a wider range of skills and knowledge and demonstrate more complex competencies like critical thinking and problem solving (OECD, 2013[1]). Encouraging and developing effective, reliable performance-based assessment can be challenging. OECD countries that have tried to promote this kind of assessment have found that teachers have required far more support than initially envisaged.

Effective classroom assessment requires the development of teachers’ assessment literacy

Assessment is now seen as an essential pedagogical skill. In order to use classroom assessment effectively, teachers need to understand how national learning expectations can be assessed – as well as the students’ trajectory towards reaching them ‒ through a variety of assessments. Teachers need to know what makes for a quality assessment – validity, reliability, fairness – and how to judge if an assessment meets these standards (see Box 2.2). Feedback is important for students’ future achievement, and teachers need to be skilled in providing constructive and precise feedback.

Box 2.2. Key assessment terms
  • Validity – focuses on how appropriate an assessment is in relation to its objectives. A valid assessment measures what students are expected to know and learn as set out in the national curriculum.

  • Reliability – focuses on how consistently an assessment measures student learning. A reliable assessment produces similar results regardless of the context in which it is conducted, for example, across different classrooms or schools. Reliable assessments provide comparable results.

Source: (OECD, 2013[1]), Synergies for Better Learning: An International Perspective on Evaluation and Assessment, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264190658-en.

Many OECD countries are investing increasingly in the development of teachers’ assessment literacy, beginning in initial teacher education. In the past, teachers’ initial preparation in assessment was primarily theoretical, but countries are now trying to make it more practical, for example, by emphasising opportunities for hands-on learning where teachers can develop and use different assessments. Countries encourage initial teacher education providers to make this shift by incorporating standards on assessment in programme accreditation requirements and in the expectations for new teachers in national teacher standards.

It is essential that teachers’ initial preparation on assessment is strengthened through on-going, in-school development. Changing the culture of assessment in schools – especially introducing more formative approaches and performance-based assessments, and using summative assessments more effectively – requires significant and sustained support for teachers. Continuous professional development, such as training on assessment and more collaborative opportunities where teachers can share effective assessment approaches, provides vital encouragement. Pedagogical school leaders also play an essential role in establishing a collaborative culture of professional enquiry and learning on assessment.

Finally, countries need to invest significantly in practical resources to ensure that learning expectations defined in national documents become a central assessment reference for teachers and students in the classroom. These resources include rubrics that set out assessment criteria, assessment examples aligned to national standards and marked examples of student work. Increasingly, countries make these resources available online through interactive platforms that enable teachers to engage in the development of standards, which facilitates a greater feeling of ownership over the resources and makes it more likely that they will be used.

National examinations

National examinations are standardised assessments developed at the national or state level with formal consequences for students. The vast majority of OECD countries (31) now have exit examinations at the end of upper secondary to certify student achievement and/or for selection into tertiary education, reflecting rising expectations in terms of student attainment as well as the importance of transparent systems for determining access to limited further education opportunities (see Figure 2.2). National examinations are becoming less common at other transition points, as countries seek to remove barriers to progression and reduce early tracking. Among the OECD countries (approximately half) that continue to use national examinations to inform programme and/or school choice for entrants to upper secondary education, few rely solely or even primarily on the results of examinations to determine a student’s next steps.

Figure 2.2. National examinations and assessments in public school in OECD countries

Notes: Number of subjects covered in the assessment framework (subjects may be tested on a rotation basis).

Data for the national examinations and assessments in Lithuania are drawn from authors’ considerations based on OECD (2017[5]), Education in Lithuania, Reviews of National Policies for Education, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264281486-en.

Source: OECD (2015[6]), Education at a Glance 2015: OECD Indicators, OECD Publishing, Paris, http://dx.doi.org/10.1787/eag-2015-en.

While classroom assessment is the most important assessment for learning, evidence shows that the pace of learning slows down without external benchmarks like examinations. National examinations signal student achievement and in many countries carry high stakes for students’ future education and career options, which can help to motivate students to apply themselves (Bishop, 1999[7]). They are also more reliable than classroom assessment and less susceptible to bias and other subjective pressures, making them a more objective and arguably fairer basis for taking decisions when opportunities are constrained, such as access to university or high-demand schools.

However, there are limitations related to the use of examinations. For instance, they can only provide a limited snapshot of student learning based on performance in one-off, time-pressured exercises. To address this concern, most OECD countries complement examination data with classroom assessment information, teachers’ views, student personal statements, interviews and extra-curricular activities to determine educational pathways into upper secondary and tertiary education.

Another concern is that the high stakes of examinations can distort teaching and learning. If examinations are not aligned with the curriculum, teachers might feel compelled to dedicate excessive classroom time to examination preparation instead of following the curriculum. Similarly, students can spend significant time outside the classroom preparing for examinations through private tutoring. To avoid this situation, it is important that items on examinations are a valid assessment of the curriculum’s learning expectations and encourage high quality learning across a range of competencies.

Most OECD countries are taking measures to address the negative impact that the pressure of examinations can have on student well-being, attitudes and approaches to learning. For example, Korea has introduced a test-free semester system in lower secondary education with activities like career development and physical education to develop students’ life skills and reduce stress (OECD, 2016[8]).

National assessments

National assessments provide reliable information on student learning, without any consequences for student progression. Across the OECD, the vast majority of countries (30) have national assessments to provide reliable data on student learning outcomes that are comparable across different groups of students and over time (see Figure 2.2). The main purpose of a national assessment is system monitoring and, for this reason, national assessments provide essential information for system evaluation (see Chapter 5).

Countries might also use national assessments for more explicit improvement purposes, such as to ensure that students are meeting national achievement standards and identify learning gaps in need of further support. In these cases, providing detailed feedback to teachers and schools on common problems and effective responses is critical.

Many OECD countries also use national assessments for school accountability purposes, though there is considerable variation in how much weight is given to the data. This is because student learning is influenced by a wide range of factors beyond a school’s or teacher’s influence – such as students’ prior learning, motivation, ability and family background (OECD, 2013[1]).

National assessment agencies

Developing high quality national examinations and assessments requires a range of assessment expertise in fields such as psychometrics and statistics. Many OECD countries have created government agencies for examinations and assessments where this expertise is concentrated. Creating a separate organisation with stable funding and adequate resources also helps to ensure independence and integrity, which is especially important for high-stakes national examinations.

Student assessment in North Macedonia

Since the last OECD review of education in North Macedonia in 2003, the country has made significant advances in several key areas of student assessment. The main national examinations at the end of upper secondary that have been in place for ten years – the state matura, the school matura and the final examination for certification and tertiary placement purposes – are trusted nationally and respected across the region. In recent years, a strong effort has also been made to increase and improve the use of formative assessment in classrooms.

Nevertheless, there is a divergence between the intent of North Macedonia’s assessment strategy and what occurs in the country’s schools. In classrooms, a focus on summative assessment still tends to outweigh formative objectives. Grading also frequently reflects societal expectations for high marks, rather than accurately revealing what students know and can do. This pressure is compounded by relatively weak teacher assessment literacy and the limited resources provided to teachers to evaluate student progress. As a result, students do not receive reliable feedback on their learning, which, combined with the pressures of a dense curriculum, means that the instruction they receive can quickly outpace their individual learning rhythm. This has contributed to a situation where the majority of students progress through school with good grades and do well in national examinations – nearly all students enrolled in gymnasiums pass the matura (94.3% in 2017) – but, as international assessments suggest, are not mastering basic competencies in literacy and numeracy (OECD, 2016[9]).

Overall objectives and policy framework

A key strength of the assessment framework in North Macedonia is national recognition of the value of assessment for student learning. Assessment features in national policy documents – such as the Comprehensive Education Strategy and the laws on primary and secondary education. There have also been efforts to develop teachers’ awareness and skills for formative assessment. In 2015, the Bureau for Development of Education (BDE) published a Formative Assessment Manual, which encourages teachers to increase the use of formative assessment and has guided professional development activities. Other guidance specifically on the use of assessment in primary has been developed and used for professional development purposes (Gerard et al., n.d.[10]).

However, together the laws and policies do not provide a coherent framework to ensure that assessment consistently supports learning. For example, while formative assessment is valued, the new draft law on the national assessment outlines that one of its functions will be to provide data for school ranking. Ranking schools based on raw assessment results neglects the strong contextual factors that impact learning and can encourage a high-stakes approach to school accountability that risks undermining the assessment’s learning function.

Recent curricula changes emphasise a more competency-based approach to teaching and learning

Recent years have witnessed important attempts at curriculum modernisation, in particular in the early grades. However, there have been significant problems with implementation, creating undue pressure on teachers and students and a lack of continuity in learning expectations. This is particularly evident in sciences and mathematics. In 2014, the Cambridge curriculum was introduced in these subjects from grades 1-9. In many respects, this was a positive development. The Cambridge curriculum gives students more time for content mastery and strong encouragement to engage in critical questioning. It is also less based upon retaining factual knowledge and more focused on applying knowledge and skills to real-world contexts.

However, implementation was rushed, rather than phased in gradually grade-by-grade, and schools and teachers were not provided with adequate support. The government is now again revising curricula in grades 1-3 and will begin piloting new materials in a small number of schools in the near future. However, these reforms are happening without an evaluation of the effectiveness and impact of the Cambridge curriculum.

While there have been significant changes to curricula in the lower grades, curricula in upper secondary and in mother tongue languages across all grades were developed as early as 2001 and have not been updated since. This creates challenges in terms of the consistency of expectations for student learning. It also means that upper secondary education in particular still tends to focus on retaining factual knowledge, with a lack of emphasis on developing critical thinking and other 21st century competencies.

Another challenge, especially in upper secondary where students study 15 subjects, is the curriculum’s density. The review team was told that the rigid nature of the curriculum makes it difficult to adapt and, as a result, such adaptation rarely occurs. Schools in North Macedonia report the lowest levels of responsibility for the curriculum among countries participating in PISA (OECD, 2016[11]). A very broad curriculum with many mandatory subjects also encourages surface-level retention and impedes in-depth learning (OECD, 2013[1]).

Learning standards are comprehensive but fragmented

The BDE has created learning standards for nearly all subjects and grades. The Cambridge curriculum also provides learning standards for mathematics and science, labelled learning expectations. A notable gap in the country’s learning standards, however, is the absence of standards for reading and writing in grades 1-3, which provide the foundations for later learning in other subjects.

Another concern is that the standards are not aligned with each other. While the Cambridge curriculum provides standards for mathematics and science up to grade 9, the standards for these subjects that existed before the Cambridge curriculum was introduced remain in place alongside it. Such inconsistent learning standards impact the quality of teaching and student assessment. Importantly, the review team’s interviews revealed that teachers do not have a common set of learning expectations for their students and instead form their own, individual and inconsistent expectations. Students, therefore, receive an education that is not cohesive and lacks a clear reference point identifying what they should be working towards. Inconsistent expectations also make it difficult to establish meaningful evaluation and accountability practices at school or system level.

Box 2.3. Key definitions on learning standards and progression

Learning standards and performance descriptors: clear statements of expected student learning and the key characteristics of student work by grade in the core domains of reading and writing, and mathematics. This could also include performance descriptors that set out the characteristics of student work at different levels of performance.

Learning progressions: set out how students typically move through learning in reading and writing, and mathematics in line with the expectations set out in the learning standards. These could be accompanied by examples of student work at the different learning stages. Learning progressions signal to teachers the knowledge and skills that students need to develop and be able to draw on so that they are able to meet the expectations of the curriculum and learning standards.

Source: (Kitchen et al., forthcoming[12]), OECD Reviews of Evaluation and Assessment in Education: Student Assessment in Turkey, OECD Publishing, Paris.

Classroom assessment

Students are graded on a five-point scale from grade 4 onwards

In grades 1-3, students do not receive numeric marks. Instead, their performance is reported as a written description of student work in school report cards (referred to as certificates of achievement) that do not include standardised descriptors (such as good, very good, etc.). From grades 4-6, students receive descriptions of their performance and also numeric marks. After grade 6, all students receive numeric marks on a scale of 1 (lowest) to 5 (highest). Students receive their marks at the end of grading periods on report cards, and their marks also appear in their “e-dnevnik” online journal. A mark of 1 is considered inadequate while all others are passing marks. In theory, students who receive a 1 at the end of a grade must repeat that grade, but in practice this occurs very rarely, and the Law on Primary Education states that students in grades 1-5 cannot repeat grades. According to PISA 2015, only 3.1% of 15-year-old students in the country have ever repeated a grade, compared to 11.3% on average across OECD countries (OECD, 2016[11]).

From grade 6 onwards, marks from classroom assessments inform student pathways

The average of a student’s marks from grades 6-9, as well as student preference, informs the type of upper secondary school – gymnasium, vocational or arts secondary school – a student attends. While students are free to select their school of preference, students are admitted on the basis of their overall score. This results in students with lower grades tending to be oriented towards vocational high schools, while the highest performing students are encouraged to enrol in the most prestigious gymnasiums. However, due to demographic decline there are spaces in all but the most prestigious gymnasiums in Skopje, which creates genuine choice for most students.

Classroom assessment focuses heavily on numeric grades

The five-point grading scheme provides a central focus for teachers, students and parents in North Macedonia. Beyond the formal reporting requirements, students receive grades regularly for most exercises. Grades are also used to manage student behaviour, with students marked up or down on academic tasks based on attitudes and attendance. Teachers interviewed by the review team also stated that they felt pressure from parents, not just to regularly report grades, but also to provide students with high marks.

While summative recognition is important, focusing too much on student marks can mean that the deeper learning not assessed by tests is neglected. A narrow focus on numerical grades as the only measure of performance can also lead to grade inflation. This is a particular concern in North Macedonia, in part because of societal pressures, but also because teachers lack clear, consistent standards to benchmark achievement and because their overall assessment capacity is weak. Too much emphasis on grades also limits the space for formative assessment practices, which policy in North Macedonia states to be a priority and is critical for effective learning.

At the same time, expectations to report high marks, combined with a culture that emphasises performance in academic competitions and Olympiads, encourage teachers to focus on the top performers rather than helping every student reach their potential and meet national standards. According to the Law on Secondary Education, students receive monetary compensation for performing well in academic competitions, and teachers of competition winners are also rewarded.

There have been efforts to support teachers’ assessment knowledge and skills

While there have been important recent initiatives to develop teachers’ assessment skills, in particular in the area of formative assessment, overall training and support in this area remains relatively limited.

Initial teacher education in North Macedonia is not well-aligned with recent curricula changes and does not ensure that teacher candidates graduate with minimum competencies in assessment (see Chapter 3). Once in the profession, the quality and availability of teachers’ professional development opportunities are relatively limited. For example, while there have been isolated efforts to improve support for formative assessment, teachers and teacher trainers said these have not fully taken root because continuous support is insufficient. Teachers lack sustained support and training to encourage them to integrate new models of assessment. Teachers do provide informal support to each other within and across schools, for example, by sharing best practices and exchanging lesson plans through the “Teacher Actives” in schools and on social media. However, these activities are not resourced at the national level and occur outside the formal guidance of the ministry, which limits their access to materials and other resources.

National examinations

The state matura is a well-respected model across the region

There is one main national examination in North Macedonia, the state matura (see Table 2.1). When it was implemented in 2008, the matura was recognised across the region for its modern design and integrity. Students are examined in a core of mother tongue language, and mathematics or a foreign language, and can choose from a list of electives for the remaining subjects. The matura also includes a project assignment, providing space to recognise a broader range of competencies than a standardised examination and enabling students to engage in a subject that they find particularly interesting. In contrast with many other national examinations across the region, where there are frequently issues with integrity, the administration and results of the matura are trusted nationally. These are important strengths.

Table 2.1. State matura

Components

Examinations (40%):

  • Compulsory examination: mother tongue language

  • 1st elective: mathematics or a foreign language

  • 2nd elective: choice from list of general subjects

  • 3rd elective:

    • Gymnasium students: choice from list of general subjects

    • Vocational students: a vocational subject in line with a students’ vocational track

  • Project

Classroom assessment (60%):

  • Average marks from all subjects, grades 6-9

Eligibility

All students completing gymnasiums and four-year vocational education schools.

Item development

Item development is led by state subject committees, composed of professors and practitioners commissioned by the National Examination Centre (NEC).

Individual schools develop items for school-assessed subjects and establish committees to assess these subjects.

Question format

Multiple-choice, closed-format short answers and open-ended questions.

Pen and paper.

Grading

Marks from 1-5 (1=fail; 2-5=pass).

Students also receive their percentile rank for externally examined subjects.

Marking

Compulsory examination, 1st and 2nd electives marked centrally. Multiple-choice and closed-format questions are marked electronically; open-ended questions marked by human assessors.

3rd electives and project marked at school level.

Results

A student who receives at least “2” has passed and has the right to attend a higher education institution.

Higher education faculties consider marks and percentiles from the state matura subjects (40%) and classroom subject averages (60%) for selection.

Reporting

Individual student results are accessible through an online portal on NEC’s website 30 days after the examination.

Results are not reported at the school or municipal level.

NEC prepares a technical, internal report on the matura.

Source: (MoES, 2018[13]), The Republic of North Macedonia - Country Background Report, Ministry of Education and Science, Skopje.
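The selection weighting shown in the table above (40% matura examinations, 60% classroom averages) can be sketched as a simple weighted average. The sketch below is illustrative only: the function and variable names, and the assumption that both components are rescaled to a common 0-100 scale, are hypothetical and not part of the official procedure.

```python
def composite_selection_score(matura_score, classroom_average,
                              matura_weight=0.40):
    """Illustrative sketch of the 40/60 selection weighting.

    Assumes (hypothetically) that both the matura result and the
    classroom average have been rescaled to a common 0-100 scale.
    """
    classroom_weight = 1.0 - matura_weight  # 0.60 under the default
    return matura_weight * matura_score + classroom_weight * classroom_average
```

Under this weighting, for instance, a matura result of 80 combined with a classroom average of 90 would yield 0.40 × 80 + 0.60 × 90 = 86.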

A decade after its introduction, there are aspects of the state matura that could be improved

In the decade since the matura’s initial design and implementation, a number of issues have arisen. These include the need to better align the matura with the country’s ambitions to improve the quality and prestige of vocational education and training (VET), as well as the range of subjects that students take in the matura. In particular, very few students (roughly 13% in 2017) take the mathematics test. This effectively means that almost 90% of students in North Macedonia are never assessed in mathematics in a standardised manner during their schooling.

Student results also tend to be compressed into a small range of scores, which suggests that question items are not effective at discriminating at the top of the ability range. Some subjects – especially electives like biology, physics and chemistry – have mean scores near or above 4 on the 1-5 scale. With such a preponderance of high scores, it might be difficult for external parties to discriminate between students, and students might sort themselves into fields of study in which they are not strong. Universities do receive a student’s percentile rank in external subjects, which can help to select students with the greatest potential; however, given the small share of students taking some examinations and the clustering of marks at the top end, the percentile may be misleading.

Alternative examinations to the state matura might also require review

Instead of the state matura, students can choose to take a school matura (gymnasium students) or a final examination (vocational students) that certifies completion of upper secondary education but does not enable a student to progress to tertiary education. The design of these examinations is similar to the state matura, with mother tongue language as a compulsory subject, an elective subject and a project assignment. All parts of the examinations are marked within a student’s school. Students who pass receive a diploma and have access to post-secondary (non-tertiary) education.

While providing students who do not wish to progress to tertiary education with formal recognition of their schooling is positive, very few students (around 2% of gymnasium students) choose this option. Among OECD countries with this kind of examination, it is frequently aimed more directly at students who do not intend to continue to tertiary education. This means that the examinations often take place earlier in schooling (e.g. at the end of lower secondary), the content is easier, and the question format is sometimes more applied. Finally, there is limited involvement of the business sector in assessing, through the practical assignment, the skills required in VET subjects.

National assessment

A new national assessment is in the early stages of development

North Macedonia currently does not have a national assessment. Previously, a sample-based national assessment with a primarily monitoring purpose was administered from 2001 to 2006, and another from 2013 to 2017 assessed all students from grade 4 upwards annually in randomly selected subjects. The results of the latter were controversially intended to be used to monitor the accuracy of teachers’ classroom assessment marks, with the intention of rewarding or penalising teachers depending on how far their classroom marks aligned with students’ marks on the national assessment.

The new national assessment, if designed with a strong formative function, has the potential to provide teachers with a better understanding of student learning in line with national expectations. Through studying students’ results on the assessment, teachers will also be able to improve their own understanding of how to evaluate student performance vis-à-vis a common reference point. Furthermore, teachers can improve their own assessment literacy by reviewing the questions that were designed for the national assessment and integrating some key concepts into their own classroom assessments (see Chapter 5).

National assessment agencies

The BDE supports teachers’ classroom assessment capacity

The BDE has a long list of responsibilities. It develops national curricula and provides training to teachers in gymnasiums and to those teaching general subjects in VET schools. In recent years, it has developed specific supports and training on teachers’ classroom assessment. In the past, the BDE contributed to the development of the state matura (by developing examination specifications for general education subjects), was responsible for regular teacher appraisals, and was expected to undertake educational research.

The BDE does not have sufficient resources to meet the demands of all its responsibilities. It has not financed or accredited teachers’ professional development in recent years, it undertakes limited research, and it does not regularly undertake teacher appraisals. One recommendation of this review is that the BDE formally become the central body responsible for teacher support and policy, an area that has been under-resourced and under-represented at the policy-making level to date (see Chapter 3). As part of these changes, the BDE will need to review its structure so that adequate resources can be devoted to each of its functions and relevant expertise developed. This might include creating a specific unit devoted to teacher professional development, with an expansion of capacity on assessment.

The NEC is the examination and national assessment agency

The National Examination Centre (NEC) is responsible for developing and administering the state matura and acts as the national centre for international testing. This review also recommends that the NEC assume responsibility for the new national assessment that is being developed (see Chapter 5). The NEC currently employs roughly 30 staff but lacks statistical skills and information technology capacity. Furthermore, the organisation suffers from frequent turnover in leadership, with some ten different directors in the past ten years, limiting its ability to represent its needs at the political level and contribute its professional competence to policy making. There is some research capacity; for example, internal reports on the matura are produced. However, limited internal capacity and a lack of demand at the policy-making level mean that assessment results are not fully exploited. For example, there is little demand to analyse and publish matura results by individual exam question. This kind of information is useful for teachers, since it helps them to understand typical student errors and misunderstandings and can inform how they teach similar content or concepts in the future (see Chapter 5).

Policy issues

North Macedonia has already started to put in place supports for teachers to use more reliable and valid assessments. In order for assessment to better support learning, it is imperative that those efforts are expanded and well-resourced. This will entail developing clear benchmarks for learning – national learning standards – and providing teachers with resources and tools to apply them in the classroom. The country has already begun to consider how the matura can be adapted to address changing needs, in particular the importance of improving the quality of vocational high schools. This review provides suggestions on how this can be done, proposing modifications to the existing matura rather than the creation of a separate new matura specifically for vocational schools. It also suggests that now, a decade on from when the matura was first implemented, is a good time to review its overall structure and design to encourage greater breadth in learning, and more meaningful grading.

Developing meaningful reporting of student results

A fundamental concern with student assessment in North Macedonia is that assessment results, from teachers’ classroom marks to state matura scores, do not necessarily reflect what a student knows and can do. This is the result of multiple factors, including limited support for teachers’ assessment literacy and inconsistencies in learning expectations across grades and subjects. This situation reflects the lack of coherence in the national curriculum, which combines resources developed at different times for different purposes.

This has a number of negative consequences for teaching and learning. First, it creates inconsistent expectations across subjects and grades in terms of the kinds of knowledge and skills that students should master. The lack of coherent standards creates a risk that judgements about student learning are not reliable, a concern exacerbated by pressure on teachers to inflate grades.

Another risk is that assessments might only evaluate a narrow set of skills, since the lack of clear learning expectations makes it difficult to understand the more complex competencies students are expected to demonstrate. This is a particular risk in North Macedonia, where teachers have limited training in assessment design and tend to revert to the simple knowledge-recall tests with which they are most familiar.

Finally, unclear standards make it more difficult to identify students who are struggling. This is especially the case when there is an absence of national progressions, which set out the knowledge and skills students typically need to acquire as they move along the trajectory towards higher skills development.

Develop coherent national learning standards

Learning standards illustrate what students are expected to master at a certain level of education (Kleinhenz and Ingvarson, 2007[14]). In a competence-based curriculum, standards are constructed to depict both what students should know and how they are able to apply that knowledge. Clearly defined standards can inform the development of more effective and valid assessments, and provide more reliable data about student progress. Many OECD countries have introduced learning standards as a policy lever to change teaching and assessment practices and improve outcomes (OECD, 2013[1]).

Review and align national learning standards

Currently, North Macedonia’s learning standards reflect different educational principles because they were conceptualised independently of one another. For example, the Cambridge mathematics curriculum’s standards for grade 9 are concerned with competencies that involve mastery of several skills, such as posing research questions using statistical methods. The mathematics standards for the first year of secondary school (grade 10), which are not based on the Cambridge curriculum, are instead more focused on performing discrete tasks, such as calculating a mean. Table 2.2 illustrates some of these differences.

Table 2.2. Standards from the Cambridge curriculum and other national curricula

Cambridge curriculum for mathematics (grade 9 and below)

Programme area: Data processing

  • Propose a research question using statistical methods

  • Identify primary or secondary sources for appropriate data

  • Perform statistical calculations and select statistics that are most related to the problem

  • Interpret tables, graphs and diagrams and make conclusions to support or reject initial assumptions

Learning standards and curriculum for mathematics (other national curriculum) (grade 10 and above)

Programme area: Working with data

  • Calculate the arithmetic mean

  • Determine the mode and the median

  • Assess whether the sample is representative

  • Organise data and present them graphically

Source: (Education, n.d.[15]) Standards for high school education, http://bro.gov.mk/?q=gimnazisko-obrazovanie-standardi (accessed on 6 January 2019).

As North Macedonia moves towards introducing a more competency-based curriculum in the upper secondary grades, the learning standards across grades should be made coherent with each other. In particular, standards should focus on the same competencies so students can scaffold learning to develop increasingly complex, higher-order competencies, like effective communication and problem solving. This will also provide clarity for students and teachers in terms of learning expectations.

Prioritise learning standards for reading and writing in grades 1-3

The government is currently introducing a new curriculum in grades 1-3 and developing standards for primary school. It is particularly important that the learning standards in these grades are aligned with later standards, so that students develop the essential skills and knowledge that will enable them to master more complex content in later grades. The Cambridge curriculum currently in place in these grades has learning standards in mathematics and science, providing a useful example that North Macedonia’s curricula can build on. However, other subjects, notably Macedonian and other mother tongue languages, do not have learning standards.

Priority should be given to standards in reading and writing as a means to improve teaching and learning in these crucial early years. Most OECD countries with standards have emphasised their development in the core subjects of reading and writing, as well as mathematics. In North Macedonia, high quality standards would improve the reliability of the descriptive feedback that teachers provide to students and establish commonly understood expectations for learning. Students can see how they are progressing and gain confidence in themselves, helping to develop the type of metacognitive awareness that provides the foundation for future learning.

Given North Macedonia’s experience with the Early Grade Reading Assessment (EGRA), the expected outcomes from this instrument can inform national learning standards in grades 1 through 3. As international and national results are available for EGRA, these data can be used to create standards that are both ambitious and realistic.

Introduce performance levels that set out how far students have achieved learning standards

Creating performance levels would help teachers in North Macedonia better understand the abilities of their students and adapt their instruction to students’ different levels of competence. For example, if asked to solve a problem, a student might be able to solve some of the problem but not all of it, or demonstrate a correct approach but ultimately arrive at an incorrect answer. Teachers who can create assessments that detect this type of nuance can then adapt their instruction to these identified differences. Research shows that successfully differentiating students’ education in this manner can lead to improved student outcomes (Dumont et al., 2010[16]).

Performance levels can be used to assess how much progress a student has made, not just whether the student can or cannot perform a particular task. Box 2.4 shows an example of a learning standard that accommodates a range of student performance. North Macedonia will need to determine how the levels should be organised to reflect what is most appropriate at the national level. This is a general issue, but it is especially important in grades 1-3, where there is no standardised description of student achievement. While the move to end numeric grading in these grades is positive, reflecting trends in most OECD countries, it remains important that students, parents, teachers and schools know how far students are meeting the expectations for their age and grade. Other countries that have introduced a similar approach frequently use three or four levels, corresponding to student work that is well below, below, meeting or above national learning expectations. In North Macedonia, this kind of standard description of achievement can be used across all grades: in the absence of numeric marks in the lower grades, and alongside numeric marks later on.

Box 2.4. “Working mathematically” learning standard from New South Wales, Australia

New South Wales, Australia, divides its curriculum into different domain areas, which are further categorised into competencies. Each competence is associated with a content standard, which is disaggregated into three levels according to grades. The following example comes from the “working mathematically” competence of the mathematics domain.

Standard: Develop understanding and fluency in mathematics through inquiry, exploring and connecting mathematical concepts, choosing and applying problem-solving skills and mathematical techniques, communication and reasoning.

Level 1

  • Grades 1 and 2: Describes mathematical situations using everyday and some mathematical language, actions, materials, diagrams and symbols.

  • Grades 3 and 4: Uses appropriate terminology to describe, and symbols to represent, mathematical ideas.

  • Grades 5 and 6: Describes and represents mathematical situations in a variety of ways using mathematical terminology and some conventions.

Level 2

  • Grades 1 and 2: Uses objects, diagrams and technology to explore mathematical problems.

  • Grades 3 and 4: Selects and uses appropriate mental or written strategies, or technology, to solve problems.

  • Grades 5 and 6: Selects and applies appropriate problem-solving strategies, including the use of digital technologies, in undertaking investigations.

Level 3

  • Grades 1 and 2: Supports conclusions by explaining or demonstrating how answers were obtained.

  • Grades 3 and 4: Checks the accuracy of a statement and explains the reasoning used.

  • Grades 5 and 6: Gives a valid reason for supporting one possible solution over another.

Source: (New South Wales Education Standards Authority, 2018[17]), Mathematics K-10, http://syllabus.nesa.nsw.edu.au/mathematics/mathematics-k10/outcomes/ (accessed on 20 January 2018).

It is important that performance levels be determined independently of a student’s grade (though advancement through grades should be associated with achieving minimum standards). A student can be in grade 8 but demonstrate a lower level of performance in a particular competence than a student in grade 6. This arrangement is more constructive because it allows teachers to properly identify a student’s current level of competence, particularly if a student is advanced or struggling, and adapt instruction for that student. To better understand the relationship between performance levels and grades, it is helpful to think of levels of performance as spanning multiple grades. This is illustrated in Figure 2.3, which shows curriculum levels from New Zealand. According to this model, students in different grades may be placed in the same performance level, while students in the same grade can be at different performance levels.

Figure 2.3. Years (grades) and curriculum (performance) levels from the New Zealand curriculum

Source: (New Zealand Ministry of Education, 2007[18]), The New Zealand Curriculum, http://nzcurriculum.tki.org.nz/content/download/1108/11989/file/The-New-Zealand-Curriculum.pdf.

Align student assessment with national learning standards

Developing national learning standards is an important first step, but North Macedonia also needs to ensure that students are assessed according to the standards. Otherwise, teachers will not know if students have learnt what was intended and some students will be left behind.

International experience shows that aligning assessment with learning standards is challenging and that investing in supporting materials is necessary in order for standards to act as a central point of reference for classroom and centralised assessments (Shepard, 2001[19]). Teachers in particular require considerable support to accurately assess students vis-à-vis their expected outcomes, especially under a competence-based curriculum like North Macedonia’s, where outcomes are framed as complex constructs that integrate both knowledge and skills.

Support teachers in developing assessments that are aligned with learning standards

Teachers require a range of resources and support to develop classroom assessments that are aligned with the national learning standards. These include:

  • Materials that clearly explain the criteria underlying the different learning standards and their performance levels, and provide a rubric for assessing students against the criteria. The materials should also illustrate how the rubric classifies different examples of student work, so that teachers are able to apply the rubric to the work that they mark (OECD, 2013[1]). Examples of marked student work could also be included so teachers can see and understand how to provide valuable feedback through their marks.

  • Resources that can help teachers create assessments to evaluate students against the standards. These might be examples of questions, activities, projects, investigations, quizzes and tests that are accompanied by the standard that they assess and how they do so (Shewbridge et al., 2011[20]).

    Previously, the BDE included such examples when it developed training materials to increase teachers’ use of formative assessment. Giving teachers access to an online repository of materials is a more efficient way of placing resources in their hands, as repositories can be updated and expanded with minimal resource investment. Moreover, repositories can be “crowdsourced”, meaning teachers themselves can contribute to their growth by uploading materials that they have created for other teachers to use. In Moscow (Russian Federation), online repositories have a review feature through which teachers can rate each other’s lesson plans, so the most useful ones can be identified and more easily accessed (see Chapter 3 for further description of resource sharing between teachers).

  • Support for teachers’ peer-to-peer collaboration, so they can directly assist each other in creating assessments. Research into educational change has noted that some of the most effective catalysts for implementing reforms can be peer-to-peer relationships between schools (Higham, Hopkins and Matthews, 2009[21]; Fullan, 2004[22]). In North Macedonia, many teachers have already formed informal associations to facilitate collaboration. Chapter 3 discusses how informal teacher groups can be formalised and supported to provide more professional development in assessment and across other areas.

It is important that these initiatives not be viewed as ad hoc projects, but as permanent resources that teachers use, adapt and develop further. As part of the BDE’s more formalised role in teacher support and development that this review recommends, it can be tasked with developing these resources and the online platform through which teachers access them. It will be critical to ensure that assessment resources are clearly mapped to the learning standards, to guide teachers in selecting the most appropriate assessments.

In order to fully embed these efforts, they should be linked to teacher appraisal and school evaluation processes. As part of teacher appraisal, teachers’ assessments, for example, can be reviewed internally and externally to ensure that they are aligned with national standards. Integral school evaluations can also review to what extent schools are encouraging their teachers to collaborate with each other and with teachers in other schools (see Chapters 3 and 4). By creating these continuous monitoring mechanisms, North Macedonia can ensure that the support provided to teachers is used and that their assessment practices continuously improve as a result.

Connect classroom assessments with the national assessment

The draft Law for Primary Education provides the legal basis for the development of a new national assessment. Chapter 5 of this report discusses specific decisions for the national assessment, including its alignment with national learning standards.

With national and school-level support, teachers will be able to use the national assessment to improve their own assessment literacy. For example, they can use national assessment items as inspiration for their own assessments and compare their students’ work and marks with results on the national assessment, which would help achieve more accurate and reliable classroom assessment.

One way to encourage teachers to use the national assessment as a resource is to involve them directly in its development. In several OECD countries, including Canada, New Zealand and Norway, teachers are responsible for developing national assessment items and for marking student answers (OECD, 2013[1]). Involving teachers in this way (in North Macedonia, teachers already help mark the state matura) not only gives them a feeling of ownership over the assessment, but also makes them think critically about how the assessment items are created and how student marks are related to national learning standards. Teachers can then bring that experience with them to their classrooms and be better equipped to align their own assessment and marking practices with the national standards.

Enhance the accuracy and educational value of marking and reporting

A final issue that is currently making it difficult for teachers to use marking to meaningfully convey student learning in North Macedonia is the compression of the national marking scale. Inherently, a scale of one to five does not allow for very fine-grained judgement. Exacerbating this problem is the grade inflation that occurs in North Macedonia, linked to strong societal expectations for high marks. As a result, student marks gravitate towards four and five. This means that the marks contain little meaning with respect to what students can do, which prevents teachers from using the marks to help students understand their strengths and weaknesses, and take steps to improve their learning. Consistent inflation of marks can also mislead students, as they might believe that they have mastered a domain and be less motivated to further their learning in that area.

Extend the marking scale and link it with the national learning standards

To make marking more meaningful, the review team recommends that North Macedonia extend its national marking scale. Grading schemes vary across countries, but most feature more than five potential marks (Eurydice, n.d.[23]). Examples include A through F (allowing for + and - modifiers, such as B- or C+), 1 through 10, and 1 through 100. Several former Soviet states, including Latvia, have also changed the range of possible classroom grades from 1-5 to 1-10, where a grade of 10 represents “with distinction,” 5 represents “satisfactory” and 1 represents “extremely weak.” Other countries that have adopted a similar approach include Armenia and Belarus (Semyonov et al., 2017[24]). Having more available marks gives teachers more flexibility in how they report student results and relieves some of the pressure they might currently feel with so few marks to choose from. This review recommends that North Macedonia consider moving to a 10-point marking scale, as it is close to the existing scale and will enable the country to draw on the experience of other countries in the region that have implemented a similar change in recent years.

Once the national marking scale has been extended, it will have to be linked to the national learning standards in the materials mentioned in Recommendation 2.1.2. Teachers will need to have a shared understanding, grounded in the national standards, of what type of student performance is considered to meet minimum proficiency and how it can be differentiated from performance that does not.

To support this shared understanding, the new marking scale should be linked to levels of performance within each standard. For example, a numeric mark of 1-3 might indicate that a student was well below expected national standards and equate to a level 1 on the new performance levels, while a mark of 7-10 would indicate that a student was working above national expectations and equate to a higher level in the new performance levels. In grades 1-3 where descriptive grading is used, teachers would report where a student was in terms of meeting national expectations (e.g. well below, below, meeting and above) without providing the numerical grades. Many countries across the OECD and beyond have introduced a similar approach to assessing levels of student learning alongside national learning standards (see Box 2.5).
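As a minimal sketch of the mapping just described: the banding below follows the illustrative bands given above (marks 1-3 equating to level 1, marks 7-10 to the top level), while the middle band (4-6) and the level labels are assumptions for illustration, not an official scale.

```python
def performance_level(mark):
    """Map a mark on a hypothetical extended 1-10 scale onto descriptive
    performance levels. The 1-3 and 7-10 bands follow the example in the
    text; the middle band (4-6) and the labels are illustrative assumptions.
    """
    if not 1 <= mark <= 10:
        raise ValueError("mark must be between 1 and 10")
    if mark <= 3:
        return (1, "well below national expectations")
    if mark <= 6:
        return (2, "approaching or meeting national expectations")
    return (3, "above national expectations")
```

In grades 1-3, only the descriptive label would be reported to students and parents; in later grades, it would accompany the numeric mark.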

Box 2.5. Reporting scales in Ontario, Canada

In Ontario, Canada, student achievement against provincial curriculum expectations in each subject or course is reported using a six-point letter grade scale in grades 1 to 6 (see example below), and six-point numeric scales in grades 7 to 8 and grades 9 to 12. Each point on the achievement scale is accompanied by a descriptor and aligns with a provincial standard level, which is the reporting scale used for province-wide student assessments. This information is included in student report cards to help parents and students understand students’ results.

Letter grade and achievement of the provincial curriculum expectations:

  • A- to A+: The student has demonstrated the required knowledge and skills with a high degree of effectiveness. Achievement surpasses the provincial standard (Level 4).

  • B- to B+: The student has demonstrated the required knowledge and skills with considerable effectiveness. Achievement meets the provincial standard (Level 3).

  • C- to C+: The student has demonstrated the required knowledge and skills with some effectiveness. Achievement approaches the provincial standard (Level 2).

  • D- to D+: The student has demonstrated the required knowledge and skills with limited effectiveness. Achievement falls much below the provincial standard (Level 1).

  • R: The student has not demonstrated the required knowledge and skills. Extensive remediation is required.

  • I: Insufficient evidence to assign a letter grade.

A four-point rating scale is also used to report on students’ learning skills and work habits: E-excellent; G-good; S-satisfactory; and N-needs improvement.

Sources: (Rushowy, 2017[25]), Report card, curriculum changes on the way in Ontario, Toronto Star, https://www.thestar.com/news/queenspark/2017/09/06/report-card-curriculum-changes-on-the-way-in-ontario.html, (London Region MISA PNC, 2011[26]), Comment Framework: Progress Reports and Report Cards, http://www.misalondon.ca/PDF/a&e/Comment_Framework_Feb_2011.pdf; (Ontario Ministry of Education, 2010[27]), Growing success: assessment, evaluation and reporting in Ontario schools, Ontario Ministry of Education, Toronto.

Support teachers to use the new marking scale through consistent moderation

Moderation will be critical if teachers are to achieve a shared understanding of student performance vis-à-vis the new marking scale. Moderation refers to procedures that ensure the quality and comparability of assessment judgements. Examples of moderation include teachers marking each other’s assessments, discussing in groups how to give marks or teachers’ marks being checked by an external organisation (OECD, 2013[1]). These procedures are particularly important in North Macedonia in order to help teachers address potential bias in their marking and support them in withstanding external pressure to deliver high marks.

As the developer of learning standards, the BDE would be well positioned to help teachers start to moderate their own work. The BDE could invite selected teacher representatives from schools to come to sessions where BDE staff and teachers would mark common examples of student work using the new scale and then discuss the different marks to encourage a shared understanding among teachers. During these sessions, the BDE would also explain the purpose behind the new marking scale and how to engage with students and parents who might be confused by it and/or demand the highest marks for their children.

Schools can also support teachers to implement the new marking scheme by protecting teachers from external pressures to inflate marks. Schools should be encouraged to adopt an assessment policy that makes it clear that when teachers provide an assessment mark, it is based on evidence collected from multiple sources and professional judgement – it is not open to negotiation. Schools may also prohibit parents from meeting with teachers in the weeks immediately preceding the time when report cards are sent home and make it clear that report card grades cannot be changed once awarded. In neighbouring Serbia, a country where parental pressure for high marks is also considerable, schools have introduced similar policies.

Emphasise that marks are to be used for monitoring student learning, not for ranking

In North Macedonia, it was reported to the review team that classroom grades are used to classify and rank students, sometimes even based on non-academic criteria such as behaviour and attitude. Classroom assessment marks are most effective when they are used to help teachers and students monitor student learning. By focusing on marks as a tool for judgement, teachers and students miss the opportunity to gain more information about where a student is in his/her learning and how that student’s learning can be improved.

When the new marking scale is introduced, national guidance should emphasise the importance of using grades for formative purposes and not just for summative ranking. A particular focus should be given to ensuring that marks be used to identify struggling students so they can be supported to reach minimum national learning expectations. This approach should be reinforced by teacher appraisal and school evaluations, where teachers and the school would be expected to demonstrate how teaching and learning is organised to help all students to make good progress (see Chapters 3 and 4).

Consider introducing a project assessment at the end of lower secondary to inform students’ choice of upper secondary programme

The absence of a high-stakes assessment at the end of lower secondary can be considered a positive aspect of schooling in North Macedonia, as it prevents the assessment from creating a negative backwash in lower secondary classrooms. Nevertheless, students might find it useful to have more information about their learning at this stage to help them decide what type of upper secondary institution to enter. The country might consider how classroom assessment in lower secondary might be used more effectively to inform student choice of high school programme, motivate all students to apply themselves and reinforce more rigorous standards, especially in core subjects.

At present, a student is oriented towards high school programmes based on his/her average marks from all subjects. While the above recommendations will create a more reliable record of achievement, it is also important that students in these transitional years are given the opportunity to explore their interests and that this is recognised as part of formal reporting procedures. For example, in Ireland students at the end of lower secondary education complete two classroom-based assessments in each subject, which often take the form of individual or group project work over three to four weeks (NCCA, 2016[28]; NCCA, 2019[29]; NCCA, 2018[30]). Such assignments not only recognise students who might do less well in narrower academic tasks, but also motivate them to pursue their strengths. North Macedonia already includes a project as a core component of the matura. Including a similar component in lower secondary education might provide students with greater opportunity to explore areas in which they are particularly interested and talented, and guide their future high school choice.

Focusing assessment practices on helping students learn

The purpose of assessment is to provide information that can be used to improve student learning. In North Macedonia, achieving this purpose is difficult because of an intensive focus on summative marks and results. Instead of viewing assessment as an integral contributor to learning, students and teachers tend to view assessment only as a judgement of achievement. Educators do not tend to use assessment results to help students better understand their current proficiency and determine how they can further develop their knowledge and skills. This leads to many students moving from grade to grade without meeting expectations for their level.

Using assessment in more formative ways – to guide future learning – is one important way to address the above situation. The Ministry of Education and Science (MoES) has made some efforts to promote formative assessment in North Macedonia. For example, the BDE has delivered formative assessment training based on the primary school assessment and formative assessment manuals that it has developed. However, embedding formative assessment in classrooms is challenging. The experience of OECD countries shows that it requires significant and consistent support for teachers, such as resources related to formative assessment and incentives that encourage its use (OECD, 2013[1]). This review recommends the use of diagnostic assessments as an effective way to anchor more formative methods in the classroom. It also highlights some of the barriers that have prevented progress in this area, and how they can be overcome.

Promote the use of diagnostic assessments, especially in early grades

A diagnostic assessment is a type of formative assessment that is administered at the beginning of a study unit in order to determine a student’s level and to develop a suitable learning programme for that student (OECD, n.d.[31]). Implementing diagnostic assessments would help teachers in North Macedonia better understand how far their students are meeting national expectations and what skills and knowledge they still need to develop. This kind of information is particularly important in North Macedonia because data from international assessments show that, as students move through school, major gaps in their learning are not addressed, contributing to very low levels of mastery of basic competencies in the final years of schooling. Diagnostic assessments, particularly when administered in the early grades, help teachers identify learning needs when students are young, thus reducing the need for resource-intensive remediation measures when students are older.

Use EGRA and EGMA as diagnostic assessments for young students

Early diagnostic assessments have been administered in North Macedonia, but not systematically. In 2016, the Step-by-Step foundation in North Macedonia conducted the Early Grade Reading Assessment (EGRA) and Early Grade Mathematics Assessment (EGMA) with children in grades 2 and 3. The results show that students in North Macedonia struggle with essential skills such as oral reading fluency, reading comprehension and subtraction (see Chapter 1). Results also show gaps in performance between Macedonian, Albanian and students from other ethnic groups, and between students from urban and rural areas (Step by Step, 2016[32]).

As EGRA and EGMA have already been adapted to the Macedonian context and have produced baseline data, it would be simple and cost effective to adopt them as key classroom resources that all teachers are expected to use in the early grades. Teachers should be required to administer these tests at the beginning of grades 1-3 and encouraged to administer them on an ad hoc basis as they see fit. Using centrally developed resources for this purpose is advantageous because the instruments have already been piloted and deemed fit for use, which is important in a context where diagnostic assessments are relatively new. However, over time, once teachers are comfortable with the concept of diagnostic assessment, they should be encouraged to develop their own assessments, based upon the national learning standards. Teacher-created assessments would be more sensitive to their specific classroom contexts, such as individual learner needs and cultural references, which would provide more accurate diagnosis of student learning.

At the time of the review team’s visit, there was a proposal to reduce class hours in primary school. Given the EGRA and EGMA results of students in lower levels, and the fact that students of North Macedonia already receive fewer instructional hours than their international peers (see Chapter 1), further reducing their hours in class might not be advisable. Students would then have fewer hours for learning and it would be more difficult to introduce certain activities that take more time to organise and execute, such as reading in small groups. Data collected from diagnostic assessments can be used to better inform this decision.

Communicate that the purpose of diagnostic assessment is to support students and not classify them

The value of diagnostic assessments is that the results can be used to identify student progress and tailor subsequent instruction. To ensure that this purpose is well understood, it will be important to accompany the assessments with guidance for teachers on how best to use the results. For example, assessment manuals should explain to teachers what students who receive a certain assessment mark can do and what those students should learn next.

Diagnostic assessments should explicitly not be used to classify students for services such as special education. In North Macedonia, it will be important to communicate this distinction because there is a historic tendency to interpret a struggling student as having special learning needs. Therefore, system inertia might compel some teachers to view students who perform poorly on diagnostic assessments as in need of special education as opposed to simply having had less exposure to reading and math in their homes and in need of extra help (Bialik and Fadel, 2017[33]).

To this end, guidance introduced with the assessments might also provide teachers with suggestions of how to support students who do not meet these expectations. For example, materials can discuss how to teach a class of students with different proficiency levels. Teachers might also receive suggestions about additional learning opportunities that can be provided for students who are struggling the most.

Provide and record high-quality feedback to support learning

Providing and recording easy-to-understand feedback is a critical component of using diagnostic assessments and of effective assessment practice in general. In North Macedonia, however, teachers do not habitually provide descriptive feedback to students. Almost one-third of secondary school teachers surveyed for this review reported that they either “never or almost never” or only occasionally provided written feedback to students (in addition to a grade). Creating the expectation that teachers systematically provide descriptive written feedback, which they, students and parents then refer to, will help embed formative feedback in North Macedonia.

Update reporting structures to reflect the new marking scale

North Macedonia’s national student report card does not provide space for much descriptive reporting. It lists a student’s subjects, his/her numeric mark in each subject and a single-word description of that numeric mark. The description does not explain the student’s strengths or weaknesses so much as it acts as a non-numeric equivalent of the student’s mark (e.g. 1 is insufficient, 2 is sufficient, etc.).

This national report card template will need to be updated to include a comprehensive explanation of the learning targets that students are expected to achieve and what is necessary to receive each of the 10 marks (OECD, 2013[1]). Parents and students will need to be notified of the changes to the marking scale and what the new marks mean. In addition to presenting the student’s numeric marks, the report card should provide space for descriptive feedback (more than a single word) that explains the different aspects of the student’s performance, according to the national standards, that led to his/her specific mark. The feedback should be specific to the student and not a pre-written description given to any student who receives a particular mark. With clear feedback, students and parents will better understand what a student’s strengths and weaknesses are and what needs to be done to improve the student’s learning.

Providing feedback to parents

In order for parents to support their children’s learning, they need quality information about their children’s level of competence and what the priorities for further learning are. With this type of information, parents can better understand their children’s needs and discuss progress with their children and their teachers (Absolum et al., 2009[4]). However, research shows that in many countries, including across the OECD, parents often believe they do not receive enough information about their children’s progress from their schools (OECD, 2013[1]).

According to the OECD’s survey conducted as part of this review, teachers in North Macedonia have frequent contact with their students’ parents. However, formative feedback is not always provided to students and parents, and a student’s report card does not provide much descriptive information about student learning, especially after grade 6, when providing this information is no longer mandatory. Teachers also told the review team that parents tend to be more concerned with the numeric marks that their children receive than with their actual learning.

To improve the quality of feedback provided to students and parents, teachers might structure their contact with parents around key milestones during the year (Shepard, 2001[19]). For example, according to the same OECD survey, the vast majority of teachers in North Macedonia rely on student portfolios as part of their assessment practice. After students receive feedback on their portfolios and share it with their parents, a follow-up parent-teacher meeting can be scheduled (OECD, 2013[1]). These procedures would then naturally guide conversations with parents around student strengths and areas to focus on, illustrated by examples of student work.

Recording descriptive feedback

To help ensure that feedback is fully utilised, it should be recorded electronically. Teachers, parents and students can then access the feedback of a student’s previous teachers, even if those teachers are from different schools and municipalities. This continuous documentation would help ensure that teachers can build upon previous individualised instructional efforts. For example, the Education Management Information System (EMIS) for the state of Maryland in the United States contains a section called “teacher comments”, in which a teacher records his/her descriptive feedback for students, separate from the student’s summative marks. This information can be viewed by the student and the student’s parents and is permanently stored in EMIS, meaning it follows the student should he/she change schools (Abdul-Hamid, Mintz and Saraogi, 2017[34]).

North Macedonia’s EMIS does not hold the descriptive feedback a student receives. Even in grades 1-6, where providing descriptive feedback is mandatory, this information is only recorded physically but not electronically. To maximise the utility of descriptive feedback and further encourage teachers to give it, the MoES should develop EMIS to hold descriptive information and require teachers to record it (see Chapter 5).

Remove barriers to providing formative assessment

North Macedonia has invested in promoting the use of formative assessment in classrooms, which has led to greater awareness of its importance. However, one reason why formative assessment has not become more strongly embedded is the presence of systemic barriers, such as a dense and rigid curriculum, that prevent teachers from using the formative assessment methods that they have learnt. Removing these barriers and strengthening support systems around formative assessment would reinvigorate its use, contributing to better-informed instruction and improved student achievement.

Remove rigid time expectations in the curriculum

Using formative assessment techniques requires that teachers have some flexibility over how they allocate class time. If assessments reveal that most students are not meeting learning expectations, for example, it is the teacher’s responsibility to tailor his/her instruction to meet student needs instead of introducing concepts that students are not yet ready to learn (OECD, n.d.[31]; Pritchett and Beatty, 2012[35]).

North Macedonia’s curriculum is very dense, which creates a rigid structure. In grade 8, students are required to take 15 subjects. Given that students in North Macedonia already receive among the least instructional time of all PISA-participating countries, teachers are obliged to cover a large amount of material in very little time. This leaves little room for flexibility if teachers wish to slow down to help students who are struggling. Indeed, data collected from principals as part of PISA show that schools in North Macedonia have comparatively little flexibility with respect to deciding how instructional time should be allocated (see Chapter 1) (OECD, 2016[11]). In contrast, in countries where teachers have more flexibility over the curriculum, schools themselves decide how many hours to allocate to each subject in each grade in order to meet a minimum number of hours across several grades (Eurydice, 2018[36]).

If teachers are to use assessment for more formative purposes, they must be able to adapt the curriculum to the needs of their students (OECD, 2013[1]; OECD, 2018[37]). As MoES is already in the process of changing the curriculum in grades 1-3, greater flexibility could be built into the new curriculum by allowing teachers to have more autonomy over how they proceed through the curriculum. The review team was told that schools might be allocated a certain amount of time in their timetable that they can use as they see fit. This could be systematically implemented to lend flexibility to the curriculum, and schools and teachers should be encouraged to use it to this end. Evaluating the effects of greater teacher flexibility within the new grades 1-3 curriculum could then inform a decision about how other curricula can be made more flexible.

Allow for greater flexibility in the teaching of the curriculum in the integral evaluation process

In addition to the curriculum itself being rigid, school external evaluation (called “integral evaluation” in North Macedonia) reviews how closely schools adhere to the curriculum and, in turn, discourages schools from adapting the curriculum to their specific context. For example, one of the school quality indicators refers to whether the curriculum is being implemented according to the ministry’s prescriptions. School officials reported to the review team that they felt pressure to follow the curriculum precisely or they would be penalised through the integral evaluation process. Without an external expectation that they exercise flexibility, teachers will find it difficult to act on results from formative assessment when doing so requires altering teaching plans.

Instead of inspecting whether schools are strictly following the curriculum, integral evaluation could be modified to focus on the extent to which schools are supporting all students to achieve national learning expectations. This might be indicated by matura results and, in the future, national assessment results, as well as the quality of instruction in general. This would ensure that teachers are following the national curriculum, but doing so in an intelligent, flexible way, in order to meet their students’ learning needs. Such a change would encourage schools and teachers to exercise flexibility over how the curriculum is used, which would create space for formative assessment and tailored instruction. Chapter 4 provides more detail on how North Macedonia can reform the school inspection process to better support quality teaching and learning.

Strengthen support in schools for implementing formative assessment

Some efforts have been made in the past to train teachers in using formative assessment techniques, but frequent turnover in key leadership positions means that momentum has been lost. Once the curriculum and integral evaluation give teachers the autonomy needed to conduct formative assessment, they will need to be supported to ensure that they are motivated to use it (OECD, 2013[1]; Fullan and Miles, 1992[38]).

International research shows that the kinds of learning opportunities that are most effective at improving teaching competence are job-embedded, collaborative and sustained over time. School-based professional development opportunities like group discussions about teaching activities, joint preparation of instructional material, classroom observations, and coaching offer these kinds of opportunities (Darling-Hammond and Rothman, 2011[39]). These activities allow teachers to learn and practise over an extended period of time, in a context closely connected to their daily work and to the challenges that they face in introducing new assessment approaches.

However, school-based professional development opportunities require impetus and support to thrive. North Macedonia already has a tradition of school-based teacher groups – the “Teacher Actives”. With support from the BDE, as recommended in Chapter 3, these groups might be encouraged to focus on the practical issues that will help teachers introduce more formative assessment into their classrooms. School leaders can help to create a school environment where formative assessment is encouraged, for example, by clearly identifying it as a valued activity in the school plan and in teacher development plans. Accompanying activities might include allocating more time for teachers to collaborate and discuss how they are using formative assessment and the challenges they are encountering (Kitchen et al., 2017[40]).

Updating the state matura to encourage and assess better student learning in key areas

The state matura is one of the strengths of North Macedonia’s assessment system. Its administrative procedures are sound and the results from the examination are trusted. When it was created, it was modern and progressive. Ten years on, it is time to review the matura model, to build on its strengths and address emerging challenges. Reviewing the matura also provides the opportunity to ensure that it is adapted to system changes that have occurred, in particular the country’s desire to improve the quality of upper secondary VET.

Revise the matura design to provide more reliable results in key subjects

When the state matura was originally devised, it was hoped that it would certify that students had met basic minimum requirements for graduation from upper secondary school, and that the results would help university faculties select students for further education. In particular, the design aimed to encourage good coverage of core subjects – like mother tongue language and mathematics. However, in practice very few upper secondary school students now take the mathematics electives (roughly 13% of registered candidates selected mathematics in 2017). This situation makes it difficult to determine whether students have basic competencies in key areas, and provides university faculties with limited information on which to base selection decisions.

Figure 2.4. Differences between the current design of the state matura and its recommended design

Source: Adapted from (MoES, 2018[13]), The Republic of North Macedonia - Country Background Report, Ministry of Education and Science, Skopje, and information provided during the OECD review mission.

Make mathematics a compulsory subject

Currently, the only compulsory subject in the matura is mother tongue language. The rest of the subjects are electives, one of which is marked internally. Students thus strategically select externally marked subjects in which they are likely to obtain the highest marks, while reserving those that they find most challenging for internal marking. Consequently, very few students choose mathematics as an externally marked subject. This situation makes it difficult for mathematics and science faculties to select the most qualified students and deprives the labour market of valuable skills. Most importantly, it leaves the majority of students with no firm guarantee that they have reached basic standards in a critical domain.

Internationally, mathematics is considered, alongside reading and writing, to be one of the core competencies that students should acquire at school. Not only are these competencies essential for life and work, but they also provide the foundations for other domains such as the humanities and sciences. For this reason, many OECD countries assess mathematics externally as a compulsory subject in national examinations at either the lower or upper secondary level (OECD, 2015[6]; Ofqual, 2012[41]).

This OECD review recommends that mathematics be made a compulsory subject on the state matura. This would result in two compulsory subjects ‒ mother tongue and mathematics ‒ and two additional elective subjects. This would motivate all students in North Macedonia to ensure that they master at least basic mathematics. The results from the matura would help teachers to better orient their instruction. Universities would be able to make a more informed selection of students into mathematics and other faculties where mathematics is important.

Create two versions of the mathematics exam, at basic and higher levels

A common consideration when assessing mathematics centrally is whether to use assessment time to evaluate the breadth of a student’s understanding across several mathematics concepts or the depth of his/her understanding in a few concepts (Ofqual, 2012[41]). One method that several countries, such as Ireland, the Netherlands and Norway, use to balance these needs is to administer the mathematics examination as two tests, with one assessing more basic concepts and the other assessing more advanced topics. Depending upon a student’s interests, he/she can decide to take either the basic or advanced mathematics test in order to fulfil the mathematics requirement of the national examination.

North Macedonia has implemented a similar approach in the past, with an advanced-level mathematics option, but it was not popular among students and was eventually eliminated. However, if mathematics is to be compulsory for all students, the examination content will need to be accessible across the full ability range. Alongside considering the OECD’s proposal to make mathematics compulsory, the ministry and the NEC should consider introducing a two-level examination, at basic and advanced levels. Each examination will need to be carefully designed with its intended audience and purpose in mind. Box 2.6 provides examples of how other countries set examinations at different levels.

Box 2.6. Setting examinations at different levels in Ireland

In Ireland, the Leaving Certificate Examinations, the final examinations taken at the end of the secondary school system, are available at two levels – ordinary and higher – in a variety of subjects, including English language, natural sciences, humanities and the arts. The examinations for Irish language and mathematics are also available at a foundation level. Students can take a combination of higher-level and ordinary-level examinations.

In order to certify school completion, students must pass examinations at any level in five subjects. Students who meet this criterion are also able to access post-secondary non-tertiary courses that usually last one year and, in many cases, provide access to higher education institutions.

Source: (Department of Education and Skills, 2018[42]), The Education System, Ireland, https://www.education.ie/en/The-Education-System/ (accessed on 25 March 2018).

Consider extending further the core subjects that are assessed by the state matura

In the future, North Macedonia should also review the relationship between the subjects that upper secondary students are required to study and those in which they are assessed by the matura. At present, students in upper secondary study a significant number of subjects (15). This is higher than in many OECD countries, where Denmark is at the upper end with 13 subjects (Ofqual, 2012[43]). In contrast to the range of subjects studied in North Macedonia, the matura only assesses students in four subjects. This structure is uncommon internationally – in countries where students are required to study many subjects, they tend to be examined in a broad range of compulsory subjects too. For example, in France students study 9 to 12 subjects in upper secondary and are assessed in 9 compulsory subjects by the Baccalauréat. In the Netherlands, students study 9 or 10 subjects, with 9 compulsory subjects assessed by the examination in vwo schools (Ofqual, 2012[43]).

In North Macedonia, the mismatch between the number and range of timetabled subjects and the comparatively limited number of externally examined subjects leaves students with little recognition and no certification for two-thirds of the subjects that they study. (While classroom assessment marks from all subjects contribute to the overall matura score, these marks tend to be inflated, diminishing their recognition and certification functions.)

In the future, North Macedonia should consider addressing this mismatch. Issues to consider include the depth versus breadth of knowledge and skills that North Macedonia wishes its students to achieve in upper secondary. No single approach works best. Some OECD countries, such as England, opt for fewer subjects to enable greater depth and coverage of content within individual subjects. In contrast, others, such as France, favour breadth across different domains. Most countries, however, require that students study a mother tongue language, mathematics, a social science, a science and a foreign language until the end of upper secondary (Ofqual, 2012[43]). If North Macedonia decided to increase the number of subjects that are assessed externally, one option might be to introduce a general humanities and/or general sciences examination as part of the compulsory core of subjects.

Mark all subjects externally

At present, one elective subject from the state matura is marked internally at the school level by markers from the same schools, who also develop the test themselves. Although schools receive guidance from the NEC about how to develop and mark student tests, this method of marking ultimately risks producing results that are neither reliable nor comparable across different schools. It also encourages students to choose their weakest subject as this elective: students know that if demand for their chosen tertiary programme is high, higher education institutions will determine selection based on marks from externally examined subjects and discount results in the internally marked subject.

This review recommends that all matura subjects be marked externally. Having external results would increase the value of the subjects previously taken internally, and improve the overall reliability of the matura results. It would also improve the meaningfulness of student results since they would be given a percentile rank according to the entire pool of test takers in that subject, as opposed to only those who elected to take the test externally. VET students, who must take an internally marked VET subject as an elective, would be exempt from this regulation except in specific VET areas (see Recommendation 2.3.3).

Standardise the project assignment

Including a project assignment in the state matura was an innovative development that demonstrates the progressiveness of the examination's design. The intent of the project assignment is to add an authentic assessment component to students' certification of upper secondary school. According to assessment theory, the project assignment should require students to use skills they have learnt throughout upper secondary school in a practical and authentic manner, thus providing educational value in addition to acting as a certification instrument.

At present, however, the project assignment has little educational value because there is little consistency in how projects are conceptualised across schools. The NEC does not systematically review or moderate the projects. However, when the NEC recently reviewed a selection of project assignments, it found that many did not reflect the intent of the requirement. Despite guidelines, the subject matter of project assignments was very broad (e.g. ranging from biology to ethnic tension to mechanics), as was the format of the project (e.g. reports, speeches or a conversation with the teacher). With such variety, it is difficult to ensure that the quality of all project assignments meets the same minimum standards and that all students improve their learning by completing a project assignment.

The educational value of project assignments could be improved by standardising their composition. Currently, students decide upon their project assignments individually with their teachers. This process could be made more uniform by limiting the topics and the format (e.g. essays or presentations). External resources could be provided to help support schools, such as online examples of acceptable project assignments and guidelines about how school staff can organise themselves to oversee and assess projects. Regular external moderation, from the NEC or BDE, could also be conducted to serve as quality assurance. These efforts would help to create a common purpose and structure around project assignments, which would ensure that the amount of work students have to do to complete project assignments is similarly rigorous across classrooms and schools. Approaches used in OECD countries to ensure the quality and consistency of project assignments could also provide inspiration for North Macedonia (see Box 2.7).

Box 2.7. Project Assignments in England, Wales and Northern Ireland (United Kingdom)

In England, Wales and Northern Ireland (United Kingdom), students completing their “A” Levels at the end of upper secondary can also produce an optional “Extended Project”. The Extended Project provides students with the opportunity to develop and demonstrate their project management skills and extended writing.

  • Subjects: the “Extended Project” can be completed in one or more of the student’s study areas and/or areas of interest related to a student’s main study programme, in agreement with their examination centre (often their school). Examples of acceptable titles for Extended Projects are available online.

  • Outcome: a design, performance, report, dissertation or artefact.

  • Assessment: the Extended Project is internally assessed by a candidate’s examination centre. Candidates are required to produce a written log, verified by a supervisor, a written report, supplementary evidence and a presentation.

Students are assessed against four objectives. Each objective contributes a specific weight to the student's overall mark:

  1. Manage – identify, design, plan and complete the individual project or task within a group project, applying organisation skills and strategies to meet stated objectives. Contributes 15-25% to final mark.

  2. Use resources – obtain and select information from a range of sources, analyse data, apply relevantly and demonstrate understanding of any appropriate linkages, connections and complexities of their topic. Contributes 15-25% to final mark.

  3. Develop and realise – select and use a range of skills, including new technologies, to solve problems, to take decisions critically, creatively and flexibly, and to achieve planned outcomes. Contributes 35-45% to final mark.

  4. Review – evaluate outcomes including own learning and performance. Select and use a range of communication skills and media to convey and present outcomes and conclusions. Contributes 15-25% to final mark.

Marking grids are provided to demonstrate student performance at three levels for each assessment outcome, and how marks may be allocated.

  • Learning hours: 120 hours in total. Approximately 50 hours of taught time and 70 hours preparing for assessment.

  • Grades: A* - E

Source: (UCAS, n.d.[44]), Extended Project Qualification (EPQ), https://qips.ucas.com/qip/extended-project-qualification-epq (accessed on 14 January 2019).

Adapt marking and improve item quality to provide greater discrimination of student ability and motivate students to improve their learning

Some subjects on the matura have unusually high student results while others have a more normally distributed range of student results. Figure 2.5 shows the distribution of students on the English, mathematics and biology subject tests of the matura according to the percentage of correctly answered items on each test.

Figure 2.5. Distribution of students according to the percentage of questions answered correctly on different matura subject tests in 2017

Source: Author’s calculations based on data provided by NEC.

The different distributions of matura results distort the relationship between the number of questions a student answered correctly, his/her percentile rank and his/her mark. For example, a small difference in percentile rank could represent a large difference in correctly answered items in some subjects and a small difference in others. In 2017, in English, the difference between the 99th and 95th percentile ranks represented a difference of almost 20% of correctly answered items. In biology, the same difference in percentile ranks represented a difference of only 6% of correctly answered items. These disparities could mislead tertiary education faculties when selecting whom they believe to be the top students for enrolment.
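The effect described above can be sketched numerically. The score lists below are made up for illustration (they are not actual matura data); they simply show how the same gap in percentile ranks maps onto very different gaps in percentage-correct, depending on how spread out the top of a subject's score distribution is:

```python
# Illustrative sketch with hypothetical (made-up) score distributions: the gap
# in percentage-correct between the 99th and 95th percentile ranks depends on
# how spread out the top of the distribution is.

def percentile_score(scores, p):
    """Score at percentile p (0-100), using the nearest-rank method."""
    ordered = sorted(scores)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

# Top scores widely spread (the pattern observed for English in 2017).
spread_top = [50] * 90 + [75] * 5 + [80, 85, 90, 95, 99]
# Top scores tightly clustered (the pattern observed for biology in 2017).
clustered_top = [85] * 94 + [88] * 4 + [94, 94]

for name, scores in [("spread", spread_top), ("clustered", clustered_top)]:
    gap = percentile_score(scores, 99) - percentile_score(scores, 95)
    print(name, gap)  # same percentile gap, very different score gaps
```

With these hypothetical data, the 99th-to-95th percentile gap is 20 percentage points in the spread distribution but only 6 in the clustered one, mirroring the English/biology contrast reported above.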

With respect to marking, the limited range of marks means that in the 2017 English examination, a student in the 89th percentile received the same mark (4) as a student in the 65th percentile. This can affect student motivation and learning opportunities in upper secondary.

The matura’s marking scheme should be changed, and the change should be accompanied by a more analytical item development process. A surplus of items that are too easy, too difficult or have poor discriminating ability is producing a skewed distribution of results in which a small range of scores is occupied by a large number of students. Analysing item-level matura results to inform future item development would improve the quality of the items and normalise the distribution of the marks that students receive.

Change the marking scheme to 1 to 10

Similar to student marks on their report cards, the state matura marking scheme should be expanded to 1-10. The current scheme makes too few marks available, which potentially forces vastly different levels of performance into the same mark. While students also receive a percentile rank, which is more precise, the mark still determines whether a student passes or fails the subject and represents a significant motivating factor for the student. It is therefore highly important to confer marks that accurately represent student performance so students are driven to improve their learning.

To implement such a change, clear communication will be needed so that parents and students understand what each mark represents. NEC staff who are responsible for scaling students’ raw scores will also need to be trained in how to translate raw scores into scaled scores on the new scale.
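One simple way to perform that translation is a cut-score table. The sketch below is purely illustrative: the bands and thresholds are assumptions for demonstration, not NEC policy, and in practice cut scores would need to be anchored to the national learning and performance standards:

```python
# A minimal sketch (hypothetical cut scores) of translating a raw
# percentage-correct score on a matura subject into a mark on a 1-10 scale.
# The bands below are illustrative assumptions, not actual NEC thresholds.

CUT_SCORES = [  # (minimum percentage correct, mark)
    (95, 10), (85, 9), (75, 8), (65, 7), (55, 6),
    (45, 5), (35, 4), (25, 3), (15, 2), (0, 1),
]

def scale_mark(pct_correct: float) -> int:
    """Map a raw percentage-correct score to a mark on the 1-10 scale."""
    for minimum, mark in CUT_SCORES:
        if pct_correct >= minimum:
            return mark
    return 1  # any score below the lowest band receives the minimum mark

print(scale_mark(90))  # falls in the 85-94 band, so mark 9
```

A table like this makes the meaning of each mark easy to communicate to parents and students, while leaving the NEC free to adjust band boundaries as item quality and standards evolve.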

As well as expanding the marking scheme, North Macedonia could also consider providing a student’s raw results to universities for selection purposes, alongside the percentile ranking and scaled score. While the percentile rank provides universities with further information for selection purposes, it can be misleading because a large number of students achieve very high marks. Moreover, a percentile rank only reports student achievement in comparison to the performance of others. While universities naturally seek the students with the greatest potential from their cohort, they should also ensure that students meet specified standards. This is particularly important in North Macedonia given the concerns about the low levels of students’ basic competencies towards the end of schooling, and the low quality of tertiary education. Providing universities with a student’s raw score would help them to take a more informed decision based on students’ objective performance in subjects. Over time, it would also reinforce the role of learning standards across the education system.

Produce and analyse item statistics after the state matura has been administered

After a large-scale assessment is administered, student responses represent a source of valuable information about the functioning of the items. Internationally, most OECD countries conduct a post-mortem analysis of national assessments and examinations after their administration in order to learn about how students engaged with the test’s items. For example, the percentage of students that answered questions correctly (p-value) conveys the difficulty level of the items relative to each other. Point-biserial correlations, which indicate to what extent students who answered more questions correctly overall (i.e. high-performing students) are more likely to answer individual items correctly, help identify the discriminating ability of items.
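As a simple sketch of the two statistics described above, the following computes each item's p-value and its point-biserial correlation with students' total scores. The 0/1 response matrix used in the usage note is made up for illustration, not actual matura data:

```python
# Sketch of post-administration item analysis: p-values (difficulty) and
# point-biserial correlations (discrimination) from a 0/1 response matrix.

def item_statistics(responses):
    """responses: list of per-student lists of 0/1 item scores.
    Returns a (p_value, point_biserial) pair for each item."""
    n_students = len(responses)
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]
    mean_t = sum(totals) / n_students
    sd_t = (sum((t - mean_t) ** 2 for t in totals) / n_students) ** 0.5
    stats = []
    for i in range(n_items):
        item = [r[i] for r in responses]
        p = sum(item) / n_students  # share answering correctly (difficulty)
        cov = sum((x - p) * (t - mean_t)
                  for x, t in zip(item, totals)) / n_students
        sd_i = (p * (1 - p)) ** 0.5
        # Correlation between item score and total score (discrimination);
        # items answered correctly mainly by high scorers correlate strongly.
        r_pb = cov / (sd_i * sd_t) if sd_i and sd_t else 0.0
        stats.append((round(p, 2), round(r_pb, 2)))
    return stats
```

For example, `item_statistics([[1, 1], [1, 0], [0, 0], [1, 1]])` reports a p-value of 0.75 for the first item and 0.5 for the second, with high point-biserial correlations for both, since in this tiny made-up matrix correct answers come mostly from the higher-scoring students. Items with very high or very low p-values, or point-biserials near zero or negative, would be candidates for revision.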

Currently, the NEC does not systematically produce item statistics after administrations of the state matura. The review team recommends that statistics such as these be produced and analysed regularly. Through review of these data, the NEC and item developers will have a better understanding of whether they have created appropriate items and how future items can be better written.

Tests from non-compulsory, external subjects might deserve particular analytical attention, as many of these had average scores near or above 4 in 2017. This could be a product of a self-selecting population taking tests that were designed for a general population. In other words, perhaps only the best students in these subjects are electing to take the corresponding matura test, for example in mathematics, which would naturally lead to higher than expected scores. If, after analysing item statistics, this is indeed the case, then item developers can develop more difficult items, understanding that their target population is a highly specific one, not a general one. Matura scores in these subjects would then follow a more normal distribution and enable better discrimination of student ability.

Strengthen the VET component of the state matura

In North Macedonia, a sizeable share of upper secondary students are enrolled in VET programmes (approximately 50% of the cohort). The state matura provides these students with important flexibility – enabling them to pursue tertiary education, or to directly enter the labour market if they wish. Providing students with this flexibility is positive, and contributes to the real and perceived rigour of the matura that VET students undertake because it entails a solid academic dimension.

However, there are a number of challenges associated with upper secondary VET education in North Macedonia which suggest that it is not equipping students or the economy with important skills. Employers report that graduates of upper secondary VET lack key skills that are important in the workplace, in particular technical skills (ETF, 2017[45]). Also, after completing four years of vocational education, many students choose to return to more general study when they reach university. While this is not necessarily a problem, and underscores the significant flexibility of the structure of schooling in North Macedonia, it does reflect a lack of opportunities for students to continue vocational education to higher levels, such as post-secondary and tertiary VET. As VET education is also more expensive to provide than general upper secondary education, some might question the cost effectiveness of this model. Aware of these challenges, North Macedonia is now implementing a five-year programme to improve the quality of vocational education (see Chapter 1).

Another issue is the certification of skills acquired in VET programmes. A factor contributing to the reluctance of VET students to remain in the VET pathway and the comparatively low prestige of VET, is that students’ vocational skills are not assessed in a way that helps them be recognised by employers or professional tertiary faculties. While four-year VET students can pass a matura to enter higher education, they do not graduate with externally validated certification in their specific vocational field, which would provide more meaningful recognition of their vocational subjects. In effect, studying VET does not confer a professional or vocational advantage to students, which diminishes the attractiveness of pursuing a VET programme in the first place (MoES, 2013[46]).

North Macedonia is now considering the development of a “VET matura”. The justification given for considering such a reform is that VET students take fewer general education courses compared to gymnasium students and therefore are at a disadvantage when taking the state matura. One purpose of the proposed VET matura would be to assess general education outcomes, with questions of lower difficulty, for students enrolled in VET. However, given North Macedonia’s goal of raising the overall quality and prestige of VET, this review recommends that a single matura model be maintained, to ensure that VET students continue to be evaluated against the same rigorous standards as general education students. The review suggests changes to the design of the matura model to better certify vocational subjects.

Externally validate the VET subject of the state matura and link the results to certification

Students who attend upper secondary VET institutions and take the state matura are required to take one elective in a VET subject. This subject, however, is internally marked at the student’s school. Results of internal assessments are less likely to be reliable due to inconsistent testing conditions, items and marking criteria across schools (OECD, 2013[1]). It also means that vocational subjects do not carry the same external recognition or certification as general academic subjects, which contributes to a perception of vocational subjects as less prestigious. Practically, the lack of reliability of internally examined subjects makes it difficult to use the results of internal VET subject tests to signal students’ specific skills to professional tertiary programmes or potential employers.

Instead of creating a new examination specifically for VET students, this OECD review suggests that the state matura be revised to externally validate students’ vocational capacities. This would provide vocational studies with more meaningful recognition and certification, helping both employers and tertiary education faculties to identify promising VET students.

Passing the externally validated VET subject should also provide students with a formal VET certification, integrated in North Macedonia’s national qualification framework. This would recognise students’ vocational competencies - signalling readiness to employers and technical tertiary faculties - and provide students with a clearer pathway to professional employment. In turn, the attractiveness of VET would also be enhanced.

Internationally, externally validated vocational qualifications are often conferred to graduates of upper secondary vocational programmes (OECD, 2014[47]). These qualifications play a key role in enhancing the attractiveness of the upper secondary vocational track, though they do not prevent students from pursuing general tertiary education upon graduation.

Determine responsibility for assuring the quality and external validation of the new VET certification

While the VET Centre itself does not have the capacity to mark all VET elective tests, it is important that it continues to establish procedures, such as through its current monitoring visits, to ensure the standardised administration of the new VET examination and certification. Its involvement would act as quality assurance, ensuring that testing experiences are common and that final results are comparable. The design of the assessments, establishment of the standards and assessment of students against the standards might instead involve a body of employers or similar professional associations (OECD, 2014[47]). For example, during its visit, the review team learnt that an electro-engineering firm was helping to shape learning outcomes and craft a work-based learning programme.

Several countries have created vocational examination systems that follow this configuration. In Switzerland, professional examinations are led and developed by the relevant industries, while the Federal Office of Professional Education and Training checks the documentation of the examinations (Fazekas and Field, 2013[48]). In Germany, many local chambers of commerce are responsible for determining the assessment content of professional examinations, but their methods must follow frameworks that are established at the national level (Fazekas and Field, 2013[49]). It will be important that the VET Centre still has an overall strategic and validation role, to ensure that the external VET assessments reflect the national interest and do not become too narrowly focused on a niche skills set for certain occupations or declining industries (OECD, 2010[50]).

At present, there are over 150 different VET specialisations, meaning it would not be possible to externally validate all VET electives on the state matura. A more practical course would be to select a small number of subjects related to sectors that have been identified as important by economic and labour assessments. Over time, the current VET specialisations can be consolidated into VET families. This condensed structure would avoid presenting students with options that are too professionally narrow (thereby limiting their future employment options), help maintain the quality of the tests themselves and, in time, allow certification to be awarded for several programmes that are part of the same professional family.

Require project assignments be related to VET

Like students from gymnasium upper secondary schools, VET students also have to complete a project assignment as part of the state matura. Their projects can be from any field of study. In practice, the review team was told that many VET students do complete a project in a VET-related field, but some still focus on a general education subject, which further disincentivises students from focusing on their VET pathways.

North Macedonia should require that VET students complete their project assignment for the state matura in their chosen vocational subjects. In cases where the business community offers apprenticeships, students’ project assignments can be linked to those opportunities, such as by designing a project that is relevant to the student’s place of employment. By having to spend time developing a vocational project, students might become more interested in their subjects and have a stronger likelihood of pursuing their vocations in the future.

Box 2.8. The “EUX” hybrid programme in Denmark

The EUX programme was launched in Denmark in 2010 as a means of improving the attractiveness of VET by strengthening the link between VET and higher education. EUX combines a three-year gymnasium general upper secondary education and a four-year apprenticeship in a single programme. EUX is normally four years and a few months in length, with some variability between fields of study. It is a demanding programme, since students must follow two curricula, so it is likely to remain only a small part of the Danish VET system (2% of students in 2013-14). An evaluation has shown that it can attract a group of mid-performing students into VET. These are students with a stronger academic performance than most VET students, but not as strong as the strongest gymnasium students.

Source: (Musset et al., 2019[51]), OECD Review of Vocational Education and Training in Estonia, OECD Publishing, Paris, https://doi.org/10.1787/g2g9fac9-en.

Conclusion

The matura in North Macedonia is an important achievement. The national trust in its results and its innovative design make it a deservedly recognised example across the region. However, classroom assessment – the aspect of a country’s assessment framework that is most important for student learning – should be a priority. Teachers need more assessment resources and more practical professional development that will help them to integrate effective assessment methods into their classrooms. This will provide the support to raise student learning outcomes in North Macedonia, and especially to ensure that all students do well, regardless of their starting points.

Box 2.9. Recommendations

Developing meaningful reporting of student results

2.1.1 Develop coherent national learning standards that set out what students should know and how they are expected to apply knowledge, to promote more valid, reliable assessments. To achieve the latter, the country will need to review and align national learning standards across different grades so that student learning is scaffolded towards increasingly complex, higher-order competencies. Particular priority should be given to standards in core learning areas, like mathematics and reading and writing, especially because the latter currently have no standards in grades 1-3.

The development of learning standards should be accompanied by the introduction of performance levels that set out student achievement against national standards, e.g. above, meeting or below national learning expectations. This is especially important in grades 1-3, where there is no standardised description of student achievement at present.

2.1.2 Align student assessment with national learning standards by providing teachers with supports such as clear explanations of the criteria underlying different learning standards and their performance levels, rubrics for assessing students, marked examples of student work and examples of assessments to evaluate students. These materials can be provided via an online platform so that they reach more teachers, can be easily updated and facilitate teachers’ own contributions to online content. Once a new national assessment is developed (see Recommendations 5.2.1 and 5.2.2), teachers should be encouraged to use its items as inspiration for their own assessments and compare their students’ work with results on the national assessment to promote more accurate and reliable classroom assessment.

2.1.3 Enhance the accuracy and educational value of marking and reporting by extending the marking scale of classroom assessment. The scale might be extended to 1-10, reflecting similar practices in the region. The new marking scale should be linked to the new national learning and performance standards (see Recommendation 2.1.1). The BDE can help teachers to use the new marking scale by creating moderation opportunities, like helping teachers to mark each other’s assessments and discussing in groups how to give marks.

The country might also consider introducing a project assignment at the end of lower secondary to inform students’ choice of upper secondary programme, motivate all students to apply themselves and reinforce more rigorous standards, especially in core subjects.

Focusing assessment practices on helping students learn

2.2.1 Promote the use of diagnostic assessments, especially in early grades, to help teachers better understand how far their students are meeting national expectations and what skills and knowledge they still need to develop. Teachers could be required to undertake diagnostic assessments at the beginning of grades 1-3 and on an ad hoc basis as relevant using instruments based on the Early Grade Reading Assessment (EGRA) and Early Grade Mathematics Assessment (EGMA) that have recently been adapted to the North Macedonian context. As teachers become more comfortable with diagnostic assessments, they should be encouraged to develop their own assessments, based on national learning standards. Teachers will also need guidance on how to use the results from the diagnostic assessments to identify student progress and tailor subsequent instruction.

2.2.2 Provide and record high-quality feedback to help students and parents understand a student’s learning needs. The student report card should be updated to provide more space for descriptive feedback that explains why a student received a specific mark. This will help students and parents understand the next steps to improve learning. The country should also ensure that this more descriptive feedback is systematically recorded and shared, for example in the country’s Education Management Information System (EMIS), so that parents, students and other teachers can access feedback from previous teachers. This continuous documentation would help teachers to better understand student needs.

2.2.3 Remove barriers to providing formative assessment by systematically ensuring that all schools can allocate a certain amount of learning time as they wish. This would provide teachers with greater flexibility to use teaching time to respond to the learning needs that assessment results highlight. Greater curricula flexibility should be matched by changes to the school evaluation framework to focus on broader measures like school-wide achievement of national learning standards, rather than detailed implementation of the curriculum.

To take advantage of greater autonomy, teachers will need more support to implement formative assessment. The BDE might support the country’s school-based teacher groups ‒ the “Teacher Actives” ‒ to focus on practical assessment issues, like questioning and feedback techniques and how to use the new diagnostic assessments.

Updating the state matura to better assess student learning in the most important areas

2.3.1 Revise the matura’s design to provide more meaningful results in key subjects, by:

  • Making mathematics a compulsory subject to motivate all students to master at least basic mathematics and help universities make a more informed decision regarding student selection into mathematics and other related courses.

  • Creating two versions of the mathematics exam, at basic and higher levels, to provide mathematics certification that is useful and accessible for all students, while providing those students who wish to pursue mathematics at a higher level with the option to study more advanced concepts.

  • Considering extending further the core subjects that are assessed to ensure a better match between the breadth of subjects studied and those that are assessed.

  • Marking all subjects externally to increase the value of the subjects previously taken internally, and improve the overall reliability of the matura results.

  • Standardising the project assignment e.g. by limiting the topics and the format and providing online examples of acceptable project assignments and guidelines for school staff on how to oversee and assess projects. Regular external moderation, from the NEC or BDE, could also be conducted for quality assurance.

2.3.2 Adapt marking and improve item quality to provide greater discrimination of student ability and motivate students to improve their learning. The NEC should analyse items following each administration of the matura to learn how students engaged with the test’s items. The analysis can inform future item development so that there are not too many items that are too easy, too difficult or have poor discriminating ability. Undertaking these procedures will help to improve item quality and normalise the distribution of student marks. The country should also consider extending the marking scheme to 1-10, in line with changes to the marking scheme for classroom assessment (Recommendation 2.1.3), to provide greater scope to discriminate between different levels of achievement.

2.3.3 Strengthen the VET component by externally validating student achievement in the VET subject and linking the results to employer-recognised certification. The externally validated VET subject should provide students with a formal VET certification, integrated in North Macedonia’s national qualification framework, to signal readiness to employers and technical tertiary faculties. VET students should also be required to complete their project assignment for the matura in their chosen vocational subjects to provide greater recognition of, and time for, the development of vocational skills.

To make VET certification more feasible, the current 150+ different specialisations should be reduced to a small number of subjects related to sectors that have been identified as important by economic and labour assessments. Over time, the current VET specialisations can be consolidated into VET families so that students do not pursue options that are too narrow, limiting their future employment options. The VET Centre should continue to oversee examination procedures to provide quality assurance. Since the Centre does not have the capacity to develop and mark all tests, the design and marking of the assessments might involve a body of employers or professional associations.

References

[34] Abdul-Hamid, H., S. Mintz and N. Saraogi (2017), From Compliance to Learning: A System for Harnessing the Power of Data in the State of Maryland, The World Bank, http://dx.doi.org/10.1596/978-1-4648-1058-9.

[4] Absolum, M. et al. (2009), Directions for Assessment in New Zealand (DANZ) Report: Developing Students’ Assessment Capabilities, Ministry of Education, Wellington.

[33] Bialik, M. and C. Fadel (2017), Overcoming System Inertia in Education Reform, http://curriculumredesign.org/wp-content/uploads/Inertia-in-Education-CCR-Final.pdf.

[7] Bishop, J. (1999), Are National Exit Examinations Important for Educational Efficiency?, Cornell University ILR School, http://digitalcommons.ilr.cornell.edu/articles.

[3] Black, P. and D. Wiliam (1998), “Assessment and Classroom Learning”, Assessment in Education: Principles, Policy & Practice, Vol. 5/1, pp. 7-74, http://dx.doi.org/10.1080/0969595980050102.

[39] Darling-Hammond, L. and R. Rothman (eds.) (2011), Teacher and Leader Effectiveness in High-Performing Education Systems, Alliance for Excellent Education and Stanford Center for Opportunity Policy in Education, Washington DC, https://edpolicy.stanford.edu/sites/default/files/publications/teacher-and-leader-effectiveness-high-performing-education-systems.pdf (accessed on 16 April 2018).

[42] Department of Education and Skills (2018), The Education System in Ireland, https://www.education.ie/en/The-Education-System/ (accessed on 10 January 2019).

[16] Dumont, H. et al. (2010), The Nature of Learning: Using Research to Inspire Practice, OECD Publishing, Paris, https://doi.org/10.1787/9789264086487-en (accessed on 27 February 2018).

[15] Bureau for Development of Education (n.d.), Standards for high school education, http://bro.gov.mk/?q=gimnazisko-obrazovanie-standardi (accessed on 27 March 2019).

[45] ETF (2017), Tracing Secondary Vocational and Tertiary Education Graduates in the former Yugoslav Republic of Macedonia - 2016 Tracer Study Results, https://www.etf.europa.eu/webatt.nsf/0/370594378AEE2242C12581C90068FE63/$file/2016%20Tracer%20study%20results%20MK.pdf (accessed on 12 July 2018).

[36] Eurydice (2018), Recommended Annual Instruction Time in Full-time Compulsory Education in Europe, http://dx.doi.org/10.2797/616811.

[23] Eurydice (n.d.), National Education Systems, https://eacea.ec.europa.eu/national-policies/eurydice/national-description_en (accessed on 13 August 2018).

[49] Fazekas, M. and S. Field (2013), Skills beyond School Review of Germany, OECD Publishing, Paris, https://doi.org/10.1787/9789264202146.

[48] Fazekas, M. and S. Field (2013), Skills beyond School Review of Switzerland, OECD Publishing, Paris, https://doi.org/10.1787/9789264062665-en.

[22] Fullan, M. (2004), “Leadership Across the System”, http://michaelfullan.ca/wp-content/uploads/2016/06/13396061760.pdf.

[38] Fullan, M. and M. Miles (1992), “Getting reform right: What works and what doesn’t”, Phi Delta Kappan, Vol. 73/10, pp. 745-752, http://search.proquest.com/openview/90f371562121c3949aece5d25131b48f/1?pq-origsite=gscholar&cbl=41842 (accessed on 16 February 2018).

[10] Gerard, T. et al. (n.d.), Improving the Assessment of Students in the Primary Schools in Macedonia, Bureau for Development of Education.

[52] Gove, A. and A. Wetterberg (2011), The Early Grade Reading Assessment: Applications and Interventions to Improve Basic Literacy, RTI, http://dx.doi.org/10.3768/rtipress.2011.bk.0007.1109.

[21] Higham, R., D. Hopkins and P. Matthews (2009), System Leadership in Practice, McGraw-Hill International (UK) Ltd, https://books.google.fr/books?id=M7bmArWFgLsC (accessed on 8 March 2018).

[12] Kitchen, H. et al. (forthcoming), OECD Reviews of Evaluation and Assessment in Education: Student Assessment in Turkey, OECD Publishing, Paris.

[40] Kitchen, H. et al. (2017), OECD Reviews of Evaluation and Assessment in Education Romania, OECD Publishing, Paris, https://doi.org/10.1787/9789264274051-en.

[14] Kleinhenz, E. and L. Ingvarson (2007), “Standards for Teaching: Theoretical Underpinnings and Applications”, Australian Council for Educational Research (ACER), http://research.acer.edu.au/teaching_standards/1.

[26] London Region MISA PNC (2011), Comment Framework for Progress Reports and Report Cards, http://www.misalondon.ca/PDF/a&e/Comment_Framework_Feb_2011.pdf (accessed on 18 January 2019).

[13] MoES (2018), The Republic of North Macedonia - Country Background Report, Ministry of Education and Science, Skopje.

[46] MoES (2013), Strategy for Vocational Education and Training in a Lifelong Learning Context, Ministry of Education and Science, Skopje.

[51] Musset, P. et al. (2019), OECD Review of Vocational Education and Training in Estonia, OECD Reviews of Vocational Education and Training, OECD Publishing, Paris, https://dx.doi.org/10.1787/g2g9fac5-en.

[29] NCCA (2019), Junior Cycle Mathematics Guidelines for the Classroom-Based Assessments and Assessment Task, NCCA, https://www.curriculumonline.ie/getmedia/f5af815d-5916-4dc9-bfda-4f3d73bc4787/Assessment_Guidelines_Mathematics.pdf (accessed on 18 January 2019).

[30] NCCA (2018), Junior Cycle History Guidelines for the Classroom-Based Assessment and Assessment Task, First Edition, NCCA, https://www.curriculumonline.ie/getmedia/adcacb84-1886-4ea2-9b0a-36063e84cedc/JC_History-Assessment_Guidelines.pdf (accessed on 18 January 2019).

[28] NCCA (2016), Junior Cycle Business Studies Guidelines for the Classroom-Based Assessments and Assessment Task, NCCA, https://www.curriculumonline.ie/getmedia/db49a6b8-0cf8-446f-bc28-b382d37cb53d/AssessmentGL_BS_2018_EN.pdf (accessed on 18 January 2019).

[17] New South Wales Education Standards Authority (2018), Mathematics K–10, http://syllabus.nesa.nsw.edu.au/mathematics/mathematics-k10/outcomes/ (accessed on 17 January 2018).

[2] New Zealand Ministry of Education (2007), The New Zealand Curriculum, http://nzcurriculum.tki.org.nz/content/download/1108/11989/file/The-New-Zealand-Curriculum.pdf.

[18] New Zealand Ministry of Education (2007), The New Zealand Curriculum / Kia ora - NZ Curriculum Online, http://nzcurriculum.tki.org.nz/The-New-Zealand-Curriculum#collapsible4 (accessed on 17 January 2018).

[37] OECD (2018), Curriculum Flexibility and Autonomy in Portugal - an OECD Review, https://www.oecd.org/education/2030/Curriculum-Flexibility-and-Autonomy-in-Portugal-an-OECD-Review.pdf (accessed on 14 August 2018).

[5] OECD (2017), Education in Lithuania, Reviews of National Policies for Education, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264281486-en.

[8] OECD (2016), Education Policy Outlook: Korea, http://www.oecd.org/education/policyoutlook.htm (accessed on 10 January 2019).

[9] OECD (2016), PISA 2015 Results (Volume I): Excellence and Equity in Education, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264266490-en.

[11] OECD (2016), PISA 2015 Results (Volume II): Policies and Practices for Successful Schools, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264267510-en.

[6] OECD (2015), Education at a Glance 2015: OECD Indicators, OECD Publishing, Paris, https://dx.doi.org/10.1787/eag-2015-en.

[47] OECD (2014), Skills Beyond School: Synthesis Report, OECD Publishing, Paris, http://www.oecd.org/education/skills-beyond-school/Skills-Beyond-School-Synthesis-Report.pdf (accessed on 17 August 2018).

[1] OECD (2013), Synergies for Better Learning: An International Perspective on Evaluation and Assessment, http://www.oecd.org/edu/school/synergies-for-better-learning.htm.

[50] OECD (2010), Learning for Jobs, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264087460-en.

[31] OECD (n.d.), “Assessment for Learning: Formative Assessment”, https://www.oecd.org/site/educeri21st/40600533.pdf (accessed on 16 February 2018).

[41] Ofqual (2012), Comparing international secondary assessment: full report, https://www.gov.uk/government/publications/comparing-international-secondary-assessment-full-report (accessed on 16 August 2018).

[43] Ofqual (2012), International Comparisons in Senior Secondary Assessment: Full Report, Ofqual, http://ofqual.gov.uk/documents/international-comparisons-in-senior-secondary-assessment-full-report/ (accessed on 16 August 2018).

[27] Ontario Ministry of Education (2010), Growing success: assessment, evaluation and reporting in Ontario schools, Ontario Ministry of Education, Toronto, http://www.edu.gov.on.ca/eng/policyfunding/growsuccess.pdf (accessed on 18 January 2019).

[35] Pritchett, L. and A. Beatty (2012), “The negative consequences of overambitious curricula in developing countries”, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2102726 (accessed on 14 August 2018).

[25] Rushowy, K. (2017), Report card, curriculum changes on the way in Ontario, The Star, https://www.thestar.com/news/queenspark/2017/09/06/report-card-curriculum-changes-on-the-way-in-ontario.html (accessed on 18 January 2019).

[24] Semyonov, D. et al. (2017), Student accountability in post-Soviet countries, http://unesdoc.unesco.org/images/0025/002595/259570e.pdf (accessed on 21 September 2018).

[19] Shepard, L. (2001), The Role of Classroom Assessment in Teaching and Learning, University of California, Los Angeles (accessed on 14 August 2018).

[20] Shewbridge, C. et al. (2011), OECD Reviews of Evaluation and Assessment in Education: Denmark 2011, OECD Reviews of Evaluation and Assessment in Education, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264116597-en.

[32] Step by Step (2016), Early Grade Reading and Mathematics Assessment in the Republic of Macedonia: Study Report, USAID, http://www.stepbystep.org.mk/WEBprostor/EGRA_and_EGMA_Study_Report_-_May_2015.pdf (accessed on 11 July 2018).

[44] UCAS (n.d.), Extended Project Qualification (EPQ), https://qips.ucas.com/qip/extended-project-qualification-epq (accessed on 14 January 2019).

Note

1. While EGRA and EGMA have only been administered in grades 2 and 3 in North Macedonia, they can be, and elsewhere have been, administered in grade 1 without adapting the materials (Gove and Wetterberg, 2011[52]).
