# 3. What can students do in mathematics, reading and science?

This chapter presents the various levels of proficiency that students exhibited in PISA 2022 in mathematics, reading and science. It describes what students can do at each level of proficiency in each subject and how many students performed at each proficiency level. It then discusses student performance in specific aspects of mathematics.

For Australia, Canada, Denmark, Hong Kong (China), Ireland, Jamaica, Latvia, the Netherlands, New Zealand, Panama, the United Kingdom and the United States, caution is required when interpreting estimates because one or more PISA sampling standards were not met (see Reader’s Guide, Annexes A2 and A4).

Some 69% of students attained at least baseline proficiency Level 2 in mathematics on average across OECD countries. Over 85% of students in Estonia, Hong Kong (China), Japan, Macao (China), Singapore and Chinese Taipei performed at this proficiency level or above.

Roughly three out of four students attained at least baseline proficiency Level 2 in reading on average across OECD countries. A similar proportion attained at least Level 2 in science.

On average across OECD countries, some 9% of students attained the highest proficiency levels, Level 5 or 6, in mathematics. In 16 out of 81 countries and economies participating in PISA 2022, more than 10% of students attained Level 5 or 6 proficiency; by contrast, in 42 countries and economies, less than 5% of students attained Level 5 or 6 proficiency in mathematics.

Some 7% of students attained the highest proficiency levels, Level 5 or 6, in reading on average across OECD countries. A similar proportion attained Level 5 or 6 proficiency in science.

This chapter describes what students are able to do in mathematics, reading and science. Chapter 2 describes students’ performance through their score on the PISA scale; scores, however, do not indicate what students are actually capable of accomplishing in each subject. This chapter translates PISA scores into proficiency levels to allow for a substantive interpretation of the kinds of tasks that students scoring higher or lower in PISA can complete successfully. For a detailed explanation of the way in which PISA scores are translated into proficiency levels, please see Annex A1.

### Percentage of students at different levels of mathematics proficiency

In PISA 2022, the mathematics scale is divided into eight proficiency levels¹. Figure I.3.1 shows how students are distributed across these eight levels of mathematics proficiency. In PISA, proficiency Level 2 is considered the baseline level of proficiency that students need to participate fully in society. At this level, students begin to demonstrate the ability and initiative to use mathematics in simple real-life situations. Students who do not attain baseline Level 2 are referred to in this report as “low performers”. Low-performing students are less likely to complete higher education and to attain better-paying, more prestigious jobs in the future (OECD, 2016[1]; OECD, 2018[2]). The percentage of students performing at Level 1a or below (i.e. below Level 2) is shown on the left side of the vertical axis in Figure I.3.1.

PISA 2022 results show that 31% of students performed below Level 2 in mathematics on average across OECD countries: 19% of students scored at proficiency Level 1a, 10% at Level 1b, 2% at Level 1c, and 0.3% below Level 1c.

Some educational systems have few low performers in mathematics. In six countries and economies, 15% or fewer of students performed below Level 2 in mathematics (Estonia, Chinese Taipei, Hong Kong [China]*, Japan, Macao [China] and Singapore, in descending order of the percentage of low performers). In these countries, most low-performing students scored at Level 1a rather than at proficiency Level 1b, Level 1c or below Level 1c. This means that these systems are close to achieving universal basic proficiency in mathematics.

By contrast, some educational systems have many low performers in mathematics. In 35 educational systems more than half of students scored below proficiency Level 2, and in 12 of them more than 80% of students scored below proficiency Level 2. In 18 countries and economies, at least 30% of students performed at proficiency Level 1a; in 15 countries and economies, at least 30% of students performed at proficiency Level 1b; and, in 19 countries and economies, at least 10% of students performed at proficiency Level 1c.

The percentage of students performing at Level 2 or above in mathematics in PISA 2022 is shown on the right side of the vertical axis in Figure I.3.1. These are students who reach or surpass basic proficiency in mathematics. On average across OECD countries, 69% of students scored at Level 2 or above.

More students performed at proficiency Level 2 (23%) and Level 3 (22%) than at Level 4 (15%) on average across OECD countries. Furthermore, only a small proportion of students scored at Level 5 (7%) and Level 6 (2%) on average across OECD countries.

Students who attained proficiency Level 5 or Level 6 are referred to in this report as “top performers”. In only eight countries and economies was the share of students scoring at proficiency Level 5 in mathematics higher than 10%. In most countries or economies (46 out of 81), the share of students scoring at proficiency Level 5 is lower than 5%; and in 30 countries or economies, only 1% or less of 15-year-olds scored at proficiency Level 5.

The share of students scoring at Level 6 is higher than 10% only in Hong Kong (China)*, Macao (China), Singapore and Chinese Taipei. In a great majority of countries or economies (75 out of 81), the share of students scoring at Level 6 is lower than 5%. In 46 countries or economies only 1% or less of students scored at this level in mathematics.

Results on student performance in mathematics subscales (i.e. mean score and proficiency levels) are available in tables included in Annex B1 (for countries and economies) and Annex B2 (for regions within countries).

### The range of proficiencies covered by the PISA mathematics test

Table I.3.1 provides descriptions for all proficiency levels for mathematics²; it also shows the average share of students performing at each level across OECD countries.

Table I.3.2 presents the proficiency level of several released test items from both the PISA 2022 main study (i.e. items that were actually used in the assessment) and the PISA 2022 field trial. These items are presented in full in Annex C. Items that illustrate the proficiency levels applicable to the paper-based assessment were presented in the PISA 2012 Initial Report (OECD, 2014[3]).

Question 1 in the TRIANGULAR PATTERN unit is an easy item at proficiency Level 1a. It illustrates the capacity of students to employ a simple algorithm to solve a clearly formulated question with all information shown. Students are presented with a drawing made of rows of alternating red and blue triangles. The drawing shows the first four rows of the pattern and students are asked to compute the percentage of blue triangles shown in these four rows. There are six blue triangles out of 16 in total, so the percentage of blue triangles is 37.5% (6 ÷ 16 = 0.375). This question measures the *employing mathematical concepts, facts and procedures* process subscale, and *quantity* in the content subscale.

Question 2 in the same TRIANGULAR PATTERN unit is at proficiency Level 2 (Figure I.3.2). It builds on the first item of the unit by, again, asking students to compute the percentage of blue triangles. However, this time it is based on five rows of the pattern. Since the fifth row is not shown, students must extrapolate how many red and blue triangles this fifth row would contain, based on the pattern established in the previous four rows, and then calculate the new percentage of blue triangles. This item requires extending the pattern beyond what is shown. This question measures the *formulating situations mathematically* process and *change and relationships* in the content category.
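Under one layout assumption that reproduces the counts quoted above (row n contains 2n − 1 triangles of alternating colour, starting and ending with red, so row n holds n − 1 blue triangles), both calculations can be sketched as:

```python
# Sketch of the two TRIANGULAR PATTERN calculations. The row layout here is an
# assumption, not taken from the released item, but it reproduces the published
# count of 6 blue triangles out of 16 for the first four rows.

def blue_percentage(rows: int) -> float:
    """Percentage of blue triangles in the first `rows` rows of the pattern."""
    total = sum(2 * n - 1 for n in range(1, rows + 1))        # 1 + 3 + 5 + ... = rows**2
    blue = sum(n - 1 for n in range(1, rows + 1))             # n - 1 blue triangles in row n
    return 100 * blue / total

print(blue_percentage(4))  # Question 1: 37.5 (6 blue of 16 triangles)
print(blue_percentage(5))  # Question 2: extrapolated fifth row gives 40.0
```

The jump in difficulty between the two questions comes entirely from the extrapolation step: the arithmetic is the same, but the fifth row's counts must first be inferred from the pattern.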

An example of an item at proficiency Level 3 is the first item in the SOLAR SYSTEM unit. It illustrates students’ capacity to use data provided in a table to respond to explicit instructions. For this task, students need to determine which three planets are separated by the average distances, in Astronomical Units (au), shown in the model. To do this, students need to use the table in the stimulus that gives each planet’s average distance from the Sun in au. This question measures the *interpreting, applying and evaluating mathematical outcomes* process, and *quantity* in the content category.

Question 1 in the DVD SALES unit is a task at proficiency Level 4 (this item was not administered in the main study but only in the field trial). It illustrates students’ capacity to evaluate whether a statement is supported by information shown in a graph. The item shows a scatterplot with the number of years after 2008 on the x-axis and the number of DVDs sold, in millions, on the y-axis. Students also see a table containing three statements about DVD sales in the United Kingdom for the years 2008 through 2014. To verify these statements and obtain full credit, students need to compute percentages, ratios and differences, and interpret the slope of the graph in the linear model as a constant rate of change. This question measures the *formulating situations mathematically* process, and *uncertainty and data* in the content category.
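The kind of check this item requires can be sketched with hypothetical figures (the actual values shown in the released unit are not reproduced here):

```python
# Hypothetical DVD-sales figures, in millions, for the kind of verification the
# item asks for; these numbers are illustrative, not the ones in the released unit.
years = [0, 6]          # years after 2008: 2008 and 2014
sales = [250.0, 130.0]  # DVDs sold (millions)

# Interpreting the slope of the linear model as a constant rate of change
slope = (sales[1] - sales[0]) / (years[1] - years[0])
print(slope)  # -20.0: under this model, sales fall by 20 million per year

# Example statement check: "sales fell by more than 40% between 2008 and 2014"
percent_change = 100 * (sales[1] - sales[0]) / sales[0]
print(percent_change)        # -48.0
print(percent_change < -40)  # True for these illustrative numbers
```

Each of the three statements in the item calls for a different one of these computations (a percentage, a ratio or difference, or the slope), which is part of what places the task at Level 4.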

The FORESTED AREAS unit provides examples of tasks at proficiency Levels 5 and 6. The unit has an introduction screen that provides information about the context of the unit and lets students know that they will be using a spreadsheet tool to assist with answering the questions. After the introduction screen, students come to a practice screen where they must perform several actions to familiarise themselves with the functionality of the spreadsheet. After the practice screen, students come to an instruction screen, which lets them know that instructions for using the spreadsheet are available in each item. The data used for all items in this unit comprise the amount of forested area as a percentage of the total land area for 15 countries in the years 2005, 2010, and 2015. The spreadsheet also has columns that are always empty when students first navigate to each item, and the default ordering of the countries is alphabetical.

Question 1 in the FORESTED AREAS unit is a task at proficiency Level 5. It asks students to identify the countries that had the greatest gain, the greatest loss or no overall change in their percentage of forested area between 2005 and 2015. To answer this question, students need to determine what calculation(s) to perform, how to use the spreadsheet to perform them and, lastly, how to interpret the results with respect to the context. This question measures the *formulating situations mathematically* process, and *uncertainty and data* in the content category.

Question 3 in FORESTED AREAS is a task at proficiency Level 6 (Figure I.3.3). Students are told to consider the data in terms of two time periods: 2005 to 2010 and 2010 to 2015. They must identify the two countries that had the biggest change in their percentage of forested area from one time period to the other. To answer this question, students need to calculate the change in the percentage of forested area for each time period and then compute the change between the two time periods; they might also find it helpful to sort the results. Students have to devise a strategy for using the spreadsheet, which requires performing multiple operations before being able to evaluate the results. Possibly contributing to the difficulty of this item is recognising that “biggest change” in this context does not just mean an increase but can also mean a decrease in the percentage of forested area between time periods. This question was allocated to the *interpreting, applying and evaluating mathematical outcomes* process category, and to the *uncertainty and data* content category.
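The calculations behind both FORESTED AREAS questions can be sketched with made-up data (the released unit uses real figures for 15 countries and a spreadsheet tool, neither of which is reproduced here):

```python
# Illustrative version of the FORESTED AREAS calculations with hypothetical data;
# values are forested area as a percentage of total land area.
data = {
    # country: (2005, 2010, 2015)
    "Country A": (31.0, 32.0, 36.0),
    "Country B": (48.0, 44.0, 45.0),
    "Country C": (12.0, 12.5, 12.0),
}

# Question 1: overall change 2005 -> 2015 (greatest gain / greatest loss / no change)
overall = {c: v[2] - v[0] for c, v in data.items()}
print(max(overall, key=overall.get))  # greatest gain: Country A (+5.0)
print(min(overall, key=overall.get))  # greatest loss: Country B (-3.0)

# Question 3: change between the periods 2005-2010 and 2010-2015. "Biggest
# change" can be an increase OR a decrease, so compare absolute values.
shift = {c: (v[2] - v[1]) - (v[1] - v[0]) for c, v in data.items()}
ranked = sorted(shift, key=lambda c: abs(shift[c]), reverse=True)
print(ranked[:2])  # the two countries with the biggest period-to-period change
```

Note that Country C illustrates the subtlety flagged above: its overall change is zero, yet it still shows a (small, negative) change between the two periods.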

The first step in defining a reporting scale in PISA is developing a framework for each subject assessed. This framework provides a definition of what it means to be proficient in the subject; delimits and organises the subject according to different dimensions; and suggests the kind of test items and tasks that can be used to measure what students can do in the subject within the constraints of the PISA design (OECD, 2023[4]). These frameworks were developed by a group of international experts for each subject and agreed upon by the participating countries.

The second step is the development of the test questions (i.e. items) to assess proficiency in each subject. A consortium of testing organisations under contract to the OECD on behalf of participating governments develops new items and selects items from previous PISA tests (i.e. “trend items”) of the same subject. The expert group that developed the framework reviews these proposed items to confirm that they meet the requirements and specifications of the framework.

The third step is a qualitative review of the testing instruments by all participating countries and economies to ensure the items’ overall quality and appropriateness in their own national context. These ratings are considered when selecting the final pool of items for the assessment. Selected items are then translated and adapted to create national versions of the testing instruments. These national versions are verified by the PISA consortium.

The verified national versions of the items are then presented to a sample of 15-year-old students in all participating countries and economies as part of a field trial. This is to ensure that they meet stringent quantitative standards of technical quality and international comparability. In particular, the field trial serves to verify the psychometric equivalence of items across countries and economies (see Annex A6).

After the field trial, material is considered for rejection, revision or retention in the pool of potential items. The international expert group for each subject then formulates recommendations as to which items should be included in the main assessments. The final set of selected items is also subject to review by all countries and economies. This selection is balanced across the various dimensions specified in the framework and spans various levels of difficulty so that the entire pool of items measures performance across all component skills and a broad range of contexts and student abilities.

### Percentage of students at different levels of reading proficiency

Figure I.3.4 shows the distribution of students across the eight levels of reading proficiency.

On average across OECD countries, the percentage of low performers in reading was 26%: 17% of students scored at proficiency Level 1a, 8% at Level 1b, 2% at Level 1c, and 0.2% below Level 1c in PISA 2022.

Some educational systems have few low performers in reading. In Singapore, Ireland*, Macao (China), Japan, Estonia and Korea (listed in ascending order of the proportion of low performers), 15% or fewer of students performed below baseline proficiency Level 2 in reading. In these countries, most of the relatively few low-performing students scored no lower than Level 1a, meaning that these systems are close to achieving universal basic proficiency in reading.

A larger number of educational systems have many low performers in reading. In 30 education systems, more than half of students performed below baseline proficiency Level 2 in reading. In 21 countries and economies, at least 30% of students performed at proficiency Level 1a; in 9 countries and economies, at least 30% of students performed at proficiency Level 1b; and in 10 countries and economies, at least 10% of students performed at proficiency Level 1c.

The percentage of students performing at Level 2 or above in reading in PISA 2022 is shown on the right side of the vertical axis in Figure I.3.4. On average across OECD countries, 74% of students scored at Level 2 or above. In 10 countries and economies, more than 80% of students scored at Level 2 or above but in another four countries and economies less than 20% of students reached baseline proficiency Level 2 in reading.

More students performed at proficiency Level 2 (24%) and Level 3 (25%) than at Level 4 (17%) on average across OECD countries. Moreover, only a small proportion of students scored at Level 5 (6%) and Level 6 (1%) on average across OECD countries.

Some 7% of students attained the highest proficiency levels, Level 5 or 6, in reading on average across OECD countries. In 13 countries/economies, the share of top performers in reading is higher than 10%.

Only in seven countries and economies (Canada*, Japan, Korea, New Zealand*, Singapore, Chinese Taipei and the United States*) is the share of students scoring at proficiency Level 5 higher than 10%. In 55 countries or economies, the share of students scoring at Level 5 is lower than 5%.

The share of students scoring at Level 6 in reading is zero in 11 countries and economies, and is 5% in Singapore. In 46 countries/economies the percentage of students scoring at Level 6 in reading is greater than zero but smaller than 1%, in five countries/economies it is 3%, and in the United States* it is 4%.

### The range of proficiencies covered by the PISA reading test

The eight proficiency levels used in the PISA 2022 reading assessment are the same as those established for the PISA 2018 assessment. Table I.3.3 illustrates the range of reading competencies covered by the PISA test and describes the skills, knowledge and understanding required at each level of the reading scale.

### Percentage of students at different levels of science proficiency

Figure I.3.5 shows the distribution of students across the seven levels of science proficiency.

On average across OECD countries in PISA 2022, the percentage of low-performing students in science was 24%: 17% of students scored at proficiency Level 1a, 6% at Level 1b, and 1% below Level 1b.

A small number of educational systems have few low performers in science. In seven countries and economies, fewer than 15% of students performed below baseline proficiency Level 2 in science (Macao [China], Singapore, Japan, Estonia, Chinese Taipei, Hong Kong [China]* and Korea, in ascending order of the proportion of low performers). In these countries, most of the relatively few low-performing students scored at Level 1a, meaning that these systems are close to achieving universal basic proficiency in science.

A larger number of educational systems have many low performers in science. In 30 countries and economies, at least 30% of students performed at proficiency Level 1a; in 18 countries and economies, at least 20% of students performed at proficiency Level 1b.

The percentage of students performing at Level 2 or above in science in PISA 2022 is shown on the right side of the vertical axis in Figure I.3.5. On average across OECD countries, 76% of students scored at Level 2 or above. In 17 countries and economies, at least 80% of students scored at Level 2 or above but in another 10 countries and economies less than 30% of students reached baseline proficiency Level 2 in science.

More students performed in science at proficiency Level 2 (25%) and Level 3 (26%) than at Level 4 (17%) on average across OECD countries. Moreover, only a small proportion of students scored at Level 5 (6%) and Level 6 (1%) on average across OECD countries.

Some 7% of students attained the highest proficiency levels, Level 5 or 6, in science on average across OECD countries. In 14 countries/economies, the share of top performers in science was higher than 10%.

Only in five countries and economies was the share of students scoring at proficiency Level 5 higher than 10%. In 54 out of 81 countries or economies, the share of students scoring at Level 5 was lower than 5%.

The share of students scoring at Level 6 was as high as 6% only in Singapore. In 60 out of 81 countries or economies, the share of students scoring at Level 6 was no higher than 1%.

### The range of proficiencies covered by the PISA science test

The seven proficiency levels used in the PISA 2022 science assessment were the same as those established for the PISA 2015 assessment and were used again in PISA 2018. Table I.3.4 illustrates the range of science competencies covered by the PISA test and describes the skills, knowledge and understanding required at each level of the science scale.

In September 2015, world leaders gathered to set ambitious Sustainable Development Goals (SDGs) for the future of the global community. The fourth SDG (Goal 4) seeks to ensure “inclusive and equitable quality education and promote lifelong learning opportunities for all” and has ten targets, each of which has at least one global indicator designed to facilitate the analysis and the measurement of the target.

PISA data on student achievement are used to monitor progress towards two of the SDG 4 targets and their accompanying global indicators:

Target 4.1: Ensure that all girls and boys complete free, equitable and quality primary and secondary education leading to relevant and effective learning outcomes

Target 4.5: Eliminate gender disparities in education and ensure equal access to all levels of education and vocational training for the vulnerable, including persons with disabilities, indigenous peoples and children in vulnerable situations

## SDG Target 4.1.1: Minimum proficiency level in reading and mathematics

PISA data are a primary source for monitoring progress against the SDG global indicator 4.1.1.c:

Proportion of children and young people at the end of lower secondary education achieving at least a minimum proficiency level in (i) reading and (ii) mathematics, by sex.

In PISA, the minimum level of proficiency is defined as scoring at least proficiency Level 2 in both reading and mathematics.
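The definition above amounts to a simple student-level check; a minimal sketch, using hypothetical student records rather than actual PISA microdata:

```python
# Sketch of indicator 4.1.1.c as defined above: a student meets the minimum
# only if they reach proficiency Level 2 in BOTH reading and mathematics.
# The records below are hypothetical: (sex, reading level, mathematics level).
students = [
    ("girl", 3, 2), ("boy", 2, 1), ("girl", 1, 4), ("boy", 5, 3),
]

def meets_minimum(reading_level: int, maths_level: int) -> bool:
    """True if the student is at or above Level 2 in both subjects."""
    return reading_level >= 2 and maths_level >= 2

share = sum(meets_minimum(r, m) for _, r, m in students) / len(students)
print(share)  # 0.5: two of the four hypothetical students qualify
```

Because the condition is a conjunction, this share can be lower than the Level 2 attainment rates reported for reading and mathematics separately.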

## National benchmarks

The Education 2030 Framework for Action (UNESCO, 2016[5]) called on countries to establish “appropriate intermediate benchmarks for addressing the accountability deficit associated with longer-term SDG4 targets”. According to UNESCO, about 58% of countries have established benchmarks for SDG 4 targets (UNESCO, 2022[6]). These include 48 countries/economies that took part in PISA 2022. This section presents PISA data showing how countries and economies are progressing towards achieving their national benchmarks and international SDG 4 targets.

National benchmarks for Target 4.1.1 define the proportion of young people at the end of lower secondary education who are expected to achieve at least a minimum proficiency level in mathematics and reading by 2030, according to the commitments of each country. Figure I.3.6 shows national benchmarks expressed in terms of the share of students scoring below proficiency Level 2 (i.e. low performers) in PISA, and the actual share of low-performing students in mathematics in 2015, 2018 and 2022, according to PISA data.

The figures show wide variation in national benchmarks across countries, ranging from an expected share of low performers of over 70% in El Salvador, Guatemala and Indonesia, to less than 10% in Finland. Countries set national benchmarks based on national processes and challenges. In El Salvador and Indonesia, for example, enrolment rates in secondary education have been increasing since 2015 but there is still no universal coverage at this level of education (World Bank, 2023[7]). In Finland, on the other hand, coverage has been high for several decades. These factors influence how achievable national targets are defined.

None of the countries included in the figure have made net progress since 2015 when the SDG agenda was set. In 29 out of 39 countries with comparable data, the share of low performers in mathematics increased between 2015 and 2022. Of the 25 OECD countries shown in Figure I.3.6, the share of low performers increased significantly in 16 of them (by at least five percentage points). In five OECD countries the share of low performers has not changed significantly over this period.

While the COVID-19 pandemic explains some of the setbacks experienced by countries, PISA data clearly show that this downward trend began before the pandemic started in a number of countries.

When analysing changes in the share of low performers across countries/economies, it is important to consider differences in the proportion of 15-year-olds represented by the PISA sample in each country in 2015, 2018 and 2022 (the Coverage Index 3, or “CI3” for short). For example, in Indonesia, the percentage of low performers in mathematics increased by 13 percentage points between 2015 and 2022. However, part of this change is likely related to the increase in the coverage of the PISA sample from 68% to 85% over the same period. Lower coverage rates are often due to early dropout, late or discontinuous enrolment, or grade repetition. Therefore, an increase in the coverage of the PISA sample implies the expansion of education to more marginalised populations. Costa Rica, Jordan and Korea are examples of other countries/economies that increased coverage by over 10 percentage points between 2015 and 2022 (Table I.B1.4.1).

## SDG Target 4.5: Gender and socio-economic parity in learning outcomes

While this target encompasses all types of inequalities across education outcomes, PISA 2022 data shed light specifically on gender and socio-economic inequalities. This is measured using “parity indices”, which show a ratio between two populations. Figure I.3.7 shows the parity index for girls and boys, and for socio-economically disadvantaged and advantaged students (i.e. parity in the percentage of students scoring at or above proficiency Level 2 in mathematics).
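A parity index as described above is simply the ratio of the two groups' attainment shares; a minimal sketch, with illustrative shares rather than actual PISA estimates:

```python
# Parity index for minimum proficiency: the ratio of one group's share of
# students at or above Level 2 to another group's. A value of 1.0 means parity;
# values below 1.0 favour the group in the denominator.
def parity_index(share_group_a: float, share_group_b: float) -> float:
    """Ratio of group A's attainment share to group B's."""
    return share_group_a / share_group_b

# Illustrative shares (not PISA data): girls vs boys at or above Level 2 in maths
girls_at_level2 = 0.686
boys_at_level2 = 0.700

print(round(parity_index(girls_at_level2, boys_at_level2), 2))  # 0.98
```

Being a ratio, the index reflects relative rather than absolute differences: the same gap in percentage points produces a more extreme index when overall attainment is low.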

On average, OECD countries are close to gender parity in mathematics proficiency, but the ratio still favours boys over girls (0.98). In seven countries/economies (Belgium, Croatia, France, Israel, Latvia*, Macao [China] and Romania), there is no gap. In five countries/economies (Albania, Jamaica, Jordan, Palestinian Authority and the Philippines), the share of girls attaining minimum proficiency in mathematics is at least 20% higher than that of boys (parity index of 1.20 or more). At the other extreme, in El Salvador, Guatemala, Peru, Paraguay and Uzbekistan, and in the OECD countries Costa Rica and Mexico, there were fewer than eight girls for every 10 boys performing at or above the minimum proficiency level in mathematics.

## References

[4] OECD (2023), *PISA 2022 Assessment and Analytical Framework*, PISA, OECD Publishing, Paris, https://doi.org/10.1787/dfe0bf9c-en.

[2] OECD (2018), *Equity in Education: Breaking Down Barriers to Social Mobility*, PISA, OECD Publishing, Paris, https://doi.org/10.1787/9789264073234-en.

[8] OECD (2018), *PISA for Development Assessment and Analytical Framework: Reading, Mathematics and Science*, PISA, OECD Publishing, Paris, https://doi.org/10.1787/9789264305274-en.

[1] OECD (2016), *Low-Performing Students: Why They Fall Behind and How To Help Them Succeed*, PISA, OECD Publishing, Paris, https://doi.org/10.1787/9789264250246-en.

[3] OECD (2014), *PISA 2012 Results: What Students Know and Can Do (Volume I, Revised edition, February 2014): Student Performance in Mathematics, Reading and Science*, PISA, OECD Publishing, Paris, https://doi.org/10.1787/9789264208780-en.

[6] UNESCO (2022), *Setting Commitments: National SDG 4 Benchmarks to Transform Education*.

[5] UNESCO (2016), *Incheon Declaration and SDG4 – Education 2030 Framework for Action*.

[7] World Bank (2023), *World Development Indicators*, https://data.worldbank.org/.

## Notes

← 1. In previous cycles, only six proficiency levels were used to describe mathematical proficiency. Proficiency Levels 1b and 1c are the two proficiency levels that are new to PISA 2022. Level 1a is equivalent to Level 1 in PISA 2018 as both have the same lower score limit (357.77 points).

← 2. The description of the tasks that students are able to do at proficiency Level 1c is identical to the description used in PISA for Development (PISA-D) (OECD, 2018[8]). It has not been revised for PISA 2022 as there were no new items that scaled at this level.