2. How did countries perform in PISA?

For Netherlands, Newfoundland and Labrador, Alberta, Hong Kong (China), Manitoba, United States, Latvia, Scotland, Quebec, New Zealand, United Kingdom, Northern Ireland, England, Wales, Denmark, Ontario, Panama, Nova Scotia, Australia, British Columbia, Ireland, Jamaica and Canada, caution is required when interpreting estimates because one or more PISA sampling standards were not met (see Reader’s Guide, Annexes A2 and A4).

PISA measures student performance as the extent to which 15-year-old students near the end of their compulsory education have acquired the knowledge and skills that are essential for full participation in modern societies, particularly in the core domains of reading, mathematics, and science.

This chapter examines student performance in PISA 2022. In its first section, the chapter reports the average performance in mathematics, reading and science for each country and economy, comparing it to other countries and economies and to the average performance across OECD countries. The second section examines variation in performance within and between countries and economies; for example, it shows the size of the score gap that separates the highest- and lowest-performing students within each country and economy. It also examines how variation in performance is related to average performance across PISA-participating countries and economies. A ranking of student performance among all countries and economies that took part in PISA 2022 is provided in the third section.

Trends in student performance over time are considered in Chapters 5 and 6 of this report. For short-term changes between PISA 2018 and 2022, see Chapter 5; for long-term trajectories in student performance over countries’ entire participation in PISA, see Chapter 6.

In PISA 2022, the mean mathematics score among OECD countries is 472 points; the mean score in reading is 476 points; and the mean score in science is 485 points. Singapore scored significantly higher than all other countries/economies that participated in PISA 2022 in mathematics (575 points), reading (543 points) and science (561 points).

Table I.2.1, Table I.2.2 and Table I.2.3 show each country’s/economy’s mean score and indicate pairs of countries/economies where the differences between the means are statistically significant1. For each country/economy shown in the middle column, the countries/economies whose mean scores are not statistically significantly different are listed in the right column. In these tables, countries and economies are divided into three broad groups: those whose mean scores are not statistically significantly different from the OECD mean (highlighted in light grey); those whose mean scores are above the OECD mean (highlighted in blue); and those whose mean scores are below the OECD mean (highlighted in dark grey).
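The pairwise comparisons behind these tables rest on a standard idea: two means differ significantly when their gap is large relative to the combined standard error of the difference. A minimal sketch in Python, assuming independent samples and known standard errors (PISA’s actual comparisons use replicate weights and plausible values, and the standard errors below are illustrative, not official estimates):

```python
import math

def mean_diff_significant(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Return True if two means differ at the 5% level.

    Simplified sketch: assumes independent samples with known
    standard errors; PISA's published comparisons are computed
    with replicate weights and plausible values.
    """
    diff = mean_a - mean_b
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)  # SE of the difference
    return abs(diff) > z_crit * se_diff

# Illustrative values only (not official PISA standard errors):
# a 39-point gap is clearly significant, a 2-point gap is not.
big_gap = mean_diff_significant(575, 3.0, 536, 3.2)
small_gap = mean_diff_significant(536, 3.0, 534, 3.2)
```

This is why two countries with visibly different mean scores can still belong to the same group in the tables: when the gap is small relative to the combined standard error, the difference cannot be distinguished from sampling noise.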

In mathematics, six East Asian education systems (Hong Kong [China]*, Japan, Korea, Macao [China], Singapore and Chinese Taipei) outperformed all other countries and economies (Table I.2.1). Another 17 countries also performed above the OECD average in mathematics, ranging from Estonia (mean score of 510 points) to New Zealand* (mean score of 479 points).

In reading, behind the top-performing education system (Singapore), Ireland* performed as well as Estonia, Japan, Korea and Chinese Taipei, and outperformed all other countries/economies (Table I.2.2). In addition to those six countries and economies, another 14 education systems performed above the OECD average in reading, ranging from Macao (China) (mean score of 510 points) to Italy (mean score of 482 points).

All countries and economies that performed above the OECD average in mathematics also performed above the OECD average in reading, except for Austria, Belgium, Latvia*, the Netherlands* and Slovenia. Similarly, all countries and economies that performed above the OECD average in reading also performed above the OECD average in mathematics, except for Italy and the United States*.

In science, the highest-performing education systems are Canada*, Estonia, Hong Kong (China)*, Japan, Korea, Macao (China), Singapore and Chinese Taipei (Table I.2.3). Finland performed as well as Canada* in science. In addition to these nine countries and economies, another 15 education systems also performed above the OECD average in science, ranging from Australia* (mean score of 507 points) to Belgium (mean score of 491 points).

All countries and economies that performed above the OECD average in science also performed above the OECD average in mathematics and reading, except for six countries/economies. Austria, Belgium, Latvia* and Slovenia performed above the OECD average in science and mathematics but not in reading; the United States* performed above the OECD average in science and reading but not in mathematics; and Germany performed above the OECD average in science but not in mathematics or reading. In both of the latter subjects, Germany’s mean score is not statistically significantly different from the OECD average.

Eighteen countries and economies performed above the OECD average in mathematics, reading and science (Australia*, Canada*, the Czech Republic, Denmark*, Estonia, Finland, Hong Kong [China]*, Ireland*, Japan, Korea, Macao [China], New Zealand*, Poland, Singapore, Sweden, Switzerland, Chinese Taipei and the United Kingdom*).

The gap in performance between the highest- and lowest-performing countries is 153 score points in mathematics among OECD countries and 238 points among all education systems that took part in PISA 2022. In reading, the gap in performance between the highest- and lowest-performing countries is 107 score points among OECD countries and 214 points among all education systems that took part in PISA 2022. In science, the gap in performance between the highest- and lowest-performing countries is 137 score points among OECD countries and 214 points among all education systems that took part in PISA 2022.

The Dominican Republic has the smallest variation in mathematics proficiency (standard deviation of 54 score points), and several other countries and economies whose mean performance was below the OECD average also show small variations in performance2. Variation in student performance tends to be greater among high-performing than among low-performing education systems. As shown in Figure I.2.3, there is a strong correlation between average performance in mathematics and variation in performance in mathematics. That said, this is not the case for every country: Latvia*, for instance, combines a mean score of 483 points with a relatively small standard deviation of 80 score points.

Among countries that performed above the OECD average, Ireland*, Latvia* and Denmark* stand out for their relatively small variation in performance (standard deviation of around 80 score points) (Figure I.2.3). Similarly, among countries that performed below the OECD average, Bulgaria, Israel, Malta, Romania, the Slovak Republic and the United Arab Emirates stand out for their relatively large variation in performance (standard deviation greater than 95 score points).

Another measure of variation in performance within countries is the score gap that separates the highest- and lowest-performing students within a country (i.e. inter-decile range). In mathematics, the difference between the 90th percentile of performance (the score above which only 10% of students scored) and the 10th percentile of performance (the score below which only 10% of students scored) is more than 135 score points in all countries and economies; on average across OECD countries, 235 score points separate these extremes (Figure I.2.4).
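The inter-decile range is straightforward to compute from a score distribution. A small sketch using synthetic scores (random draws, not real PISA data), generated with the OECD average mean and standard deviation in mathematics:

```python
from statistics import quantiles
import random

def inter_decile_range(scores):
    """P90 minus P10: the gap separating the top and bottom deciles."""
    deciles = quantiles(scores, n=10)  # nine cut points: P10 ... P90
    return deciles[8] - deciles[0]

# Synthetic, illustrative scores: roughly normal with mean 472 and
# standard deviation 90, the OECD averages in mathematics.
random.seed(0)
scores = [random.gauss(472, 90) for _ in range(5000)]
gap = inter_decile_range(scores)
```

For a roughly normal distribution with a standard deviation of 90, this gap comes out near 230 score points, close to the OECD average of 235 reported above.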

The largest differences between top-performing and low-achieving students in mathematics are found in Israel, the Netherlands* and Chinese Taipei (Figure I.2.4). In these countries, the inter-decile range is 280 score points or more, which means that student performance in mathematics is highly unequal across 15-year-olds.

By contrast, the smallest differences between high- and low-achieving students are found among countries and economies with low (i.e. lower than 370 points) mean scores (the Dominican Republic, El Salvador, Indonesia, Jordan and Kosovo). In these countries, the 90th percentile of the mathematics distribution is below the average score across OECD countries.

Student performance varies widely among 15-year-olds and that variation can be broken down into differences at the student, school and education system levels3. This analysis is important from a policy perspective. Pinpointing where differences in student performance lie enables education stakeholders to target policy4. For example, if a large percentage of the total variation in student performance is linked to differences in student performance between education systems, this means that education system characteristics (e.g. economic and social conditions, education policies) strongly influence student performance. Similarly, if differences between schools account for a significant part of the overall variation in performance within a country/economy, then differences in school characteristics are important for policy to consider.

In PISA 2022, about 31% of the variation in mathematics performance across all countries and economies is linked to mean differences in student performance between participating education systems (Figure I.2.5). This means that the characteristics of education systems have a great deal of influence on student performance. As shown in Chapter 4, the economic and social conditions of different countries/economies, which are often beyond the control of education policy makers and educators, can influence student performance; for example, wealthier countries can spend more on education than middle- and low-income countries. On the other hand, it is education policy makers and educators who determine education policies and practices, including the organisation of schooling and learning, and the allocation of available resources across schools and students.

Across OECD countries, however, only 12% of the variation in mathematics performance is between education systems. In other words, the characteristics of education systems do not play an important role in explaining differences in student performance among OECD countries. This is likely because the economic and social conditions of OECD countries are very similar to each other. It is also possible that education policies and practices vary less across OECD countries than across all PISA-participating countries.

Of the variation observed within countries in PISA 2022, 32% on average across OECD countries lies between schools (right side of Figure I.2.6); the remaining 68% lies within schools (left side of the figure). This means that school characteristics do not play a dominant role in explaining student performance; instead, it is the characteristics of students themselves (i.e. their background, attitudes and behaviour), and the characteristics of different classrooms and different grades within schools, that account for most of the overall variation in student performance.
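The between/within decomposition rests on the standard identity that total variance equals the variance of group means plus the average variance within groups. A simplified, equal-weight sketch with hypothetical school data (PISA’s own estimates use survey weights and plausible values):

```python
from statistics import fmean, pvariance

def variance_decomposition(groups):
    """Split total variance into between-group and within-group parts.

    `groups` maps a school id to that school's list of student scores.
    Equal-weight ANOVA-style sketch; PISA's published decomposition
    uses survey weights and plausible values.
    """
    all_scores = [s for g in groups.values() for s in g]
    grand_mean = fmean(all_scores)
    n = len(all_scores)
    # Variance of school means around the grand mean (size-weighted)
    between = sum(len(g) * (fmean(g) - grand_mean) ** 2
                  for g in groups.values()) / n
    # Average squared deviation of students from their school mean
    within = sum(sum((s - fmean(g)) ** 2 for s in g)
                 for g in groups.values()) / n
    return between, within

# Two hypothetical schools with different average scores
schools = {"A": [500, 520, 480], "B": [420, 440, 400]}
between, within = variance_decomposition(schools)
total = pvariance([s for g in schools.values() for s in g])
# between + within reproduces the total variance exactly
```

In this toy example most of the variance sits between the two schools; in the OECD-average picture described above, the proportions are reversed, with roughly two-thirds of within-country variation lying inside schools.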

The extent of between-school variation in mathematics performance differs widely across countries/economies. In six countries and economies between-school differences account for 10% or less of the total variation in performance (Iceland, Saudi Arabia, Ireland*, Finland, Denmark* and Uzbekistan, in ascending order). By contrast, in 10 other countries (Bulgaria, Hungary, Israel, Japan, the Netherlands*, Romania, the Slovak Republic, Chinese Taipei, Türkiye and the United Arab Emirates) differences between schools account for at least 50% of the total variation in the country’s performance.

The goal of PISA is to provide useful information to educators and policy makers on the strengths and weaknesses of their country’s education system, the progress it has made over time, and opportunities for improvement. When ranking countries’ and economies’ student performance in PISA, it is important to consider the social and economic context of schooling (see next section). Moreover, many countries and economies score at similar levels; small differences that are not statistically significant or practically meaningful should not be considered (see Box 1 in Reader’s Guide).

Table I.2.4, Table I.2.5 and Table I.2.6 show for each country and economy an estimate of where its mean performance ranks among all other countries and economies that participated in PISA as well as, for OECD countries, among all OECD countries. Because mean-score estimates are derived from samples and are thus associated with statistical uncertainty, it is often not possible to determine an exact ranking for all countries and economies. However, it is possible to identify the range of possible rankings for the country’s or economy’s mean performance6. This range of ranks can be wide, particularly for countries/economies whose mean scores are similar to those of many other countries/economies.
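One way to see how sampling uncertainty yields a range of ranks rather than a single rank is a small Monte Carlo sketch: draw each country’s mean from a normal distribution around its estimate, rank the draws, and record the best and worst ranks observed. This illustrates the idea only, not the method documented in Annex A3, and the means and standard errors below are placeholders, not official estimates:

```python
import random

def rank_ranges(means, ses, n_sim=2000, seed=1):
    """Simulate plausible rankings given each country's estimated
    mean and standard error; return {country: (best, worst) rank}.

    Monte Carlo sketch of the range-of-ranks idea; inputs are
    hypothetical, and PISA's actual computation is in Annex A3.
    """
    rng = random.Random(seed)
    countries = list(means)
    extremes = {c: [len(countries), 1] for c in countries}
    for _ in range(n_sim):
        draws = {c: rng.gauss(means[c], ses[c]) for c in countries}
        ordered = sorted(countries, key=draws.get, reverse=True)
        for rank, c in enumerate(ordered, start=1):
            lo, hi = extremes[c]
            extremes[c] = [min(lo, rank), max(hi, rank)]
    return {c: tuple(v) for c, v in extremes.items()}

means = {"A": 575, "B": 536, "C": 535, "D": 510}
ses = {"A": 3.0, "B": 3.2, "C": 2.9, "D": 2.5}
ranks = rank_ranges(means, ses)
# A is clearly first; B and C overlap, so each can rank 2nd or 3rd
```

The wide ranges arise exactly where the text says they do: for countries whose mean scores are close to those of several others, many orderings are statistically plausible.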

Table I.2.4, Table I.2.5 and Table I.2.6 also include the results of provinces, regions, states or other subnational entities within countries where the sampling design supports such reporting. For these subnational entities, a rank order was not estimated. Still, the mean score and its confidence interval allow the performances of subnational entities and countries/economies to be compared. For example, Quebec (Canada*) scored below the top performers Macao (China), Singapore, Chinese Taipei and Hong Kong (China)*, but close to Korea in mathematics.

This section focuses on student performance in two sets of mathematics subscales: process subscales and content subscales. Each item in the PISA 2022 computer-based mathematics assessment was classified into one of the four mathematics-process subscales of formulating, employing, interpreting, and reasoning. Similarly, each item was classified into one of the four mathematics-content subscales of change and relationships, space and shape, quantity, and uncertainty and data.

The relative strengths and weaknesses of each country’s/economy’s education system are analysed by looking at differences in mean performance across the PISA mathematics subscales within the process and content subscales. See Annex A1 for detailed definitions of subscales.

Table I.2.7 shows the country/economy mean for the overall mathematics scale and for each of the four mathematics-process subscales. It also indicates which differences between the (standardised) subscale means are statistically significant, pointing to a country’s/economy’s relative strengths and weaknesses.

For example, in Japan mean performance in mathematics is 536 score points. Japan’s score is also 536 points in the mathematics-process subscales of formulating and employing, and very similar (534 points) in the process subscale of reasoning. In the interpreting subscale, however, the score is considerably higher (544 points). Compared to differences in how students performed across subscales on average across PISA-participating countries/economies (hereafter, for simplicity, the “worldwide average”), students in Japan are relatively stronger at interpreting than at any other mathematics process.
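The relative-strength comparison contrasts a country’s gap between two subscales with the corresponding gap in the worldwide average. A minimal sketch, using Japan’s subscale scores from the example above and hypothetical worldwide-average values (the significance testing applied in Table I.2.7 is omitted):

```python
def relative_strength(country_scores, worldwide_scores, s1, s2):
    """Compare a country's gap between two subscales with the
    worldwide-average gap between the same subscales.

    A positive result means the country is relatively stronger at
    s1 than at s2, compared to the worldwide average. Simplified
    sketch: ignores statistical significance.
    """
    country_gap = country_scores[s1] - country_scores[s2]
    worldwide_gap = worldwide_scores[s1] - worldwide_scores[s2]
    return country_gap - worldwide_gap

japan = {"formulating": 536, "employing": 536,
         "interpreting": 544, "reasoning": 534}
# Hypothetical worldwide-average subscale scores, for illustration
world = {"formulating": 438, "employing": 442,
         "interpreting": 441, "reasoning": 440}
edge = relative_strength(japan, world, "interpreting", "employing")
# edge > 0: relatively stronger at interpreting than at employing
```

Benchmarking against the worldwide average is what makes the comparison "relative": a country can score higher at interpreting than at employing in absolute terms yet show no relative strength if the same gap appears worldwide.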

On average across OECD countries, students are relatively stronger at interpreting than formulating and stronger at interpreting than employing, compared to the worldwide average. In addition, students are relatively stronger at reasoning than formulating and employing, and relatively stronger at employing than formulating on average across OECD countries compared to the worldwide average. The same pattern of relative strengths was observed in Spain and the United Kingdom*. In Belgium, Canada*, Korea and New Zealand*, the pattern is the same as the OECD average except that there are no significant differences in how students performed in formulating and employing.

In 22 countries/economies, students are relatively stronger at reasoning than formulating; in 23 countries/economies, students are relatively stronger at reasoning than employing; and in 17 countries/economies, students are relatively stronger at reasoning than interpreting, compared to the worldwide average.

In six countries/economies, there are no significant differences in how students performed across different mathematics-process subscales. For example, in Latvia*, overall mean performance in mathematics is 483 score points with 483 points in formulating; 484 points in employing; 485 points in interpreting; and 481 points in reasoning. The same homogeneity in performance across mathematics-process subscales is observed in Malta, Panama*, Qatar, Serbia and Türkiye.

Table I.2.8 shows the country/economy mean for the overall mathematics scale and for each of the four mathematics-content subscales, and an indication of relative strengths in the mathematics content subscales.

On average across OECD countries, students are relatively stronger in uncertainty and data than change and relationships, and relatively stronger in uncertainty and data than space and shape, compared to the worldwide average. In addition, students are relatively stronger in space and shape than change and relationships; and relatively stronger in quantity than change and relationships on average across OECD countries, compared to the worldwide average.

In 27 countries/economies, students are, as in the OECD average, relatively stronger in uncertainty and data than space and shape, compared to the worldwide average. In 13 countries/economies, students are relatively stronger in uncertainty and data than change and relationships, compared to the worldwide average.

By contrast, in 24 countries/economies, students are relatively stronger in space and shape than uncertainty and data. In 19 countries/economies, students are relatively stronger in change and relationships than uncertainty and data.

Notes

1. When comparing mean performance across countries/economies, only differences that are statistically significant should be considered (see Box 1 in Reader’s Guide).

2. The standard deviation summarises variation in performance among 15-year-old students within each country/economy. The average standard deviation in mathematics performance within OECD countries is 90 score points. If the standard deviation is larger than 90 score points, it indicates that student performance varies more from a particular country’s/economy’s average performance than it varies internationally. A smaller standard deviation means that student performance varies less in a country/economy than it varies internationally.

3. This analysis was carried out in two steps. In the first step, the share of the variation in student performance that occurs between education systems was identified. In the second step, the remaining variation was split into between-school and within-school components. Within-school variation refers to differences in performance between students attending the same school.

4. PISA results do not establish causality. PISA identifies empirical correlations between student achievement and the characteristics of schools and education systems, correlations that show consistent patterns across countries. Implications for policy are based on this correlational evidence and previous research.

5. The reason for this restriction is the following: while the students sampled in PISA represent all 15-year-old students, whatever type of school they are enrolled in, they may not be representative of the students enrolled in their school. As a result, comparability at the school level may be compromised. For example, if grade repeaters in a country are enrolled in different schools than students in the modal grade because the modal grade in this country is the first year of upper secondary school (ISCED 3) while grade repeaters are enrolled in lower secondary school (ISCED 2), the average performance of schools where only students who had repeated a grade were assessed may be a poor indicator of the actual average performance of these schools. By restricting the sampling to schools with the modal ISCED level for 15-year-old students, PISA ensures that the characteristics of the students sampled are as close as possible to the profiles of the students attending the school. The “modal ISCED level” is defined here as the level attended by at least one-third of the PISA sample. In 15 education systems (Baku [Azerbaijan], Cambodia, Colombia, Costa Rica, the Czech Republic, the Dominican Republic, Hong Kong [China]*, Indonesia, Jamaica, Kazakhstan, Morocco, the Netherlands, the Slovak Republic, Switzerland, and Chinese Taipei) both lower secondary (ISCED level 2) and upper secondary (ISCED level 3) schools meet this definition. In all other countries, analyses are restricted to either lower secondary or upper secondary schools (see Table I.B1.2.14 for details). In several countries, lower and upper secondary education are provided in the same school. As the restriction is made at the school level, some students from a grade other than the modal grade in the country may also be used in the analysis.

6. See Annex A3 for a technical note on how the range of ranks was computed in PISA 2022.

Legal and rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

© OECD 2023

Licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 IGO (CC BY-NC-SA 3.0 IGO).