1. What is PISA?

The OECD Programme for International Student Assessment (PISA), now in its seventh cycle, seeks to determine what is important for citizens to know and be able to do. PISA assesses the extent to which 15-year-old students near the end of their compulsory education have acquired the knowledge and skills that are essential for full participation in modern societies.

The triennial assessment focuses on the core school subjects of reading, mathematics and science. Students’ proficiency in an innovative domain is also assessed; in 2018, this domain was global competence. The assessment does not just ascertain whether students can reproduce knowledge; it also examines how well students can extrapolate from what they have learned and can apply that knowledge in unfamiliar settings, both in and outside of school. This approach reflects the fact that modern economies reward individuals not for what they know, but for what they can do with what they know.

PISA is an ongoing programme that monitors trends in the knowledge and skills that students around the world, and in demographic subgroups within each country, have acquired. In each round of PISA, one of the core domains is tested in detail, taking up roughly one-half of the total testing time. The major domain in 2018 was reading, as it was in 2000 and 2009. Mathematics was the major domain in 2003 and 2012, and science was the major domain in 2006 and 2015.

Through questionnaires distributed to students and school principals, and optional questionnaires distributed to parents and teachers, PISA also gathers information about students’ home background, their approaches to learning and their learning environments.

With this alternating schedule of major domains, a thorough analysis of achievement in each of the three core areas is presented every nine years; an analysis of trends is offered every three years. Combined with the information gathered through the various questionnaires, the PISA assessment provides three main types of outcomes:

  • Basic indicators that provide a profile of the knowledge and skills of students

  • Indicators derived from the questionnaires that show how such skills relate to various demographic, social, economic and educational variables

  • Indicators on trends that show changes in outcomes and their distributions, and in relationships between student-, school- and system-level background variables and outcomes.

Policy makers around the world use PISA findings to gauge the knowledge and skills of the students in their own country/economy compared with those in other participating countries/economies, establish benchmarks for improvements in the education provided and/or in learning outcomes, and understand the relative strengths and weaknesses of their own education systems.

This publication presents the theory underlying the PISA 2018 assessment – the seventh since the programme’s inception. It includes frameworks for assessing the three core subjects of reading, mathematics and science (Chapters 2, 3 and 4, respectively), the framework for the third assessment of students’ financial literacy (Chapter 5), and the framework for assessing the innovative domain, global competence (Chapter 6). These chapters outline the knowledge content that students need to acquire in each domain, the processes that students need to be able to perform, and the contexts in which this knowledge and these skills are applied. They also discuss how each domain is assessed. The publication concludes with the frameworks for the various questionnaires distributed to students, school principals, parents and teachers (Chapter 7), and the framework for the new well-being questionnaire distributed to students (Chapter 8).

Box 1.1. Key features of PISA 2018

The content

PISA not only assesses whether students can reproduce knowledge, but also whether they can extrapolate from what they have learned and apply their knowledge in new situations. It emphasises the mastery of processes, the understanding of concepts, and the ability to function in various types of situations.

The PISA 2018 survey focused on reading, with mathematics and science as minor domains of assessment. For the first time, global competence was assessed as an innovative domain. PISA 2018 also included an assessment of young people’s financial literacy, which was optional for countries and economies.

The students

Approximately 710 000 students completed the PISA 2018 assessment, representing over 31 million 15-year-olds in the schools of the 79 participating countries and economies.

The assessment

Computer-based tests were used, with assessments lasting a total of two hours for each student.

Test items were a mixture of multiple-choice questions and questions requiring students to construct their own responses. The items were organised in groups based on a passage setting out a real-life situation. About 930 minutes of test items were used, with different students taking different combinations of test items.

Students also answered a background questionnaire that took 35 minutes to complete. The questionnaire sought information about the students themselves, their homes, and their school and learning experiences. School principals completed a questionnaire that covered the school system and the learning environment.

To obtain additional information, some countries/economies decided to distribute a questionnaire to teachers to learn about their training and professional development, their teaching practices and their job satisfaction. In some countries/economies, optional questionnaires were distributed to parents, who were asked to provide information on their perceptions of and involvement in their child’s school, their support for learning in the home, and their own engagement with reading and with other cultures.

Countries/economies could also choose three other optional questionnaires for students: one asked students about their familiarity with and use of information and communications technologies; one sought information about students’ education to date, including any interruptions in their schooling, and whether and how they are preparing for a future career; and one, distributed for the first time in PISA 2018, examined students’ well-being and life satisfaction.

Countries/economies that conducted the optional financial literacy assessment also distributed a financial literacy questionnaire.

What makes PISA unique

PISA is the most comprehensive and rigorous international programme to assess student performance and to collect data on the student, family and institutional factors that can help explain differences in performance. Decisions about the scope and nature of the assessments and the background information to be collected are made by leading experts in participating countries, and are steered jointly by governments on the basis of shared, policy-driven interests. Substantial efforts and resources are devoted to achieving cultural and linguistic breadth and balance in the assessment materials. Stringent quality-assurance mechanisms are applied in translation, sampling and data collection. As a consequence, results from PISA have a high degree of validity and reliability.

PISA’s unique features include its:

  • policy orientation, which links data on student learning outcomes with data on students’ backgrounds and attitudes towards learning, and on key factors that shape their learning in and outside of school; this exposes differences in performance and identifies the characteristics of students, schools and education systems that perform well

  • innovative concept of “literacy”, which refers to students’ capacity to apply knowledge and skills, and to analyse, reason and communicate effectively as they identify, interpret and solve problems in a variety of situations

  • relevance to lifelong learning, as PISA asks students to report on their motivation to learn, their beliefs about themselves and their learning strategies

  • regularity, which enables countries to monitor their progress in meeting key learning objectives

  • breadth of coverage, which, in PISA 2018, encompasses all 37 OECD countries and 42 partner countries and economies.

The PISA 2018 test

The PISA 2018 assessment was conducted principally via computer, as it was in 2015, when computer-based delivery was first used. Paper-based assessment instruments were provided for countries that chose not to test their students by computer, but the paper-based assessment was limited to reading, mathematics and science trend items only (i.e. those items that had already been used in prior paper-based assessments). New items were developed only for the computer-based assessment.

The 2018 computer-based assessment was designed to be a two-hour test. Each test form distributed to students comprised four 30-minute clusters of test material. This test design included six clusters from each of the domains of mathematics and science to measure trends. For the major domain of reading, material equivalent to 15 30-minute clusters was developed. This material was organised into units instead of clusters, as the PISA 2018 reading assessment adopted an adaptive approach, whereby students were assigned units based on their performance in earlier units. In addition, four clusters of global competence items were developed for the countries that chose to participate in that assessment.

There were different test forms for countries that participated in the global competence assessment. Students spent one hour on the reading assessment (composed of a core stage followed by two stages of either greater or lesser difficulty) plus one hour on one or two other subjects – mathematics, science or global competence. For the countries/economies that chose not to participate in the global competence assessment, 36 test forms were prepared.
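The adaptive reading design described above (a core stage followed by two stages of greater or lesser difficulty) can be sketched in miniature. The routing rule below is a simplified illustration, not PISA's actual specification: the threshold, item counts and stage names are assumptions made for the example.

```python
# Minimal sketch of a multistage adaptive design: score a core stage,
# then route the student to harder or easier follow-up stages.
# The 0.5 threshold and the branch labels are illustrative assumptions.

def route(core_score: float, threshold: float = 0.5) -> str:
    """Assign a follow-up branch based on the proportion of
    core-stage items answered correctly."""
    return "higher-difficulty" if core_score >= threshold else "lower-difficulty"

def run_assessment(core_responses: list) -> dict:
    """Score the core stage, then assign the two subsequent stages."""
    core_score = sum(core_responses) / len(core_responses)
    branch = route(core_score)
    return {"core_score": core_score, "stage_2": branch, "stage_3": branch}

result = run_assessment([1, 1, 0, 1, 1])  # 4 of 5 core items correct
print(result["stage_2"])  # prints "higher-difficulty"
```

In the actual PISA 2018 design the two follow-up stages could differ in difficulty from one another; this sketch routes both stages together purely for brevity.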

Countries that chose paper-based delivery for the main survey measured student performance with 30 paper-and-pencil forms containing trend items from the three core PISA domains. The reading items in these paper-based forms were based on the 2009 reading literacy framework and did not include any items based on the new 2018 reading literacy framework.

Each test form was completed by a sufficient number of students to allow for estimations of proficiency on all items by students in each country/economy and in relevant subgroups within a country/economy, such as boys and girls, or students from different social and economic backgrounds.

The assessment of financial literacy was offered as an option in PISA 2018 based on the same framework as that developed for PISA 2012, which was also used in 2015. Within PISA-participating schools, a sample of students different from the main sample sat the financial literacy test. In addition to the one-hour financial literacy test, these students also sat either a one-hour reading or one-hour mathematics assessment.

An overview of what is assessed in each domain

Box 1.2 presents definitions of the three domains assessed in PISA 2018. The definitions all emphasise the functional knowledge and skills that allow one to participate fully in society. Such participation requires more than just the ability to carry out tasks imposed externally by, for example, an employer; it also involves the capacity to participate in decision making. The more complex tasks in PISA require students to reflect on and evaluate material, not just answer questions that have one correct answer.

Box 1.2. Definitions of the domains

Reading literacy: An individual’s capacity to understand, use, evaluate, reflect on and engage with texts in order to achieve one’s goals, develop one’s knowledge and potential, and participate in society.

Mathematical literacy: An individual’s capacity to formulate, employ and interpret mathematics in a variety of contexts. It includes reasoning mathematically and using mathematical concepts, procedures, facts and tools to describe, explain and predict phenomena.

Scientific literacy: The ability to engage with science-related issues, and with the ideas of science, as a reflective citizen. A scientifically literate person is willing to engage in reasoned discourse about science and technology, which requires the competencies to explain phenomena scientifically, evaluate and design scientific enquiry, and interpret data and evidence scientifically.

Reading literacy (Chapter 2) is defined as students’ ability to understand, use, evaluate, reflect on and engage with text to achieve their purposes.

PISA assesses students’ performance in reading through questions that involve a variety of:

  • Processes (aspects): Students are not assessed on the most basic reading skills, as it is assumed that most 15-year-old students will have acquired these. Rather, students are expected to demonstrate their proficiency in locating information, including both accessing and retrieving information within a piece of text, and searching for and selecting relevant text; understanding text, including both acquiring a representation of the literal meaning of text and constructing an integrated representation of text; and evaluating and reflecting on text, including both assessing its quality and credibility, and reflecting on content and form.

  • Text formats: PISA uses both single-source and multiple-source texts; static and dynamic texts; continuous texts (organised in sentences and paragraphs); non-continuous texts (e.g. lists, forms, graphs or diagrams); and mixed texts.

  • Situations: These are defined by the use for which the text was constructed. For example, a novel, personal letter or biography is written for people’s personal use; official documents or announcements are for public use; a manual or report is for occupational use; and a textbook or worksheet is for educational use. Since some students may perform better in one type of reading situation than another, a range of reading situations is included in the test.

New forms of reading that have emerged since the framework was last updated in 2009, especially digital reading and the growing diversity of material available in both print and digital forms, have been incorporated into the revised PISA 2018 reading framework.

Mathematical literacy (Chapter 3) is defined as students’ ability to analyse, reason and communicate ideas effectively as they pose, formulate, solve and interpret solutions to mathematical problems in a variety of situations.

PISA assesses students’ performance in mathematics through questions related to:

  • Processes: PISA defines three categories of processes: formulating situations mathematically; employing mathematical concepts, facts, procedures and reasoning; and interpreting, applying and evaluating mathematical outcomes. They describe what students do to connect the context of a problem with the mathematics involved and thus solve the problem. These three processes each draw on seven fundamental mathematical capabilities: communicating; mathematising; representing; reasoning and arguing; devising strategies for solving problems; using symbolic, formal and technical language and operations; and using mathematical tools. All of these capabilities draw on the problem solver’s detailed mathematical knowledge about individual topics.

  • Content: These are four ideas (quantity; space and shape; change and relationships; and uncertainty and data) that are related to familiar curricular subjects, such as numbers, algebra and geometry, in overlapping and complex ways.

  • Contexts: These are the settings in a student’s world in which the problems are placed. The framework identifies four contexts: personal, educational, societal and scientific.

Scientific literacy (Chapter 4) is defined as the ability to engage with science-related issues, and with the ideas of science, as a reflective citizen. A scientifically literate person is willing to engage in reasoned discourse about science and technology, which requires the competencies to explain phenomena scientifically, evaluate and design scientific enquiry, and interpret data and evidence scientifically.

PISA assesses students’ performance in science through questions related to:

  • Contexts: These include personal, local/national and global issues, both current and historical, that demand some understanding of science and technology.

  • Knowledge: This is the understanding of the major facts, concepts and explanatory theories that form the basis of scientific knowledge. Such knowledge includes knowledge of both the natural world and technological artefacts (content knowledge), knowledge of how such ideas are produced (procedural knowledge), and an understanding of the underlying rationale for these procedures and the justification for their use (epistemic knowledge).

  • Competencies: These are the ability to explain phenomena scientifically, evaluate and design scientific enquiry, and interpret data and evidence scientifically.

The evolution of reporting student performance in PISA

Results from PISA are reported using scales. Initially, the average score across OECD countries for all three subjects was 500 with a standard deviation of 100, which meant that two-thirds of students across OECD countries scored between 400 and 600 points. These scores represent degrees of proficiency in a particular domain. Scores in subsequent cycles of PISA are calibrated so as to be directly comparable to those in previous cycles; hence the average score across OECD countries in subsequent cycles has fluctuated slightly around the original 500.
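The "two-thirds" figure follows from the normal approximation that PISA's scaling targets: roughly 68% of a normal distribution lies within one standard deviation of the mean. A quick check using only the Python standard library:

```python
import math

# Probability that a normally distributed score falls within one
# standard deviation of the mean: P(|Z| <= 1) = erf(1 / sqrt(2)).
mean, sd = 500, 100
p_within_one_sd = math.erf(1 / math.sqrt(2))
print(f"P({mean - sd} <= score <= {mean + sd}) = {p_within_one_sd:.3f}")  # 0.683
```

The exact share depends on how closely the empirical score distribution approximates the normal; "two-thirds" is the conventional rounding of this 68.3%.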

Reading literacy was the major domain in 2000, and the reading scale was divided into five proficiency levels of knowledge and skills. The main advantage of this approach is that it describes what substantial numbers of students can do with tasks at different levels of difficulty. Results were also presented through three “aspect” subscales of reading: accessing and retrieving information; integrating and interpreting texts; and reflecting and evaluating texts.

PISA 2009 marked the first time that reading literacy was re-assessed as a major domain. Trend results were reported for all three domains – reading, mathematics and science. PISA 2009 added a Level 6 to the reading scale to describe very high levels of reading proficiency. The bottom level of proficiency, Level 1, was renamed Level 1a. Another level, Level 1b, was introduced to describe the performance of students who would previously have been rated as “below Level 1”, but who show proficiency in relation to new items that were easier than those included in previous PISA assessments. These changes allowed countries to know more about what kinds of tasks students with very high and very low reading proficiency were capable of completing.

Reading was once again the major domain of assessment in PISA 2018. The three subscales described above were renamed “locating information”, “understanding”, and “evaluating and reflecting”. Two new subscales that describe students’ literacy with single-source and multiple-source texts were also developed. In addition, the reading scale was extended by adding Level 1c, which better describes the proficiency of the lowest-achieving students. These students show minimal reading literacy; what they can do in reading was not described in the previous PISA reading literacy scales.

The context questionnaires

To gather contextual information, PISA asks students and the principals of their schools to respond to questionnaires. These take about 35 and 45 minutes, respectively, to complete. The responses to the questionnaires are analysed with the assessment results to provide at once a broader and more nuanced picture of student, school and system performance. Chapter 7 presents the questionnaire framework in detail. Some countries/economies asked students to complete an additional well-being questionnaire, new to PISA 2018; the framework for this questionnaire is presented in Chapter 8. The questionnaires from all assessments since PISA’s inception are available on the PISA website: www.oecd.org/pisa/.

The questionnaires seek information about:

  • Students and their family background, including their economic, social and cultural capital

  • Aspects of students’ lives, such as their attitudes towards learning, their habits and life in and outside of school, and their family environment

  • Aspects of schools, such as the quality of the schools’ human and material resources, public and private management and funding, decision-making processes, staffing practices, and the school’s curricular emphasis and extracurricular activities offered

  • Context of instruction, including institutional structures and types, class size, classroom and school climate, and reading activities in class

  • Aspects of learning, including students’ interest, motivation and engagement.

In PISA 2018, five additional questionnaires were offered as options:

  • Computer familiarity questionnaire, focusing on the availability and use of information and communications technology (ICT) and on students’ ability to carry out computer tasks and their attitudes towards computer use

  • Well-being questionnaire, new to PISA 2018, on students’ perceptions of their health, life satisfaction, social connections, and in- and outside-of-school activities

  • Educational career questionnaire, which collects additional information on interruptions in schooling, preparation for students’ future career, and support with language learning

  • Parent questionnaire, focusing on parents’ perceptions of and involvement in their child’s school, their support for learning at home, school choice, their child’s career expectations, and their background (immigrant/non-immigrant)

  • Teacher questionnaire, which asks about teachers’ initial training and professional development, their beliefs and attitudes, and their teaching practices; separate questionnaires were developed for teachers of the test language and for other teachers in the school.

The contextual information collected through the student, school and optional questionnaires comprises only a part of the information available to PISA. Indicators describing the general structure of education systems (their demographic and economic contexts, such as their costs, enrolments, school and teacher characteristics, and some classroom processes) and their effect on labour market outcomes are routinely developed and applied by the OECD (e.g. in the annual OECD publication, Education at a Glance).

A collaborative project

PISA is the result of a collaborative effort between OECD and partner governments. The assessments are developed co-operatively, agreed by participating countries/economies, and implemented by national organisations. The co-operation of students, teachers and principals in participating schools has been crucial to the success of PISA during all stages of development and implementation.

The PISA Governing Board (PGB), composed of representatives at the senior policy level from all participating countries/economies, determines the policy priorities for PISA in the context of OECD objectives. It also oversees adherence to these priorities during the implementation of the programme. The PGB sets priorities for developing indicators, establishing assessment instruments and reporting results. Experts from participating countries/economies also serve on working groups tasked with linking PISA policy objectives with the best available technical expertise in the different assessment domains. By participating in these expert groups, countries/economies ensure that the instruments are internationally valid and take into account differences in cultures and education systems.

Participating countries/economies implement PISA at the national level through National Centres managed by National Project Managers, subject to the agreed administration procedures. National Project Managers play a vital role in ensuring that the implementation is of high quality. They also verify and evaluate survey results, analyses, reports and publications.

The reading framework was developed by the reading expert group with the guidance of John de Jong and Peter Foltz from Pearson. The reading expert group was chaired by Jean-François Rouet (University of Poitiers, France). Other experts who contributed to the reading framework are Paul van den Broek (Universiteit Leiden, the Netherlands), Kevin Chung (University of Hong Kong), Sascha Schroeder (Max Planck Institute for Human Development, Berlin, Germany), Sari Sulkunen (University of Jyväskylä, Finland; also served as the liaison to the PISA global competence expert group), and Dominique Lafontaine (Université de Liège, Belgium; also served as the liaison to the PISA questionnaire expert group).

The global competence framework was developed by Mario Piacentini of the OECD Secretariat with Martyn Barrett (University of Surrey, Guildford, UK), Veronica Boix Mansilla (Harvard University and Project Zero, Cambridge, USA), Darla Deardorff (Duke University, Durham, USA) and Hye-Won Lee (Korea Institute for Curriculum and Evaluation, Jincheon, Korea), with additional help from Rose Bolognini and Natalie Foster (OECD Secretariat), Natasha Robinson (University of Oxford, UK) and Mattia Baiutti (Fondazione Intercultura, Colle di Val d’Elsa, Italy and the University of Udine, Italy). This framework built on earlier work from experts who led the first part of the development of the global competence assessment: Darla Deardorff (Duke University, Durham, USA), David Kerr (University of Reading, UK and YoungCitizens, London, UK), Peter Franklin (HTWG Konstanz University of Applied Sciences, Germany), Sarah Howie (University of Pretoria, South Africa), Wing On Lee (Open University of Hong Kong, China), Jasmine B-Y Sim (National Institute of Education, Singapore), and Sari Sulkunen (University of Jyväskylä, Finland).

The framework for the PISA 2018 questionnaires was developed by the questionnaire expert group with the guidance of John de Jong and Christine Rozunick from Pearson. The questionnaire expert group was chaired by Fons van de Vijver (Tilburg University, the Netherlands; the North-West University, Potchefstroom, South Africa; and the University of Queensland, Brisbane, Australia). Other experts who contributed to the development of the questionnaire framework are Dominique Lafontaine (Université de Liège, Belgium), Sarah Howie (University of Pretoria, South Africa), Andrew Elliot (University of Rochester, USA), Therese Hopfenbeck (University of Oxford, UK) and David Kaplan (University of Wisconsin-Madison, USA).

The framework for the well-being questionnaire was developed by Jonas Bertling (ETS). The frameworks for the mathematics and science assessments received their last major updates when they were the major domain of assessment (2012 for mathematics, 2015 for science).

Pearson facilitated the development of the reading and questionnaire frameworks. The Educational Testing Service (ETS) was responsible for managing and overseeing this survey; developing the instruments, scaling and analysis; and creating the electronic platform. Other partners or subcontractors involved with ETS include the Department of Experimental and Theoretical Pedagogy at the Université de Liège (aSPe) in Belgium and the Educational Measurement and Research Centre (EMACS) of the University of Luxembourg in Luxembourg. Westat assumed responsibility for survey operations and sampling with the subcontractor, the Australian Council for Educational Research (ACER). cApStAn Linguistic Quality Control assumed responsibility for ensuring the linguistic equivalence of all language versions.

The OECD Secretariat has overall managerial responsibility for the programme, monitors its implementation on a day-to-day basis, acts as the secretariat for the PGB, builds consensus among countries, and serves as the interlocutor between the PGB and the contractors charged with implementation. The OECD Secretariat is also responsible for producing the indicators, and for the analysis and preparation of the international reports and publications, in co-operation with the contractors and in close consultation with participating countries/economies at both the policy (PGB) and implementation (National Project Managers) levels.
