6. Relationship between the Survey of Adult Skills (PIAAC) and the OECD Programme for International Student Assessment (PISA)

This chapter explains how the Survey of Adult Skills (PIAAC) and the OECD Programme for International Student Assessment (PISA) are related. Although the two assessments define skills in similar ways, they differ significantly in other respects, including their target populations and the measures used to assess skills.

In most of the countries/economies participating in the Survey of Adult Skills (PIAAC), respondents aged 16-30 are members of cohorts that have taken part in the OECD Programme for International Student Assessment (PISA). In addition, both PISA and the Survey of Adult Skills assess ostensibly similar skills – in particular literacy and numeracy, but also problem solving. Given the overlap in the cohorts assessed and in the content of the assessments, it is important that users understand the similarities and differences between the two studies and the extent to which their results can be compared.

This chapter provides an overview of the relationship between the Survey of Adult Skills and PISA and emphasises two key points. First, the Survey of Adult Skills was not designed to be linked psychometrically to PISA. Even in those areas in which there are the greatest links conceptually (in the domains of literacy/reading literacy and numeracy/mathematical literacy), the measurement scales are distinct. Second, the conceptualisation of the skills of literacy and numeracy in the Survey of Adult Skills has much in common with that of the skills of reading literacy and mathematical literacy in PISA.

PISA cohorts in the target population of the Survey of Adult Skills (PIAAC)

The target population for the Survey of Adult Skills includes the cohorts that participated in PISA between 2000 and 2015. Table 6.1 presents the ages of these cohorts at the time of the data collection for Rounds 1, 2 and 3 of the Survey of Adult Skills (2011-12, 2014-15 and 2017-18).

Table 6.1. Age of PISA cohorts in 2011-12, 2014-15 and 2017-18

|           | Age in 2011-12 | Age in 2014-15 | Age in 2017-18 |
|-----------|----------------|----------------|----------------|
| PISA 2000 | 26-27          | 29-30          | 32-33          |
| PISA 2003 | 23-24          | 26-27          | 29-30          |
| PISA 2006 | 20-21          | 23-24          | 26-27          |
| PISA 2009 | 17-18          | 20-21          | 23-24          |
| PISA 2012 |                | 17-18          | 20-21          |
| PISA 2015 |                |                | 17-18          |
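The entries in Table 6.1 follow from simple arithmetic: PISA tests students who are roughly 15 years old in the assessment year, and each PIAAC data collection spans two calendar years, which is why every cell is a two-year range. A minimal sketch of that calculation (the function name and signature are illustrative, not taken from either survey):

```python
def cohort_age_range(pisa_year: int, collection_window: tuple) -> str:
    """Age range of a PISA cohort (aged ~15 in pisa_year) during a
    two-year PIAAC data-collection window, e.g. (2011, 2012)."""
    start, end = collection_window
    low = 15 + (start - pisa_year)   # age at the start of the window
    high = 15 + (end - pisa_year)    # age at the end of the window
    return f"{low}-{high}"

# Reproduce the first column of Table 6.1 (Round 1 collection, 2011-12)
for pisa_year in (2000, 2003, 2006, 2009):
    print(pisa_year, cohort_age_range(pisa_year, (2011, 2012)))
```

Running the loop prints 26-27, 23-24, 20-21 and 17-18 for the 2000, 2003, 2006 and 2009 cohorts respectively, matching the table.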

Differences in the target populations

As noted above, several “PISA cohorts” are included in the population assessed in the Survey of Adult Skills. There are differences in coverage of these cohorts in PISA and the adult survey which need to be taken into account in any comparison of the results from the two assessments. In particular, the target population of the Survey of Adult Skills is broader than that of PISA and the PISA cohorts assessed by it include individuals who were not part of the PISA target population.

The target population of PISA is young people aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period who were enrolled in an educational institution at Grade 7 or above (OECD, 2010a). Fifteen-year-olds who are not enrolled in an educational institution are not tested as part of PISA, and in all countries participating in the four rounds of PISA between 2000 and 2009, a proportion of 15-year-olds were out of school or in grades lower than Grade 7. In 2009, for example, the PISA sample represented between 82% (United States) and 94% (Belgium) of the 15-year-old population in the countries covered in this report (OECD, 2010a, Table A2.1).

The target population for the Survey of Adult Skills, by contrast, is the entire resident population. The “PISA cohorts” surveyed in the Survey of Adult Skills therefore include both persons who were at school at age 15 (and thus part of the PISA target population) and those who were out of school at that age (and thus outside the PISA target population). Irrespective of any other considerations, these different rates of coverage are relevant to comparisons of the results of the two surveys for the “PISA cohorts”. In particular, it seems likely that, in most countries, mean proficiency scores for the full 15-year-old cohort would have been lower than those observed for 15-year-olds who were in school,1 as the available evidence suggests that early school-leavers are less proficient than students who continue in schooling (see, for example, Bushnik, Barr-Telford and Bussière, 2003; Fullarton et al., 2003).

Skills assessed

Table 6.2 shows the skill domains assessed in the Survey of Adult Skills and those assessed in the rounds of PISA administered between 2000 and 2015. As can be seen, both studies assess skills in the domains of literacy, numeracy/mathematics and problem solving. The one area in which there is no overlap is scientific literacy.

Table 6.2. Comparison of the Survey of Adult Skills (PIAAC) and PISA: Skills assessed

| Survey of Adult Skills (PIAAC) | PISA |
|---|---|
| Literacy | Reading literacy (2000, 2003, 2006, 2009, 2012, 2015); Electronic reading (2009) |
| Numeracy | Mathematical literacy (2000, 2003, 2006, 2009, 2012, 2015) |
| Problem solving in technology-rich environments | Problem solving (2003, 2012) |

Psychometric links

The Survey of Adult Skills was not designed to allow direct comparisons of its results with those of PISA. Despite similarities in the broad approach to defining the skills assessed, the two surveys include no common items, and the results from the two surveys cannot be treated as being on the same scale in any of the domains that they ostensibly have in common.

An objective of the first round of PISA was to establish a psychometric link between PISA and the International Adult Literacy Survey (IALS) in the domain of literacy (see OECD, 1999, p. 29). Fifteen prose items from IALS were embedded in the PISA 2000 test booklets for the main study. Items from IALS were not included in the assessments of reading literacy conducted in subsequent rounds of PISA, however.

The outcomes of an analysis investigating whether students taking the PISA 2000 assessment could be placed on the IALS prose literacy scale are reported in Yamamoto (2002) and in Reading for Change: Performance and Engagement across Countries: Results from PISA 2000 (OECD, 2002). Yamamoto concluded that PISA students could be placed on the IALS prose literacy scale.2 Chapter 8 of Reading for Change (OECD, 2002) presents the distribution of students in participating countries across the five IALS proficiency levels.

The relationships between constructs in the domains of literacy, numeracy and problem solving

While there has been no attempt to link the Survey of Adult Skills to PISA in any assessment domain, the two studies share a similar approach to assessment, both in their broad orientation and in the definition of the domains assessed.

Both the Survey of Adult Skills and PISA hold an action-oriented or functional conception of skills. The object of interest is the application and use of knowledge and know-how in common life situations as opposed to the mastery of a body of knowledge or of a repertoire of techniques. In defining assessment domains, the emphasis is placed on the purposive and reflective use and processing of information to achieve a variety of goals. To this end, in both studies, the skills assessed are defined in terms of a set of behaviours through which the skill is manifested and a set of goals that the behaviours in question are intended to achieve.

The Survey of Adult Skills and PISA also share a common approach to the specification of the constructs measured.3 The frameworks defining the constructs specify their features in terms of three dimensions: content, cognitive processes and context. The dimension of content (“knowledge domain” in PISA) relates to the artefacts, tools, knowledge, representations, cognitive challenges, etc. that constitute the corpus to which an individual (an adult, in the case of the Survey of Adult Skills; a 15-year-old student, in the case of PISA) must respond or that he or she must use. Cognitive processes (“competencies” in PISA) cover the mental processes that individuals bring into play to respond to or use given content in an appropriate manner. Context (“context and situation” in PISA) refers to the different situations in which individuals read, display numerate behaviour, solve problems or use scientific knowledge.

The similarities and differences between the conceptualisation of the domains of literacy, numeracy and problem solving in the Survey of Adult Skills and that of reading literacy, mathematical literacy and problem solving in PISA are discussed below through a comparison of the respective assessment frameworks. The discussion focuses on the assessment frameworks that guided the PISA assessments over the period 2000-2012, i.e. those relevant to the development of the PIAAC frameworks, which took place over 2008-09.

Literacy

Table 6.3 provides a summary of the definition and the content, processes and context dimensions of the literacy framework of the Survey of Adult Skills and the reading literacy framework for PISA.

Table 6.3. Comparison of the Survey of Adult Skills (PIAAC) and PISA: Literacy

Definition

Survey of Adult Skills (PIAAC): The ability to understand, evaluate, use and engage with written texts to participate in society, to achieve one’s goals, and to develop one’s knowledge and potential.

PISA: The capacity to understand, use, reflect on and engage with written texts, in order to achieve one’s goals, to develop one’s knowledge and potential, and to participate in society.

Content

Survey of Adult Skills (PIAAC): Different types of text, characterised by their medium (print-based or digital) and by their format:

  • Continuous or prose texts, which involve, for example, narration, argumentation or description

  • Non-continuous or document texts, for example tables, lists and graphs

  • Mixed texts, which involve combinations of prose and document elements

  • Multiple texts, which consist of the juxtaposition or linking of independently generated elements

PISA: The form of reading materials:

  • Continuous texts, including different kinds of prose such as narration, exposition and argumentation

  • Non-continuous texts, including graphs, forms and lists

  • Digital and print media (from 2009)

Cognitive processes

Survey of Adult Skills (PIAAC): Access and identify; Integrate and interpret (relating parts of a text to one another); Evaluate and reflect

PISA: Retrieving information; Interpreting texts; Reflecting on and evaluating texts

Contexts

Survey of Adult Skills (PIAAC): Personal; Work; Community; Education

PISA: Personal (e.g. a personal letter); Occupational (e.g. a report); Public (e.g. an official document); Educational (e.g. school-related reading)

Content

The Survey of Adult Skills and PISA (2000-12) share a common conceptualisation of the texts forming the corpus of written materials to which test-takers respond. Text formats are categorised as continuous (prose), non-continuous (document), mixed and multiple texts. In terms of their type or rhetorical character, there is considerable overlap in the categorisations used. Both frameworks identify description, narration, exposition, argumentation and instructions. The framework for the Survey of Adult Skills also includes the additional category of “records” (the documentation of decisions and events), and the PISA framework (OECD, 2010b, p. 33) identifies the text type “transaction” (a text that aims to achieve a specific purpose outlined in the text, such as requesting that something be done, organising a meeting or making a social engagement with a friend). There is some variation in the distribution by format of the texts used in the actual assessments: mixed texts are the most frequent text format in the Survey of Adult Skills, whereas continuous texts are the format most frequently found in PISA.4

Cognitive processes

PISA 2000 identified five types of cognitive processes required to understand and respond to texts, which were grouped into three broader categories (“access and retrieve”, “integrate and interpret” and “evaluate and reflect”) for the purpose of analysis. By PISA 2009, only the three broader categories were retained. The framework for the Survey of Adult Skills uses the same three categories to organise the cognitive operations used in reading. In the actual assessments, the Survey of Adult Skills includes a greater share of access and retrieve tasks than does PISA, while PISA includes a greater proportion of items requiring evaluation and reflection. This reflects the different expert groups’ judgements as to the relative importance of the different types of tasks performed by 15-year-olds and adults in their ordinary reading.

Contexts

Reading is a purposeful activity that takes place in a context. While actual contexts cannot be simulated in an assessment, the frameworks of both assessments seek to ensure reasonable coverage of the contexts in which reading occurs. Although they use slightly different wording, the two frameworks conceive of these contexts in similar ways (see Table 6.3 above), with a broadly comparable distribution of items by type of context.

Response formats

The two assessments differ in terms of the format in which test-takers respond to test items. In the adult reading assessment, respondents provide answers by highlighting sections of text (selected response) in the computer-based version of the assessment, or by writing answers (constructed response) in the appropriate location in the paper-based version. The PISA reading assessment uses a wider variety of response formats, including standard multiple choice, complex multiple choice (where several selected response tasks have to be completed for a correct response), simple constructed response (where there is a single correct answer) and complex constructed response (where there are many possible ways to state the correct answer).

Numeracy

Table 6.4 provides a summary of the definition and the content, processes and context dimensions of the numeracy framework of the Survey of Adult Skills and the mathematical literacy framework for PISA. The similarities and differences are explored in more detail below.

Table 6.4. Comparison of the Survey of Adult Skills (PIAAC) and PISA: Numeracy

Definition

Survey of Adult Skills (PIAAC): The ability to access, use, interpret and communicate mathematical information and ideas, in order to engage in and manage the mathematical demands of a range of situations in adult life.

PISA: The capacity to identify and understand the role that mathematics plays in the world, to make well-founded judgements and to use and engage with mathematics in ways that meet the needs of that individual’s life as a constructive, concerned and reflective citizen.

Content

Survey of Adult Skills (PIAAC): Quantity and number; Dimension and shape; Pattern, relationships and change; Data and chance

PISA: Quantity; Space and shape; Change and relationships; Uncertainty and data

Cognitive processes

Survey of Adult Skills (PIAAC): Identify, locate or access; Act upon and use (order, count, estimate, compute, measure, model); Interpret, evaluate and analyse; Communicate

PISA: Reproduction (simple mathematical operations); Connections (bringing together ideas to solve straightforward problems); Reflection (wider mathematical thinking)

Contexts

Survey of Adult Skills (PIAAC): Everyday life; Work-related; Community and society; Education and training

PISA: Personal; Educational and occupational; Public; Scientific

Content

Both assessments cover closely related content areas in mathematical literacy/numeracy (e.g. “dimension and shape” in the Survey of Adult Skills and “space and shape” in PISA). The spread of items across the content areas is very similar in both assessments, although the Survey of Adult Skills puts slightly greater emphasis on “quantity and number” than on “pattern, relationships and change”. The content descriptions in the PISA frameworks include more knowledge of formal mathematical content than do those of the Survey of Adult Skills. Some items in PISA require formal, school-based mathematics (e.g. identifying the gradient of a linear equation), while this type of knowledge is not required in the Survey of Adult Skills. PISA and the survey also differ slightly in the breadth of content they cover. As PISA measures the skills of 15-year-old students only, it focuses on secondary school-level mathematics. In contrast, the Survey of Adult Skills assesses skills across the entire adult population and, as a result, includes items that assume low levels of completed schooling (e.g. the early primary years). For example, some of the easiest items in PISA require comparing and interpreting data in complex tables of values, which include numbers in the tens and hundreds of thousands. In the Survey of Adult Skills, one of the easiest items requires recognising the smallest number in a one-column table of numbers less than one hundred.

Cognitive processes

The cognitive processes respondents are expected to display are similar in the two assessments. However, unlike the content areas and contexts, the two sets of classifications do not match exactly. One difference is that the framework for the Survey of Adult Skills includes “communicate” as a category of cognitive process. However, owing to the move to computer-based assessment, few items in the final assessment were classified as belonging to this category.

Contexts

A key feature of both assessments is that proficiency is assessed through problems set in context. Both assessments identify four contexts, with an approximately equal spread of items across each. The four categories of context are similar in the respective frameworks (e.g. “everyday life” in the Survey of Adult Skills is very similar to “personal” in PISA). The category of “education and training” in the survey does not exactly mirror the category of “scientific” contexts in PISA, but there is still considerable overlap between them. The minor differences between the contexts used in the two frameworks reflect the different ages of the assessments’ target groups.

Representation and reading demands

PISA and the Survey of Adult Skills use a similar range of forms to convey mathematical information in real-life situations. These include, for example, objects to be counted (e.g. people, cars), symbolic notation (e.g. letters, operation signs), diagrams and tables. Texts may also play an important role, either by containing mathematical information in textual form (e.g. “five” instead of “5”, “crime rate increased by half”) or by containing additional information that needs to be interpreted as part of the context. In both the survey and PISA 2012, there was an effort to reduce reading demands in order to distinguish performance in numeracy more clearly from performance in literacy. In both assessments, this was achieved by minimising the amount of text and making it less complex, as well as by using supporting photos, images and illustrations. Most items are similar in their reading demands, although PISA contains some items with more complex text (e.g. with formal mathematical terminology), while the Survey of Adult Skills includes items with very little text. This reflects the differences in the breadth of content assessed by the two surveys, as described above.

Item formats

There are some differences between PISA and the Survey of Adult Skills in the range of item types used, owing to operational constraints on the survey. Given its computer-based adaptive approach, the survey used short, separate tasks and selected-response (multiple-choice) items. This still allowed respondents to answer in different modes (e.g. choosing from a pull-down menu, clicking on an area of the screen), but limited the survey’s capacity to assess communication-related skills (e.g. describing one’s analysis of a situation). PISA used a wider range of formats, with both constructed-response and selected-response items. In addition, the optional computer-based component of PISA also used some interactive items (e.g. animation).

Complexity schemes

The frameworks for the Survey of Adult Skills and PISA each contain a scheme describing the factors that affect item complexity. These schemes were used for different purposes, including designing items and describing performance levels. The survey’s scheme considers the textual and mathematical aspects of complexity separately. Textual aspects include, for example, whether the problem is obvious or hidden; mathematical aspects include, for example, the complexity of the data presented and how many operations respondents are expected to perform. The PISA framework approaches complexity from a different angle: its scheme is based on a set of mathematical capabilities that underpin mathematical modelling (e.g. mathematising, reasoning and argument, using symbols, and devising strategies for solving problems).

Problem solving

Table 6.5 provides a summary of the definition and the content, processes and context dimensions of the framework for problem solving in technology-rich environments in the Survey of Adult Skills and the problem-solving frameworks for PISA 2003 and 2012 (OECD, 2004, 2013).

Of the three domains discussed in this chapter, problem solving is the one in which there is the least relationship between the constructs assessed. In particular, the frameworks for problem solving in technology-rich environments and for problem solving in PISA 2003 and 2012 conceive the “content” dimension of their respective constructs in very different ways. The Survey of Adult Skills integrates a technology dimension that is not present in the PISA 2003 framework. Problem solving in PISA 2012 does include a technology dimension, but it is conceived differently in the two assessments. In PISA 2012, the technology dimension is instantiated in the form of simulated devices, such as an MP3 player or an air-conditioning system. In the Survey of Adult Skills, it takes the form of different applications (e.g. web browsers, webpages, email, spreadsheets) through which the information necessary to solve the problem situation is presented and which test-takers must use to solve the problem. In addition, the problem situation is conceived in different terms in the three studies: in relation to complexity and explicitness in the Survey of Adult Skills, by type of problem in PISA 2003, and in terms of static and interactive problems in PISA 2012.

Table 6.5. Comparison of the Survey of Adult Skills (PIAAC) and PISA: Problem solving

Definition

Survey of Adult Skills (PIAAC): The ability to use digital technology, communication tools and networks to acquire and evaluate information, communicate with others and perform practical tasks. The assessment focuses on the ability to solve problems for personal, work and civic purposes by setting up appropriate goals and plans, and accessing and making use of information through computers and computer networks.

PISA 2003: An individual’s capacity to use cognitive processes to confront and resolve real, cross-disciplinary situations in which the solution path is not immediately obvious and where the literacy domains or curricular areas that might be applicable are not within a single domain of science, mathematics or reading.

PISA 2012: An individual’s capacity to engage in cognitive processing to understand and resolve problem situations where a method of solution is not immediately obvious. It includes the willingness to engage with such situations in order to achieve one’s potential as a constructive and reflective citizen.

Content

Survey of Adult Skills (PIAAC): Technology:

  • Hardware devices

  • Software applications

  • Commands and functions

  • Representations (e.g. text, graphics, video)

Nature of problems:

  • Intrinsic complexity, which includes the number of steps required for solution, the number of alternatives, the complexity of computation and/or transformation, and the number of constraints

  • Explicitness of the problem statement, for example largely unspecified or described in detail

PISA 2003, problem types:

  • Decision making

  • System analysis and design

  • Trouble shooting

PISA 2012, problem-solving situations:

  • Static problem situations

  • Interactive problem situations

Cognitive processes

Survey of Adult Skills (PIAAC): Setting goals and monitoring progress; Planning; Acquiring and evaluating information; Using information

PISA 2003: Understanding; Characterising; Representing; Solving; Reflecting; Communicating

PISA 2012: Exploring and understanding; Representing and formulating; Planning and executing; Monitoring and reflecting

Contexts

Survey of Adult Skills (PIAAC): Personal; Work and occupation; Civic

PISA 2003: Personal life; Work and leisure; Community and society

PISA 2012: Setting (technological or non-technological); Focus (personal or social)

Conclusion

In sum, the Survey of Adult Skills and PISA share a similar broad approach to assessment and there is considerable commonality in the way in which the skills of literacy/reading literacy and numeracy/mathematical literacy are conceptualised and defined in the two studies. The overlap is greater in the case of literacy and reading literacy. The differences between the two studies in these domains relate, at least in part, to the different target populations: adults in the case of the Survey of Adult Skills, and 15-year-old students in the case of PISA. At least in the domains of literacy/reading and numeracy/mathematics, the Survey of Adult Skills and PISA can be regarded as measuring much the same skills in much the same way. At the same time, different measures are used in the two studies. The literacy and the numeracy scales used in the Survey of Adult Skills are not the same as their counterparts in PISA. While it would be expected that a high performer in reading literacy in PISA would be a relatively high performer in the Survey of Adult Skills, it is not possible to identify with any accuracy where a 15-year-old with a particular reading literacy or mathematics score in PISA would be located on the literacy or numeracy scales of the Survey of Adult Skills. In the absence of evidence from a study linking the two assessments, caution is advised in comparing the results of the two assessments.

References

Bushnik, T., L. Barr-Telford and P. Bussière (2003), In and Out of High School: First Results from the Second Cycle of the Youth in Transition Survey, 2002, Statistics Canada and Human Resources and Skills Development Canada, Ottawa.

Fullarton, S., M. Walker, J. Ainley and K. Hillman (2003), Patterns of Participation in Year 12, Longitudinal Surveys of Australian Youth Research Report 33, ACER, Camberwell, www.lsay.edu.au/publications/1857.html.

Gal, I. and D. Tout (2014), “Comparison of PIAAC and PISA Frameworks for Numeracy and Mathematical Literacy”, OECD Education Working Papers, No. 102, OECD Publishing, Paris, https://doi.org/10.1787/5jz3wl63cs6f-en.

OECD (2013), PISA 2012 Assessment and Analytical Framework: Mathematics, Reading, Science, Problem Solving and Financial Literacy, OECD Publishing, Paris, https://doi.org/10.1787/9789264190511-en.

OECD (2010a), PISA 2009 Results: Learning Trends: Changes in Student Performance Since 2000 (Volume V), OECD Publishing, Paris, https://doi.org/10.1787/9789264091580-en.

OECD (2010b), PISA 2009 Assessment Framework: Key Competencies in Reading, Mathematics and Science, OECD Publishing, Paris, https://doi.org/10.1787/9789264062658-en.

OECD (2004), The PISA 2003 Assessment Framework: Mathematics, Reading, Science and Problem Solving Knowledge and Skills, OECD Publishing, Paris, https://doi.org/10.1787/9789264101739-en.

OECD (2002), Reading for Change: Performance and Engagement across Countries: Results from PISA 2000, OECD Publishing, Paris, https://doi.org/10.1787/9789264099289-en.

OECD (1999), Measuring Student Knowledge and Skills: A New Framework for Assessment, OECD Publishing, Paris, https://doi.org/10.1787/9789264173125-en.

Yamamoto, K. (2002), Estimating PISA Students on the IALS Prose Literacy Scale, www.oecd.org/edu/preschoolandschool/programmeforinternationalstudentassessmentpisa/33680659.pdf.

Notes

← 1. Fifteen-year-olds in home schooling may constitute an exception.

← 2. Some block-order effects (responses were affected by where the items were placed in the assessment) were found in respect of the IALS items in PISA that were not present in IALS.

← 3. This reflects the influence of the IALS frameworks on the development of both the PISA literacy framework (see OECD, 1999) and the literacy framework of the Survey of Adult Skills (PIAAC).

← 4. Multiple texts dominate in the electronic reading assessment of PISA.

Metadata, Legal and Rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

https://doi.org/10.1787/f70238c7-en

© OECD 2019

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at http://www.oecd.org/termsandconditions.
