PISA

ISSN: 1996-3777 (online)
ISSN: 1990-8539 (print)
DOI: 10.1787/19963777

A series of reports on the OECD’s Programme for International Student Assessment (PISA), a periodic testing programme of student performance. The reports generally compare the academic performance of 15-year-old students across countries, or discuss the methodology used to gather the data.

Also available in: French, German
PISA 2003 Technical Report

Author(s): OECD
Publication Date: 26 July 2005
Pages: 432
ISBN: 9789264010543 (PDF); 9789264010536 (print)
DOI: 10.1787/9789264010543-en


The PISA 2003 Technical Report describes the complex methodology underlying PISA 2003, together with the project’s implementation, at a level of detail that allows researchers to understand and replicate its analyses. It presents information on the test and sample design, the methodologies used to analyse the data, the technical features of the project and the quality control mechanisms.

Table of Contents

  • The Programme for International Student Assessment
    The OECD’s Programme for International Student Assessment (PISA) is a collaborative effort among OECD member countries to measure how well 15-year-old young adults approaching the end of compulsory schooling are prepared to meet the challenges of today’s knowledge societies.
  • Test Design and Test Development
    In PISA 2003, four subject domains were tested, with mathematics as the major domain, and reading, science and problem solving as minor domains. Student achievement in mathematics was assessed using 85 test items representing approximately 210 minutes of testing time.
  • The Development of the PISA Context Questionnaires
    In addition to the assessment of the achievement of 15-year-old students in reading, science, mathematics and problem-solving skills, PISA 2003 also included the collection of information on the characteristics of students and their schools.
  • Sample Design
    The desired base PISA target population in each country consisted of 15-year-old students attending educational institutions located within the country, in grades 7 and higher. This meant that countries were to include 15-year-olds enrolled in educational institutions either full-time or part-time, students in vocational training or other related types of programme, and students attending foreign schools within the country (as well as students from other countries attending any of the programmes in the first three categories).
  • Translation and Cultural Appropriateness of the Test and Survey Material
    Translation errors are known to be a major cause of items functioning poorly in international tests. Translation errors are much more frequent than other problems, such as clearly identified discrepancies due to cultural biases or curricular differences.
  • Field Operations
    PISA was implemented in each country by a National Project Manager (NPM). The NPM implemented procedures prepared by the consortium and agreed upon by participating countries. To implement the assessment in schools, the NPMs were assisted by school co-ordinators and test administrators. Each NPM typically had several assistants, working from a base location that is referred to throughout this report as a national centre.
  • Monitoring the Quality of PISA
    It is essential that users of the PISA data have confidence that the data collection activities have been undertaken to a high standard. The quality assurance that provides this confidence consists of two methods: the first is to carefully develop and document procedures that will result in data of the desired quality; the second is to monitor and record the implementation of the documented procedures.
  • Survey Weighting and the Calculation of Sampling Variance
    Survey weights were required to analyse PISA 2003 data, to calculate appropriate estimates of sampling error, and to make valid estimates and inferences. The consortium calculated survey weights for all assessed, ineligible and excluded students, and provided variables in the data that permit users to make approximately unbiased estimates of standard errors, to conduct significance tests and to create confidence intervals appropriately, given the sample design for PISA in each individual country. A sketch of this replication approach to variance estimation appears after this table of contents.
  • Scaling PISA Cognitive Data
    The mixed coefficients multinomial logit model described by Adams et al. (1997) was used to scale the PISA data, implemented in the ConQuest software (Wu et al., 1997). A sketch of its simplest special case appears after this table of contents.
  • Coding Reliability Studies
    As described in Chapter 2, a substantial proportion of the PISA 2003 items were open-ended and required coding by trained personnel. It was therefore important that PISA implemented procedures that maximised the validity and consistency of this coding, both within and between countries.
  • Data Cleaning Procedures
    National project managers (NPMs) were required to submit their national data in KeyQuest, the generic data entry package developed by consortium staff and pre-configured to include the data entry forms, referred to later as instruments: the achievement test booklets 1 to 13 (together making up the cognitive data);
  • Sampling Outcomes

    This chapter reports on PISA sampling outcomes. Details of the sample design are given in Chapter 4.

    Table 12.1 shows the various quality indicators for population coverage, and the various pieces of information used to derive them. The following notes explain the meaning of each coverage index and how the data in each column of the table were used.

  • Scaling Outcomes
    When main study data were received from each participating country, they were first verified and cleaned using the procedures outlined in Chapter 11. Files containing the achievement data were prepared and national-level Rasch and traditional test analyses were undertaken. The results of these analyses were included in the reports that were returned to each participant.
  • Outcomes of Coder Reliability Studies
    This chapter reports the results of the various coder reliability studies that were implemented. The methodologies for these studies are described in Chapter 10.
  • Data Adjudication
    This chapter describes the process used to adjudicate the implementation of PISA 2003 in each of the participating countries, and gives the outcomes of the data adjudication.
  • Proficiency Scale Construction
    The PISA test design makes it possible to use techniques of modern item response modelling (sometimes referred to as item response theory, or IRT) to simultaneously estimate the ability of all students taking the PISA assessment and the difficulty of all PISA items, locating these estimates of student ability and item difficulty on a single continuum. The mapping of that continuum onto the PISA reporting scale is sketched after this table of contents.
  • Scaling Procedures and Construct Validation of Context Questionnaire Data
    The PISA 2003 context questionnaires included numerous items on student characteristics, student family background, student perceptions, school characteristics and school principals’ perceptions. Though some of these questions can be analysed as single items (for example, gender), most questions were designed to measure latent constructs that cannot be observed directly. Here, transformations or scaling procedures are needed to construct meaningful indices.
  • International Database
    The PISA international database consists of three data files: two student-level files and one school-level file. All are provided in text (or ASCII) format with the corresponding SAS and SPSS control files. A sketch of reading one of these fixed-width files appears after this table of contents.
  • Appendix 1
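
The survey weighting chapter above refers to variables that allow users to estimate standard errors under each country’s sample design. PISA does this with Fay’s variant of balanced repeated replication: a full-sample weight plus 80 replicate weights, with a Fay factor of 0.5. The Python sketch below shows only the shape of that computation; the function names, column names and toy data are placeholders, not the PISA database layout.

    import pandas as pd

    def weighted_mean(df, weights):
        # Weighted mean of a hypothetical "score" column.
        return (df["score"] * weights).sum() / weights.sum()

    def fay_brr_variance(stat, df, full_weight, replicate_weights, fay_k=0.5):
        # Full-sample estimate, then one re-estimate per replicate weight.
        theta = stat(df, df[full_weight])
        g = len(replicate_weights)
        squared_dev = sum((stat(df, df[w]) - theta) ** 2
                          for w in replicate_weights)
        # Fay's BRR: V = (sum of squared deviations) / (G * (1 - k)^2);
        # with G = 80 and k = 0.5 this reduces to the sum divided by 20.
        return squared_dev / (g * (1.0 - fay_k) ** 2)

    # Toy data with two replicate weights standing in for PISA's 80.
    df = pd.DataFrame({"score": [480.0, 520.0, 510.0],
                       "w_full": [1.2, 0.8, 1.0],
                       "w_rep1": [1.4, 0.6, 1.0],
                       "w_rep2": [1.0, 1.0, 1.0]})
    print(fay_brr_variance(weighted_mean, df, "w_full", ["w_rep1", "w_rep2"]))

Because the statistic is passed in as a callable, the same replication machinery serves means, percentages and regression coefficients alike.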
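
The scaling chapter above names the mixed coefficients multinomial logit model, fitted with the ConQuest software. As a minimal sketch, the snippet below implements its simplest special case, the Rasch model for dichotomous items, with a Newton-Raphson ability estimate for one student while item difficulties are held fixed. ConQuest estimates item and population parameters jointly across countries, so this is illustrative only, and all names are ours.

    import math

    def rasch_prob(theta, delta):
        # P(correct) under the Rasch model: logistic in (ability - difficulty).
        return 1.0 / (1.0 + math.exp(-(theta - delta)))

    def ml_ability(responses, difficulties, tol=1e-6):
        # Newton-Raphson maximum-likelihood estimate of one student's ability.
        # (The estimate diverges for all-correct or all-incorrect patterns.)
        theta = 0.0
        for _ in range(100):
            probs = [rasch_prob(theta, d) for d in difficulties]
            gradient = sum(x - p for x, p in zip(responses, probs))
            information = sum(p * (1.0 - p) for p in probs)
            step = gradient / information
            theta += step
            if abs(step) < tol:
                break
        return theta

    # Five dichotomous items of increasing difficulty (in logits).
    print(ml_ability([1, 1, 1, 0, 0], [-1.0, -0.5, 0.0, 0.5, 1.0]))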
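
The proficiency scale chapter above places student ability and item difficulty on a single logit continuum. For reporting, PISA maps that continuum onto a scale with a mean of 500 and a standard deviation of 100 across OECD students. The sketch below shows the shape of that linear transformation, assuming the OECD mean and standard deviation are already in hand; the report’s actual procedure works with weighted plausible values rather than point estimates.

    from statistics import mean, stdev

    def to_pisa_scale(thetas, oecd_mean, oecd_sd):
        # Linear map from logits to the PISA reporting metric
        # (OECD mean 500, standard deviation 100).
        return [500.0 + 100.0 * (t - oecd_mean) / oecd_sd for t in thetas]

    # Toy calibration: standardise against the sample itself.
    abilities = [-0.8, 0.0, 0.4, 1.1]
    print(to_pisa_scale(abilities, mean(abilities), stdev(abilities)))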
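
The international database chapter above notes that the three data files are distributed as ASCII text together with SAS and SPSS control files that define the record layout. Such fixed-width files can also be read directly, as sketched below; the file name, column positions and variable names here are invented for illustration, since the real layout comes from the control files.

    import pandas as pd

    # Hypothetical layout; substitute the positions given in the
    # SAS or SPSS control files shipped with the database.
    colspecs = [(0, 3), (3, 8), (8, 13), (13, 20)]
    names = ["cnt", "schoolid", "studentid", "pv1math"]

    students = pd.read_fwf("student_file.txt", colspecs=colspecs, names=names)
    print(students.head())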