Annex A1. Construction of indices

Explanation of the indices

This section explains the indices derived from the PISA 2018 student, school, parent and ICT questionnaires used in this volume. Several PISA measures reflect indices that summarise responses from students, their parents, teachers or school representatives (typically principals) to a series of related questions. The questions were selected from a larger pool on the basis of theoretical considerations and previous research. The PISA 2018 Assessment and Analytical Framework (OECD, 2019[1]) provides an in-depth description of this conceptual framework. Item response theory (IRT) modelling was used to confirm the theoretically expected behaviour of the indices and to validate their comparability across countries. For a detailed description of the methods, see the section “Cross-country comparability of scaled indices” in this chapter, and the PISA 2018 Technical Report (OECD, forthcoming[2]). There are three types of indices: simple indices, new scale indices and trend scale indices.

Simple indices are variables constructed through the arithmetic transformation or recoding of one or more items, in exactly the same way across assessments. Here, item responses are used to calculate meaningful variables, such as the recoding of four-digit ISCO-08 codes into the “Highest parents’ socio-economic index (HISEI)”, or the teacher-student ratio based on information from the school questionnaire.

Scale indices are variables constructed through the scaling of multiple items. Unless otherwise indicated, the index was scaled using a two-parameter item-response model (a generalised partial credit model in the case of items with more than two categories), and values of the index correspond to weighted likelihood estimates (WLE) (Warm, 1989[3]). For details on how each scale index was constructed, see the PISA 2018 Technical Report (OECD, forthcoming[2]). In general, the scaling was done in two stages:

  • The item parameters were estimated based on all students from equally-weighted countries and economies; only cases with a minimum number of three valid responses to items that are part of the index were included. In the case of some trend indices, a common calibration linking procedure was used: countries/economies that participated in both PISA 2009 and PISA 2018 contributed both samples to the calibration of item parameters; each cycle and, within each cycle, each country/economy contributed equally to the estimation.1

  • For new scale indices, the Warm likelihood estimates were then standardised so that the mean of the index value for the OECD student population was zero and the standard deviation was one (countries were given equal weight in the standardisation process).
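As an illustration of this second stage, the standardisation of WLE scores can be sketched as follows. The function name, the inverse-sample-size weighting scheme used to give countries equal weight, and the data layout are all assumptions for illustration, not the official PISA scaling code.

```python
from collections import defaultdict
from math import sqrt

def standardise_index(wle, country, oecd_countries):
    """Standardise WLE scores so that the mean over the OECD student
    population is 0 and the standard deviation is 1, with each OECD
    country contributing equally (an illustrative sketch)."""
    # Weight each OECD student by 1 / (students from that country),
    # so every country counts equally in the mean and SD.
    counts = defaultdict(int)
    for c in country:
        if c in oecd_countries:
            counts[c] += 1
    pairs = [(x, 1.0 / counts[c]) for x, c in zip(wle, country)
             if c in oecd_countries]
    total_w = sum(w for _, w in pairs)
    mean = sum(x * w for x, w in pairs) / total_w
    var = sum(w * (x - mean) ** 2 for x, w in pairs) / total_w
    # All students (OECD and partner) are rescaled with the OECD stats.
    return [(x - mean) / sqrt(var) for x in wle]
```

Note that every student, including those from partner countries/economies, is rescaled with the OECD-based mean and standard deviation.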

Sequential codes were assigned to the different response categories of the questions in the sequence in which the latter appeared in the student, school or parent questionnaires. Where indicated in this section, these codes were inverted for the purpose of constructing indices or scales. Negative values for an index do not necessarily imply that students responded negatively to the underlying questions. A negative value merely indicates that a respondent answered less positively than other respondents did on average across OECD countries. Likewise, a positive value on an index indicates that a respondent answered more favourably, or more positively, on average, than other respondents in OECD countries did.

Terms enclosed in brackets < > in the following descriptions were replaced in the national versions of the student, school and parent questionnaires by the appropriate national equivalent. For example, the term <qualification at ISCED level 5A> was translated in the United States into “Bachelor’s degree, post-graduate certificate program, Master’s degree program or first professional degree program”. Similarly, the term <classes in the language of assessment> in Luxembourg was translated into “German classes” or “French classes”, depending on whether students received the German or French version of the assessment instruments.

In addition to the simple and scaled indices described in this annex, a number of variables from the questionnaires that were used in this volume correspond to single items. These non-recoded variables have the prefixes “ST”, “SC”, “PA”, “IC” and “WB” for questionnaire items in the student, school, parent, ICT and well-being questionnaires, respectively. All the context questionnaires, and the PISA international database including all variables, are available through www.oecd.org/pisa.

Student-level simple indices

Immigrant background

Information on the country of birth of the students and their parents was collected. Included in the database are three country-specific variables relating to the country of birth of the student, mother and father (ST019). The variables are binary and indicate whether the student, mother and father were born in the country of assessment or elsewhere. The index on immigrant background (IMMIG) is calculated from these variables, and has the following categories: (1) native students (those students who had at least one parent born in the country); (2) second-generation students (those born in the country of assessment but whose parents were born in another country); and (3) first-generation students (those students born outside the country of assessment and whose parents were also born in another country). Students with missing responses for either the student or for both parents were given missing values for this variable.
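The recoding just described can be sketched as follows. The function and flag names are illustrative, and the treatment of a single missing parent is simplified relative to the official derivation.

```python
def immigrant_background(student, mother, father):
    """Derive IMMIG from three binary country-of-birth flags based on
    ST019 (True = born in the country of assessment, None = missing).
    An illustrative sketch of the recoding rules described above."""
    # A missing student response, or missing responses for both
    # parents, yields a missing value.
    if student is None or (mother is None and father is None):
        return None
    if mother or father:
        return 1  # native: at least one parent born in the country
    if student:
        return 2  # second-generation: born in country, parents abroad
    return 3      # first-generation: student and parents born abroad
```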

Grade repetition

The grade repetition variable (REPEAT) was computed by recoding variables ST127Q01TA, ST127Q02TA and ST127Q03TA. REPEAT took the value of “1” if the student had repeated a grade in at least one ISCED level and the value of “0” if “no, never” was chosen at least once, provided that the student had not repeated a grade in any of the other ISCED levels. The index was assigned a missing value if none of the three categories were ticked for any of the three ISCED levels.
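A minimal sketch of this recoding, assuming each response is recorded as a string, or None when no category was ticked (the response wordings are assumptions):

```python
def grade_repetition(primary, lower_secondary, upper_secondary):
    """Compute REPEAT from ST127Q01TA-ST127Q03TA. Each argument is the
    response for one ISCED level -- e.g. "no, never" or "yes, once" --
    or None if no category was ticked at that level."""
    answers = (primary, lower_secondary, upper_secondary)
    if all(a is None for a in answers):
        return None  # nothing ticked at any ISCED level: missing
    if any(a is not None and a != "no, never" for a in answers):
        return 1     # repeated a grade at least once at some level
    return 0         # "no, never" ticked, no repetition reported
```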

Education expectations

Students’ responses to question ST225 regarding the level of education they expect to complete were used to identify those students who expected to complete tertiary education, defined under the International Standard Classification of Education 1997 (ISCED) as <ISCED level 5A> and/or <ISCED level 6> (theoretically oriented tertiary and post-graduate).

Skipping classes or days of school

Students’ responses to whether, in the two weeks prior to the PISA test, they had skipped classes (ST062Q02TA) or days of school (ST062Q01TA) at least once were used to derive an indicator of student truancy. The indicator takes a value of 0 if students reported that they had not skipped any class or whole day of school in the two weeks prior to the PISA test, and a value of 1 if students reported that they had skipped classes or days of school at least once in the same period.

Arriving late for school

Students responded to a question about whether and how frequently they had arrived late for school during the two weeks prior to the PISA test (ST062Q03TA). This variable was used to derive an indicator of lateness that takes a value of 0 if students reported that they had not arrived late for school in the two weeks prior to the PISA test, and takes a value of 1 if students reported that they had arrived late for school at least once in the same period.

Time spent online outside of school

In 51 of the 52 countries and economies that distributed the ICT questionnaire, PISA 2018 asked students how much time they spend using the Internet during a typical weekday (IC006) and weekend day (IC007) outside of school. These two questions were combined to calculate the amount of time students spend connected to the Internet during a typical week. For each response category, the midpoint of the interval was used (e.g. 15.5 minutes for the category “1-30 minutes per day”), and a value of 420 minutes was used for the category “More than 6 hours per day”. Five categories of Internet users were then created based on this indicator: “low Internet user” (0-9 hours per week); “moderate Internet user” (10-19 hours per week); “average Internet user” (20-29 hours per week); “high Internet user” (30-39 hours per week); and “heavy Internet user” (more than 40 hours per week).
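The derivation can be sketched as below. Only the values 15.5 and 420 are quoted in the text; the remaining category labels and midpoints, and the weighting of five weekdays plus two weekend days, are assumptions for illustration.

```python
# Assumed midpoints (minutes per day) for IC006/IC007 response
# categories; only 15.5 and 420 are quoted in the text above.
MIDPOINTS = {
    "No time": 0.0,
    "1-30 minutes per day": 15.5,
    "31-60 minutes per day": 45.5,
    "Between 1 hour and 2 hours per day": 90.0,
    "Between 2 hours and 4 hours per day": 180.0,
    "Between 4 hours and 6 hours per day": 300.0,
    "More than 6 hours per day": 420.0,
}

def internet_user_profile(weekday_cat, weekend_cat):
    """Combine IC006 (typical weekday) and IC007 (typical weekend day)
    into hours online per week, then classify the student into one of
    the five user categories described above."""
    minutes = 5 * MIDPOINTS[weekday_cat] + 2 * MIDPOINTS[weekend_cat]
    hours = minutes / 60.0
    if hours < 10:
        return "low Internet user"
    if hours < 20:
        return "moderate Internet user"
    if hours < 30:
        return "average Internet user"
    if hours < 40:
        return "high Internet user"
    return "heavy Internet user"
```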

Student-level scale indices

Adaptive instruction

The index of adaptive instruction (ADAPTIVITY) was constructed using students’ responses to a new question developed for PISA 2018 (ST212). Students reported how often ( “never or almost never”, “some lessons”, “many lessons”, “every lesson or almost every lesson”) the following things happened in language-of-instruction lessons: “The teacher adapts the lesson to my class’s needs and knowledge”; “The teacher provides individual help when a student has difficulties understanding a topic or task”; and “The teacher changes the structure of the lesson on a topic that most students find difficult to understand”. Positive values on this scale mean that students perceived their language-of-instruction teachers to be more adaptive than did the average student across OECD countries.

Attitudes towards competition

The index of attitudes towards competition (COMPETE) was constructed using students’ responses to a new question (ST181) over the extent they “strongly disagreed”, “disagreed”, “agreed” or “strongly agreed” with the following statements: “I enjoy working in situations involving competition with others”; “It is important for me to perform better than other people on a task”; and “I try harder when I’m in competition with other people”. Positive values on this scale mean that students expressed more favourable attitudes towards competition than did the average student across OECD countries.

Exposure to bullying

PISA 2018 asked (ST038) students how often ( “never or almost never”, “a few times a year”, “a few times a month”, “once a week or more”) during the 12 months prior to the PISA test they had the following experiences in school, including those that happen in social media: “Other students left me out of things on purpose”; “Other students made fun of me”; “I was threatened by other students”; “Other students took away or destroyed things that belong to me”; “I got hit or pushed around by other students”; and “Other students spread nasty rumours about me”. The first three statements were combined to construct the index of exposure to bullying (BEINGBULLIED). Positive values on this scale indicate that the student was more exposed to bullying at school than the average student in OECD countries; negative values on this scale indicate that the student was less exposed to bullying at school than the average student across OECD countries.

Fear of failure

Students in PISA 2018 were asked to report the extent to which they agree ( “strongly disagree”, “disagree”, “agree”, “strongly agree”) with the following statements (ST183): “When I am failing, I worry about what others think of me”; “When I am failing, I am afraid that I might not have enough talent”; and “When I am failing, this makes me doubt my plans for the future”. These statements were combined to create the index of fear of failure (GFOFAIL). Positive values in this index mean that the student expressed a greater fear of failure than did the average student across OECD countries.

Learning goals

Students in PISA 2018 were asked (ST208) to respond how true ( “not at all true of me”, “slightly true of me”, “moderately true of me”, “very true of me”, “extremely true of me”) the following statements are for them: “My goal is to learn as much as possible”; “My goal is to completely master the material presented in my classes”; and “My goal is to understand the content of my classes as thoroughly as possible”. These statements were combined to construct the index of learning goals (MASTGOAL). Positive values in the index indicate more ambitious learning goals than the average student across OECD countries.

Motivation to master tasks

PISA 2018 asked students (ST182) to report the extent to which they agree ( “strongly disagree”, “disagree”, “agree”, “strongly agree”) with the following statements about themselves: “I find satisfaction in working as hard as I can”; “Once I start a task, I persist until it is finished”; “Part of the enjoyment I get from doing things is when I improve on my past performance”; and “If I am not good at something, I would rather keep struggling to master it than move on to something I may be good at”. The first three statements were combined to create the index of motivation to master tasks (WORKMAST). Positive values in the index indicate greater motivation than the average student across OECD countries.

Meaning in life

PISA 2018 asked students (ST185) to report the extent to which they agree ( “strongly agree”, “agree”, “disagree”, “strongly disagree”) with the following statements: “My life has clear meaning or purpose”; “I have discovered a satisfactory meaning in life”; and “I have a clear sense of what gives meaning to my life”. These statements were combined to form the index of meaning in life (EUDMO). Positive values in the index indicate greater meaning in life than the average student across OECD countries.

Positive feelings

PISA 2018 asked students (ST186) to report how frequently ( “never”, “rarely”, “sometimes”, “always”) they feel happy, lively, proud, joyful, cheerful, scared, miserable, afraid and sad. Three of these positive feelings – happy, joyful and cheerful – were combined to create an index of positive feelings (SWBP). Positive values in this index mean that the student reported more positive feelings than the average student across OECD countries. An index of negative feelings was not created because of the low internal consistency of the index across PISA-participating countries.

Self-efficacy

PISA 2018 asked (ST188) students to report the extent to which they agree ( “strongly disagree”, “disagree”, “agree”, “strongly agree”) with the following statements about themselves: “I usually manage one way or another”; “I feel proud that I have accomplished things”; “I feel that I can handle many things at a time”; “My belief in myself gets me through hard times”; and “When I’m in a difficult situation, I can usually find my way out of it”. These statements were combined to create the index of self-efficacy (RESILIENCE). Positive values in this index mean that the student reported higher self-efficacy than did the average student across OECD countries.

Student competition

PISA 2018 asked (ST205) students how true ( “not at all true”, “slightly true”, “very true”, “extremely true”) the following statements about their school are: “Students seem to value competition”; “It seems that students are competing with each other”; “Students seem to share the feeling that competing with each other is important”; and “Students feel that they are being compared with others”. The first three statements were combined to create the index of student competition (PERCOMP). Positive values in this index mean that students perceived their peers to compete with each other to a greater extent than did the average student across OECD countries.

Student co-operation

PISA 2018 asked (ST206) students how true ( “not at all true”, “slightly true”, “very true”, “extremely true”) the following statements about their school are: “Students seem to value co-operation”; “It seems that students are co-operating with each other”; “Students seem to share the feeling that co-operating with each other is important”; and “Students feel that they are encouraged to co-operate with others”. The first three statements were combined to create the index of student co-operation (PERCOOP). Positive values in this index mean that students perceived their peers to co-operate to a greater extent than did the average student across OECD countries.

Teacher enthusiasm

PISA 2018 asked (ST213) students whether they agree ( “strongly agree”, “agree”, “disagree”, “strongly disagree”) with the following statements about the two language-of-instruction lessons they attended prior to sitting the PISA test: “It was clear to me that the teacher liked teaching us”; “The enthusiasm of the teacher inspired me”; “It was clear that the teacher likes to deal with the topic of the lesson”; and “The teacher showed enjoyment in teaching”. These statements were combined to create the index of teacher enthusiasm (TEACHINT). Positive values in this index mean that students perceived their language-of-instruction teachers to be more enthusiastic than did the average student across OECD countries.

Indices included in earlier assessments

Disciplinary climate

The index of disciplinary climate (DISCLIMA) was constructed using students’ responses to a trend question about how often ( “every lesson”, “most lessons”, “some lessons”, “never or hardly ever”) the following happened in their language-of-instruction lessons (ST097): “Students don’t listen to what the teacher says”; “There is noise and disorder”; “The teacher has to wait a long time for students to quiet down”; “Students cannot work well”; and “Students don’t start working for a long time after the lesson begins”. Positive values on this scale mean that the student enjoyed a better disciplinary climate in language-of-instruction lessons than the average student across OECD countries. Values in the index of disciplinary climate are directly comparable between PISA 2009 and PISA 2018 (see note 1 for more details).

Enjoyment of reading

The index of enjoyment of reading (JOYREAD) was constructed based on a trend question (ST160) from PISA 2009 (ID in 2009: ST24) asking students whether they agree ( “strongly disagree”, “disagree”, “agree”, “strongly agree”) with the following statements: “I read only if I have to”; “Reading is one of my favourite hobbies”; “I like talking about books with other people”; “For me, reading is a waste of time”; and “I read only to get information that I need”. Positive values on this scale mean that the student enjoyed reading to a greater extent than the average student across OECD countries. Scores of the index of enjoyment of reading are directly comparable between PISA 2009 and PISA 2018 (see note 1 for more details).

Parents’ emotional support

The index of parents’ emotional support (EMOSUPS) was constructed based on a trend question (ST123) asking students whether they agree ( “strongly disagree”, “disagree”, “agree”, “strongly agree”) with the following statements related to the academic year when they sat the PISA test: “My parents support my educational efforts and achievements”; “My parents support me when I am facing difficulties at school”; and “My parents encourage me to be confident”. Positive values on this scale mean that students perceived greater levels of emotional support from their parents than did the average student across OECD countries.

Sense of belonging

The index of sense of belonging (BELONG) was constructed using students’ responses to a trend question about their sense of belonging to school. Students were asked whether they agree ( “strongly disagree”, “disagree”, “agree”, “strongly agree”) with the following school-related statements (ST034): “I feel like an outsider (or left out of things) at school”; “I make friends easily at school”; “I feel like I belong at school”; “I feel awkward and out of place in my school”; “Other students seem to like me”; and “I feel lonely at school”. Positive values on this scale mean that students reported a greater sense of belonging at school than did the average student across OECD countries.

Teacher-directed instruction

The index of teacher-directed instruction (DIRINS) was constructed from students’ reports on how often ( “never or hardly ever”, “some lessons”, “most lessons”, “every lesson”) the following happened in their language-of-instruction lessons (ST102): “The teacher sets clear goals for our learning”; “The teacher asks questions to check whether we have understood what was taught”; “At the beginning of a lesson, the teacher presents a short summary of the previous lesson”; and “The teacher tells us what we have to learn”. Positive values on this scale mean that students perceived their teachers to use teacher-directed practices more frequently than did the average student across OECD countries.

Teacher feedback

The index of teacher feedback (PERFEED) was constructed using students’ responses to a trend question (ST104) about how often ( “never or almost never”, “some lessons”, “many lessons”, “every lesson or almost every lesson”) the following things happen in their language-of-instruction lessons: “The teacher gives me feedback on my strengths in this subject”; “The teacher tells me in which areas I can still improve”; and “The teacher tells me how I can improve my performance”. Positive values on this scale mean that students perceived their teachers to provide feedback more frequently than did the average student across OECD countries.

Teachers’ stimulation of reading engagement

The index of teachers’ stimulation of reading engagement (STIMREAD) was constructed based on a trend question (ST152) from PISA 2009 (ID in 2009: ST37) asking students how often ( “never or hardly ever”, “in some lessons”, “in most lessons”, “in all lessons”) the following occur in their language-of-instruction lessons: “The teacher encourages students to express their opinion about a text”; “The teacher helps students relate the stories they read to their lives”; “The teacher shows students how the information in texts builds on what they already know”; and “The teacher poses questions that motivate students to participate actively”. Positive values on this scale mean that the students perceived their teacher to provide greater stimulation than did the average student across OECD countries.

Teacher support

The index of teacher support (TEACHSUP) was constructed using students’ responses to a trend question (ST100) about how often ( “every lesson”, “most lessons”, “some lessons”, “never or hardly ever”) the following things happen in their language-of-instruction lessons: “The teacher shows an interest in every student’s learning”; “The teacher gives extra help when students need it”; “The teacher helps students with their learning”; and “The teacher continues teaching until the students understand”. Positive values on this scale mean that students perceived their teacher to support them more frequently than did the average student across OECD countries.

Value of school

The index of value of school (ATTLNACT) was constructed based on a trend question (ST036) asking students whether they agree ( “strongly disagree”, “disagree”, “agree”, “strongly agree”) with the following school-related statements: “Trying hard at school will help me get a good job”; “Trying hard at school will help me get into a good <college>”; and “Trying hard at school is important”. Positive values on this scale mean that the student valued schooling to a greater extent than the average student across OECD countries.

Scaling of indices related to the PISA index of economic, social and cultural status

The PISA index of economic, social and cultural status (ESCS) was derived, as in previous cycles, from three variables related to family background: parents’ highest level of education (PARED), parents’ highest occupational status (HISEI), and home possessions (HOMEPOS), including books in the home.

Parents’ highest level of education

Students’ responses to questions ST005, ST006, ST007 and ST008 regarding their parents’ education were classified using ISCED 1997 (OECD, 1999[4]). Indices on parental education were constructed by recoding educational qualifications into the following categories: (0) None, (1) <ISCED level 1> (primary education), (2) <ISCED level 2> (lower secondary), (3) <ISCED level 3B or 3C> (vocational/pre-vocational upper secondary), (4) <ISCED level 3A> (general upper secondary) and/or <ISCED level 4> (non-tertiary post-secondary), (5) <ISCED level 5B> (vocational tertiary) and (6) <ISCED level 5A> and/or <ISCED level 6> (theoretically oriented tertiary and post-graduate). Indices with these categories were provided for a student’s mother (MISCED) and father (FISCED), and the index of highest education level of parents (HISCED) corresponded to the higher ISCED level of either parent. The index of highest education level of parents was also recoded into estimated number of years of schooling (PARED). In PISA 2018, to avoid issues related to the misreporting of parental education by students, students’ answers about post-secondary qualifications were considered only for those students who reported their parents’ highest level of schooling to be at least lower secondary education. The conversion from ISCED levels to years of education is common to all countries. This international conversion was determined by using the modal years of education across countries for each ISCED level. The correspondence is available in the PISA 2018 Technical Report (OECD, forthcoming[2]).
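The HISCED/PARED recoding can be sketched as follows. The years-of-schooling mapping shown here is illustrative only; the official conversion table is given in the PISA 2018 Technical Report.

```python
# ISCED category (0-6, as defined above) -> estimated years of
# schooling. These values are assumptions for illustration; the
# official mapping uses modal years of education across countries.
YEARS_OF_SCHOOLING = {0: 3.0, 1: 6.0, 2: 9.0, 3: 12.0,
                      4: 12.0, 5: 14.0, 6: 16.0}

def parental_education(misced, fisced):
    """Return (HISCED, PARED): HISCED is the higher ISCED category of
    either parent, or the only available parent's category; PARED is
    its conversion into years of schooling. None marks missing."""
    levels = [l for l in (misced, fisced) if l is not None]
    if not levels:
        return None, None
    hisced = max(levels)
    return hisced, YEARS_OF_SCHOOLING[hisced]
```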

Parents’ highest occupational status

Occupational data for both the student’s father and the student’s mother were obtained from responses to open-ended questions. The responses were coded to four-digit ISCO codes (ILO, 2007) and then mapped to the international socio-economic index of occupational status (ISEI) (Ganzeboom and Treiman, 2003[5]). In PISA 2018, as in PISA 2015, the new ISCO and ISEI in their 2008 version were used rather than the 1988 versions that had been applied in the previous four cycles (Ganzeboom, 2010[6]). Three indices were calculated based on this information: father’s occupational status (BFMJ2); mother’s occupational status (BMMJ1); and the highest occupational status of parents (HISEI), which corresponds to the higher ISEI score of either parent or to the only available parent’s ISEI score. For all three indices, higher ISEI scores indicate higher levels of occupational status. In PISA 2018, in order to reduce missing values, an ISEI value of 17 (equivalent to the ISEI value for ISCO code 9000, corresponding to the major group “Elementary Occupations”) was attributed to pseudo-ISCO codes 9701, 9702 and 9703 ( “Doing housework, bringing up children”, “Learning, studying”, “Retired, pensioner, on unemployment benefits”).
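A sketch of the HISEI derivation, including the pseudo-ISCO recoding described above; the function and variable names are illustrative.

```python
# Pseudo-ISCO codes attributed ISEI = 17, the value for ISCO code 9000
# (major group "Elementary Occupations"), as described above.
PSEUDO_ISCO_TO_ISEI = {9701: 17, 9702: 17, 9703: 17}

def highest_occupational_status(bfmj2, bmmj1):
    """HISEI: the higher ISEI score of either parent (BFMJ2 = father,
    BMMJ1 = mother), or the only available parent's score; None marks
    a missing value."""
    scores = [s for s in (bfmj2, bmmj1) if s is not None]
    return max(scores) if scores else None
```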

Household possessions

In PISA 2018, students reported the availability of 16 household items at home (ST011), including three country-specific items that were seen as appropriate measures of family wealth within the country’s context. In addition, students reported the number of possessions and books at home (ST012, ST013). HOMEPOS is a summary index of all household and possession items (ST011, ST012 and ST013).

Computation of ESCS

For the purpose of computing the PISA index of economic, social and cultural status (ESCS), values for students with missing PARED, HISEI or HOMEPOS were imputed with predicted values plus a random component based on a regression on the other two variables. If there were missing data on more than one of the three variables, ESCS was not computed and a missing value was assigned for ESCS.
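The imputation step can be sketched as follows. This is a simplified, single-predictor version of the procedure (PISA regresses each missing component on the other two jointly), and all names are illustrative.

```python
import random

def impute_with_noise(y_values, x_values, rng=None):
    """Impute missing entries of y (None) with the prediction from a
    linear regression of y on x, plus a random residual component.
    A simplified sketch of the ESCS imputation described above."""
    rng = rng or random.Random(0)
    pairs = [(x, y) for x, y in zip(x_values, y_values)
             if x is not None and y is not None]
    n = len(pairs)
    # Ordinary least-squares fit on the complete cases.
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    slope = (sum((x - mx) * (y - my) for x, y in pairs)
             / sum((x - mx) ** 2 for x, _ in pairs))
    intercept = my - slope * mx
    residuals = [y - (slope * x + intercept) for x, y in pairs]
    sd = (sum(r * r for r in residuals) / max(n - 2, 1)) ** 0.5
    out = []
    for x, y in zip(x_values, y_values):
        if y is not None:
            out.append(y)                 # observed: keep as-is
        elif x is None:
            out.append(None)              # more than one missing: not computed
        else:
            out.append(slope * x + intercept + rng.gauss(0.0, sd))
    return out
```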

In previous cycles, the PISA index of economic, social and cultural status was derived from a principal component analysis of standardised variables (each variable having an OECD mean of zero and a standard deviation of one), taking the factor scores for the first principal component as measures of the index. In PISA 2018, ESCS was computed by attributing equal weight to the three standardised components. As in PISA 2015, the three components were standardised across all countries and economies (both OECD and partner countries/economies), with each country/economy contributing equally (in cycles prior to 2015, the standardisation and principal component analysis were based on OECD countries only). As in every previous cycle, the final ESCS variable was transformed so that 0 is the score of an average OECD student and 1 is the standard deviation across equally weighted OECD countries.
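For a single pooled sample, the PISA 2018 computation can be sketched as the equally weighted mean of three z-standardised components, followed by a final rescaling. This is a simplification: the official procedure standardises with equally weighted countries/economies and rescales on OECD countries only.

```python
from math import sqrt

def zscores(values):
    """Standardise a list of values to mean 0, standard deviation 1."""
    m = sum(values) / len(values)
    sd = sqrt(sum((v - m) ** 2 for v in values) / len(values))
    return [(v - m) / sd for v in values]

def escs(pared, hisei, homepos):
    """ESCS as the equally weighted mean of the three standardised
    components (assumed complete or already imputed), then a final
    transformation back to mean 0 and standard deviation 1."""
    z = [zscores(pared), zscores(hisei), zscores(homepos)]
    combined = [(a + b + c) / 3.0 for a, b, c in zip(*z)]
    return zscores(combined)  # final rescaling to mean 0, sd 1
```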

School-level simple indices

School type

Schools are classified as either public or private according to whether a private entity or a public agency has the ultimate power to make decisions concerning the school’s affairs (question SC013). Public schools are managed directly or indirectly by a public education authority, government agency, or governing board appointed by government or elected by public franchise. Private schools are managed directly or indirectly by a non-government organisation, such as a church, trade union, business or other private institution. In some countries and economies, such as Ireland, the information from SC013 is combined with administrative data to determine whether the school is privately or publicly managed.

Socio-economic profile of the schools

Advantaged and disadvantaged schools are defined in terms of the socio-economic profile of schools. All schools in each PISA-participating education system are ranked according to their average PISA index of economic, social and cultural status (ESCS) and then divided into four groups with approximately an equal number of students (quarters). Schools in the bottom quarter are referred to as “socio-economically disadvantaged schools”; and schools in the top quarter are referred to as “socio-economically advantaged schools”.
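The ranking step can be sketched as below. This is a simplification in one respect: the quarters here contain equal numbers of *schools*, whereas PISA forms groups with approximately equal numbers of *students*.

```python
def classify_schools(mean_escs):
    """Split schools into quarters by their mean ESCS (mean_escs maps
    school id -> mean ESCS). Bottom quarter: "disadvantaged"; top
    quarter: "advantaged". An illustrative sketch."""
    ranked = sorted(mean_escs, key=mean_escs.get)  # lowest ESCS first
    n = len(ranked)
    names = ("disadvantaged", "second quarter",
             "third quarter", "advantaged")
    return {school: names[min(4 * i // n, 3)]
            for i, school in enumerate(ranked)}
```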

School-level scale indices

Indices included in earlier assessments

Shortage of educational staff

As in PISA 2012 and 2015, PISA 2018 included an eight-item question (SC017) about school resources, measuring school principals’ perceptions of potential factors hindering instruction at school ( “Is your school’s capacity to provide instruction hindered by any of the following issues?”). The four response categories were “not at all”, “very little”, “to some extent” and “a lot”. A similar question was used in earlier cycles, but the items were reduced in number and reworded for PISA 2012, focusing on two derived variables. The index of staff shortage (STAFFSHORT) was derived from the first four items: a lack of teaching staff; inadequate or poorly qualified teaching staff; a lack of assisting staff; and inadequate or poorly qualified assisting staff. Positive values in this index mean that principals viewed the amount and/or quality of the human resources in their schools as an obstacle to providing instruction to a greater extent than did principals in OECD countries on average.

Teacher behaviour hindering learning

The index of teacher behaviour hindering learning (TEACHBEHA) was constructed using school principals’ responses to a trend question (SC061) about the extent to which ( “not at all”, “very little”, “to some extent”, “a lot”) they think that student learning in their schools is hindered by such factors as “Teachers not meeting individual students’ needs”; “Teacher absenteeism”; “School staff resisting change”; “Teachers being too strict with students”; and “Teachers not being well-prepared for classes”. Positive values reflect principals’ perceptions that these teacher-related behaviours hinder learning to a greater extent; negative values indicate that principals believed that these teacher-related behaviours hinder learning to a lesser extent, compared to the OECD average.

Parent-level scale indices

Indices included in earlier assessments

Parents’ perceived school quality

The index of parents’ perceived school quality (PQSCHOOL) was constructed using parents’ responses to the trend question (PA007) about the extent to which they agree ( “strongly disagree”, “disagree”, “agree”, “strongly agree”) with the following statements: “Most of my child’s school teachers seem competent and dedicated”; “Standards of achievement are high in my child’s school”; “I am happy with the content taught and the instructional methods used in my child’s school”; “I am satisfied with the disciplinary atmosphere in my child’s school”; “My child’s progress is carefully monitored by the school”; “My child’s school provides regular and useful information on my child’s progress”; and “My child’s school does a good job in educating students”. Positive values indicate that parents perceived their child’s school to be of higher quality, and negative values of lower quality, than did the average parent across OECD countries.

School policies for parental involvement

The index of school policies for parental involvement (PASCHPOL) was constructed using parents’ responses to the trend question (PA007) about the extent to which they agree (“strongly disagree”, “disagree”, “agree”, “strongly agree”) with the following statements: “My child’s school provides an inviting atmosphere for parents to get involved”; “My child’s school provides effective communication between the school and families”; “My child’s school involves parents in the school’s decision-making process”; “My child’s school offers parent education”; “My child’s school informs families about how to help students with homework and other school-related activities”; and “My child’s school co-operates with <community services> to strengthen school programmes and student development”. Positive values indicate that parents perceived these school policies for parental involvement to exist to a greater extent, and negative values to a lesser extent, than the OECD average.

Cross-country comparability of scaled indices

While the forthcoming PISA 2018 Technical Report (OECD, forthcoming[2]) will explain in detail the scaling procedures and the construct validation of all context-questionnaire data, this section presents a summary of the analyses carried out to validate the cross-country comparability of the main scaled indices used in this volume. PISA 2018 used two approaches to examine the comparability of scaled indices across school systems: the internal consistency of the indices and the invariance of item parameters. Based on these two approaches, all indices examined in this volume met the reporting criteria.

Internal consistency refers to the extent to which the items that make up an index are inter-related. Cronbach’s Alpha was used to check the internal consistency of each scale within each country/economy and to compare it across countries/economies. Cronbach’s Alpha ranges from 0 to 1, with higher values indicating higher internal consistency. High and similar values across countries/economies indicate that the scale measured reliably in all of them. Commonly accepted cut-off values are 0.9 for excellent, 0.8 for good and 0.7 for acceptable internal consistency. In PISA 2018, indices were always omitted for countries and economies with values below 0.6, and for some countries and economies with values between 0.6 and 0.7.
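As a rough illustration of the statistic, Cronbach’s Alpha can be computed from a respondents-by-items matrix of scored responses. The function below is a minimal sketch, not part of PISA’s actual processing pipeline; the function name and toy data are invented for illustration.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's Alpha for a (respondents x items) matrix of scored responses.

    alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Toy example: two items that always move together are perfectly consistent.
responses = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])
print(round(cronbach_alpha(responses), 3))  # -> 1.0
```

Inter-related items inflate the variance of the total score relative to the sum of the item variances, which is what pushes Alpha towards 1.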

Table III.A1.1, available online, presents the Cronbach’s Alpha for the main scaled indices in this volume. Based on these results, the following indices were omitted from individual countries/economies:

  • Exposure to bullying (BEINGBULLIED): Korea

  • Teacher support (TEACHSUP): Ukraine

  • Positive feelings (SWBP): Italy, Morocco and Viet Nam

  • Self-efficacy (RESILIENCE): Viet Nam

PISA 2018 also examined the cross-country comparability of scaled indices through the invariance of item parameters. The idea was to test whether the item parameters of an index could be assumed to be the same (invariant) across groups of participating countries and language groups. In a first step, groups were defined based on samples of at least 300 students responding to the same language-version questionnaire in a country. In a second step, international item and student parameters were estimated based on students across all groups. In a third step, the root mean square deviance (RMSD) item-fit statistic was calculated for each group and item. Values close to zero signal good item fit, indicating that the international model describes student responses within individual groups accurately. Any group with a value above 0.3 on an item was flagged, and a group-specific parameter was estimated for that item. Steps 2 and 3 were then repeated until all items exhibited RMSD values below 0.3. The RMSD values will be reported in the forthcoming PISA 2018 Technical Report. Amongst the main indices examined in this volume, some needed just one round to ensure that all items exhibited acceptable levels of RMSD, whereas other indices needed several iterations:

  • One round: exposure to bullying, teacher support, teacher feedback, student co-operation, meaning in life, positive feelings and fear of failure.

  • Several rounds: disciplinary climate (2 rounds), teacher enthusiasm (2 rounds), teacher behaviour hindering learning (4 rounds), student competition (2 rounds), sense of belonging (2 rounds) and self-efficacy (2 rounds).
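The flagging step above can be illustrated with a stylised RMSD computation. This is a deliberately simplified sketch using a one-parameter (Rasch) model with known parameters and simulated data; PISA’s operational procedure uses its actual IRT models and re-estimates parameters at each iteration, and the helper names here are invented.

```python
import numpy as np

RMSD_CUTOFF = 0.3  # flagging threshold used in PISA 2018

def rmsd_item_fit(theta, responses, b, n_bins=10):
    """Simplified RMSD item-fit statistic for a dichotomous Rasch item.

    Compares the observed proportion correct with the model-implied
    probability within ability (theta) bins; values near 0 mean the
    international parameters describe the group's responses well.
    """
    edges = np.quantile(theta, np.linspace(0, 1, n_bins + 1))
    bins = np.clip(np.searchsorted(edges, theta, side="right") - 1, 0, n_bins - 1)
    sq_sum, n_total = 0.0, 0
    for g in range(n_bins):
        mask = bins == g
        if not mask.any():
            continue
        p_model = 1.0 / (1.0 + np.exp(-(theta[mask].mean() - b)))  # Rasch curve
        p_obs = responses[mask].mean()                             # observed rate
        sq_sum += mask.sum() * (p_obs - p_model) ** 2
        n_total += mask.sum()
    return float(np.sqrt(sq_sum / n_total))

rng = np.random.default_rng(0)
theta = rng.normal(size=20_000)
b_international = 0.0

# A group whose responses follow the international parameters fits well...
fit = rng.random(20_000) < 1 / (1 + np.exp(-(theta - b_international)))
# ...while a group for which the item is much harder (b = 2) gets flagged
# and would receive a group-specific item parameter.
misfit = rng.random(20_000) < 1 / (1 + np.exp(-(theta - 2.0)))

print(rmsd_item_fit(theta, fit, b_international))     # well below 0.3
print(rmsd_item_fit(theta, misfit, b_international))  # above 0.3: flagged
```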

In addition to country-specific omissions, some indices were also omitted for all countries. With regard to this volume, the original plan was to produce an index of negative feelings, in the same way that an index of positive feelings was created (which includes the items “happy”, “joyful” and “cheerful”; see Chapter 12). However, an index of negative feelings was omitted because it showed low internal consistency and low invariance of item parameters. Consequently, negative feelings are analysed individually in the report.

Tables available online

https://doi.org/10.1787/888934030838

  • Table III.A1.1 Internal consistency of the main scaled indices

References

[6] Ganzeboom, H. (2010), “A new international socio-economic index (ISEI) of occupational status for the international standard classification of occupation 2008 (ISCO-08) constructed with data from the ISSP 2002-2007”, Paper presented at Annual Conference of International Social Survey Programme, Lisbon, Portugal, https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.627.203&rep=rep1&type=pdf (accessed on 28 October 2019).

[5] Ganzeboom, H. and D. Treiman (2003), “Three internationally standardised measures for comparative research on occupational status”, in Hoffmeyer-Zlotnik, J. and C. Wolf (eds.), Advances in Cross-National Comparison, Springer US, Boston, MA, http://dx.doi.org/10.1007/978-1-4419-9186-7_9.

[1] OECD (2019), PISA 2018 Assessment and Analytical Framework, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/b25efab8-en.

[4] OECD (1999), Classifying Educational Programmes: Manual for ISCED-97 Implementation in OECD Countries, OECD Publishing, Paris, http://www.oecd.org/education/1841854.pdf (accessed on 28 October 2019).

[2] OECD (forthcoming), PISA 2018 Technical Report, OECD Publishing, Paris.

[3] Warm, T. (1989), “Weighted likelihood estimation of ability in item response theory”, Psychometrika, Vol. 54/3, pp. 427-450, http://dx.doi.org/10.1007/BF02294627.

Note

← 1. PISA expert groups identified a few indices that should be scaled to make index values directly comparable between PISA 2009 and PISA 2018. These indices include DISCLIMA, JOYREAD and JOYREADP. For these trend indices, a common calibration linking procedure was used. Countries and economies that participated in both PISA 2009 and PISA 2018 contributed both samples to the calibration of item parameters, and each country/economy contributed equally to the estimation in each cycle. Trend indices were then equated so that, across OECD countries, the mean and standard deviation of the rescaled PISA 2009 estimates matched those of the original estimates included in the PISA 2009 database. Trend indices are therefore reported on the same scale as in PISA 2009, so that values can be directly compared with those in the PISA 2009 database.
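The mean/standard-deviation matching described in this note amounts to a linear transformation of the newly calibrated scores. The sketch below illustrates that step only; the function name and toy values are invented, and the operational procedure additionally weights each OECD country equally.

```python
import numpy as np

def equate_linear(new_scores, ref_mean, ref_sd):
    """Linearly transform newly calibrated scores so their mean and
    standard deviation match the reference (original-scale) values."""
    new_scores = np.asarray(new_scores, dtype=float)
    a = ref_sd / new_scores.std(ddof=1)    # slope: match the SD
    b = ref_mean - a * new_scores.mean()   # intercept: match the mean
    return a * new_scores + b

# Toy example: rescaled 2009 estimates mapped back onto the metric of the
# original 2009 database (here, mean 0 and SD 1 across OECD countries).
rescaled = np.array([-0.8, -0.1, 0.3, 1.1])
equated = equate_linear(rescaled, ref_mean=0.0, ref_sd=1.0)
print(round(equated.mean(), 6), round(equated.std(ddof=1), 6))  # -> 0.0 1.0
```

Because the transformation is linear, comparisons between students (differences and rankings) are unchanged; only the metric on which the index is reported moves.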

Metadata, Legal and Rights

This document, as well as any data and map included herein, are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. Extracts from publications may be subject to additional disclaimers, which are set out in the complete version of the publication, available at the link provided.

https://doi.org/10.1787/acd78851-en

© OECD 2019

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at http://www.oecd.org/termsandconditions.