This annex discusses the PISA target population and the procedures used to select the sample that represented the target population. The information presented below is, for the most part, a summary of the information presented in Annex A2 of PISA 2018 Results (Volume I): What Students Know and Can Do (OECD, 2019[1]); the reader is invited to refer to that volume for more details. This annex also includes information specific to the financial literacy sample.
PISA 2018 Results (Volume V)
Annex A2. The PISA target population, the PISA samples and the definition of schools
Who is the PISA target population?
PISA 2018 assessed the cumulative outcomes of education and learning at a point at which most young people are still enrolled in formal education – when they are 15 years old.
Any international survey of education must guarantee the comparability of its target population across nations. One way to do this is to assess students at the same grade level. However, differences between countries in the nature and extent of pre-primary education and care, the age at entry into formal schooling, and the institutional structure of education systems do not allow for a definition of internationally comparable grade levels.
Other international assessments have defined their target population by the grade level that provides maximum coverage of a particular age cohort. However, this method is particularly sensitive to the distribution of students across age and grade levels; small changes in this distribution can lead to the selection of different target grades, even within the same country over different PISA cycles. There also may be differences across countries in whether students who are older or younger than the desired age cohort are represented in the modal grade, further rendering such grade level-based samples difficult to compare.
To overcome these problems, PISA uses an age-based definition of its target population, one that is not tied to the institutional structures of national education systems. PISA assesses students who were aged between 15 years and 3 (complete) months and 16 years and 2 (complete) months1 at the beginning of the assessment period, plus or minus an allowed 1-month variation, and who were enrolled in an educational institution2 at grade 7 or higher.3 All students who met these criteria were eligible to sit the PISA assessment, regardless of the type of educational institution in which they were enrolled and whether they were enrolled in full-time or part-time education. This also allows PISA to evaluate students shortly before they are faced with major life choices, such as whether to continue with education or enter the workforce.
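As an illustration only, the age window above can be expressed as a simple month-count check. The function name and the month-level granularity are assumptions for this sketch; the actual PISA rule counts complete months from exact birth dates.

```python
def is_eligible(birth_year, birth_month, assess_year, assess_month):
    """Rough month-level check of the PISA age window: 15 years and
    3 complete months to 16 years and 2 complete months at the start
    of the assessment period, plus or minus the allowed 1-month
    variation. This sketch ignores days within the month."""
    age_in_months = (assess_year - birth_year) * 12 + (assess_month - birth_month)
    lower = (15 * 12 + 3) - 1   # 15 years 3 months, minus 1-month tolerance
    upper = (16 * 12 + 2) + 1   # 16 years 2 months, plus 1-month tolerance
    return lower <= age_in_months <= upper
```

Consistent with note 1, for an assessment conducted in April 2018 (as in most countries) every student born in 2002 falls inside this window, while students born in December 2001 or March 2003 fall outside it.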
Hence, PISA makes statements about the knowledge and skills of a group of individuals who were born within a comparable reference period, but who may have undergone different educational experiences both in and outside of school. These students may be distributed over different ranges of grades (both in terms of the specific grade levels and the spread in grade levels) in different countries, or in different tracks or streams within countries. It is important to consider these differences when comparing PISA results across countries. In addition, differences in performance observed when students are 15 may disappear later on if students’ experiences in education converge over time.
If a country’s mean scores in reading, mathematics, science or financial literacy are significantly higher than those of another country, it cannot automatically be inferred that schools or particular parts of the education system in the first country are more effective than those in the second. However, one can legitimately conclude that it is the cumulative impact of learning experiences in the first country, starting in early childhood and up to the age of 15, and including all experiences, whether they be at school, home or elsewhere, that have resulted in the better outcomes of the first country in the subjects that PISA assesses.4
The PISA target population does not include residents of a country who attend school in another country. It does, however, include foreign nationals who attend school in the country of assessment.
To accommodate countries that requested grade-based results for the purpose of national analyses, PISA 2018 provided a sampling option to supplement age-based sampling with grade-based sampling.
How were students chosen?
The accuracy of the results from any survey depends on the quality of the information drawn from those surveyed as well as on the sampling procedures. Quality standards, procedures, instruments and verification mechanisms were developed for PISA that ensured that national samples yielded comparable data and that the results could be compared across countries with confidence. Experts from the PISA Consortium selected the samples for most participating countries/economies and monitored the sample-selection process closely in those countries that selected their own samples.
Most PISA samples were designed as two-stage stratified samples.5 The first stage sampled schools in which 15-year-old students may be enrolled. Schools were sampled systematically with probabilities proportional to the estimated size of their (eligible) 15-year-old population. At least 150 schools6 were selected in each country, although the requirements for national analyses often demanded a larger sample. Replacement schools for each sampled school were simultaneously identified, in case an originally sampled school chose not to participate in PISA 2018.
The second stage of the selection process sampled students within sampled schools. Once schools were selected, a list of each sampled school’s 15-year-old students was prepared. From this list, 42 students were then selected with equal probability (all 15-year-old students were selected if fewer than 42 were enrolled). The number of students who were to be sampled in a school could deviate from 42 but could not fall below 20.
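The two sampling stages described above can be sketched as follows. This is a simplified illustration, not the operational PISA procedure: it omits explicit stratification, the handling of certainty selections (schools whose size exceeds the sampling interval may be hit more than once here), and replacement-school identification, and all names are hypothetical.

```python
import random

def pps_systematic(schools, n_sample, seed=1):
    """First stage: systematic sampling of schools with probability
    proportional to their estimated number of eligible 15-year-olds
    (the measure of size, MOS). `schools` is a list of (name, mos)."""
    random.seed(seed)
    total = sum(mos for _, mos in schools)
    interval = total / n_sample              # sampling interval on the MOS scale
    start = random.uniform(0, interval)      # random start within the first interval
    points = [start + k * interval for k in range(n_sample)]
    selected, cum, i = [], 0.0, 0
    for name, mos in schools:
        cum += mos
        while i < len(points) and points[i] <= cum:
            selected.append(name)            # a very large school can be hit twice
            i += 1
    return selected

def sample_students(roster, n=42):
    """Second stage: equal-probability sample of up to n students within
    a sampled school; all students are taken when fewer than n are enrolled."""
    if len(roster) <= n:
        return list(roster)
    return random.sample(roster, n)
```

Because the selection points are equally spaced over the cumulative measure of size, a school's chance of selection is proportional to its estimated enrolment of eligible students, which is what "probabilities proportional to size" means in the text above.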
Data-quality standards in PISA required minimum participation rates for schools as well as for students. These standards were established to minimise the potential for bias resulting from non-response. Indeed, it was likely that any bias resulting from non-response would be negligible – i.e. typically smaller than the sampling error – in countries that met these standards.
At least 85% of the schools initially selected to take part in the PISA assessment were required to agree to conduct the test. Where the initial response rate of schools was between 65% and 85%, however, an acceptable school-response rate could still be achieved through the use of replacement schools. Inherent in this procedure was a risk of introducing bias, if replacement schools differed from initially sampled schools along dimensions other than those considered for sampling. Participating countries were therefore encouraged to persuade as many of the schools in the original sample as possible to participate.
Schools with a student participation rate of between 25% and 50% were not considered to be participating schools, but data (from both the cognitive assessment and questionnaire) from these schools were included in the database and contributed to the various estimates. Data from schools with a student participation rate of less than 25% were excluded from the database.
In PISA 2018, two countries that participated in the financial literacy assessment – Latvia (82%) and the United States (65%) – did not meet the 85% threshold, but met the 65% threshold, amongst schools initially selected to take part in the PISA assessment. Upon replacement, the United States (76%) still failed to reach an acceptable participation rate.7 Amongst the schools initially selected before replacement, the Netherlands (61%) did not meet the 65% school response-rate threshold, but it reached a response rate of 87% upon replacement. However, these were not considered to be major issues as, for each of these countries and economies, additional non-response analyses showed that there were limited differences between schools that did participate and the full set of schools originally drawn in the sample.8 Data from these jurisdictions were hence considered to be largely comparable with, and were therefore reported together with, data from other countries/economies.
PISA 2018 also required that at least 80% of the students chosen within participating schools participated themselves. This threshold was calculated at the national level and did not have to be met in each participating school. Follow-up sessions were required in schools where too few students had participated in the original assessment sessions. Student-participation rates were calculated over all original schools; and also over all schools, whether original or replacement schools. Students who participated in either the original or in any follow-up assessment sessions were counted in these participation rates; those who attended only the questionnaire session were included in the international database and contributed to the statistics presented in this publication if they provided at least a description of their father’s or mother’s occupation.
This 80% threshold was met in every country/economy except Portugal, where only 76% of sampled students actually participated. Such a high level of student non-response could bias the results, e.g. if the students who did not participate were more likely to be low performers. This was indeed the case in Portugal; however, a non-response analysis based on data from a national mathematics assessment showed that the resulting upward bias in Portugal's overall results was likely small enough to preserve comparability over time and with other countries. Data from Portugal were therefore reported along with data from the countries/economies that met the 80% student-participation threshold.
Table I.A2.3, available on line, shows the response rate for students and schools, before and after replacement.
What proportion of 15-year-olds does PISA represent?
All countries and economies attempted to maximise the coverage of 15-year-olds enrolled in education in their national samples, including students enrolled in special-education institutions.
The sampling standards used in PISA permitted countries to exclude up to a total of 5% of the relevant population (i.e. 15-year-old students enrolled in school at grade 7 or higher), either by excluding schools or by excluding students within schools. Exclusions within this limit include both school-level and within-school exclusions:
at the school level:
schools that were geographically inaccessible or where the administration of the PISA assessment was not considered feasible
schools that provided teaching only for students in the categories defined under “within-school exclusions”, such as schools for the blind.
The percentage of 15-year-olds enrolled in such schools had to be less than 2.5% of the nationally desired target population (0.5% maximum for the former group and 2% maximum for the latter group). The magnitude, nature and justification of school-level exclusions are documented in the PISA 2018 Technical Report (OECD, forthcoming[2]).
at the student level:
students with an intellectual disability, i.e. a mental or emotional disability resulting in the student being so cognitively delayed that he/she could not perform in the PISA testing environment
students with a functional disability, i.e. a moderate to severe permanent physical disability resulting in the student being unable to perform in the PISA testing environment
students with limited assessment-language proficiency. These students were unable to read or speak any of the languages of assessment in the country at a sufficient level and unable to overcome such a language barrier in the PISA testing environment, and were typically students who had received less than one year of instruction in the language of assessment
other exclusions, a category defined by the PISA national centres in individual participating countries and approved by the PISA international consortium
students taught in a language of instruction for the major domain for which no materials were available.
Students could not be excluded solely because of low proficiency or common disciplinary problems. The percentage of 15-year-olds excluded within schools had to be less than 2.5% of the national desired target population.
The only countries that participated in the PISA 2018 financial literacy assessment that did not meet this 5% standard were Canada (6.87%),9 the Netherlands (6.24%), Australia (5.72%) and Estonia (5.03%) (Table I.A2.1a, available on line). When language exclusions10 were accounted for (i.e. removed from the overall exclusion rate), Estonia no longer had an exclusion rate greater than 5%. Although Australia and Canada exceeded the 5% limit, their data were deemed acceptable because their exclusion rates have consistently been above 5% across cycles. This reason was accepted by a data-adjudication panel as allowing for the reliable comparison of PISA results across countries and across time; the data from these countries were therefore reported together with data from other countries/economies. More details can be found in the PISA 2018 Technical Report (OECD, forthcoming[2]).
However, in the Netherlands, there was a marked increase in students who were excluded within schools due to intellectual or functional disabilities. Moreover, a large proportion of students in the Netherlands was not excluded but assigned to UH (une heure) booklets, which were intended for students with special education needs (Table IV.A2.1). As these booklets did not cover the domain of financial literacy, the effective exclusion rate for the Netherlands in financial literacy was roughly 20%. This resulted in a strong upward bias in the country mean and other population statistics in that domain. Data from the Netherlands in financial literacy were not comparable with data from other education systems; but data from the Netherlands in the core PISA subjects were still deemed to be largely comparable. Recourse was made to the UH booklet in only four other participating countries and economies (the Canadian provinces, Finland, the Slovak Republic and the United States). In each of these countries/economies, less than 4% of the student sample were presented with this booklet and not the financial literacy booklet. The data-adjudication panel did not judge this to significantly affect the comparison of these countries’/economies’ results.
Table I.A2.1a describes the target population of the countries participating in PISA 2018. Further information on the target population and the implementation of PISA sampling standards can be found in the PISA 2018 Technical Report (OECD, forthcoming[2]).
The high level of coverage contributes to the comparability of the assessment results. For example, even assuming that the excluded students would have systematically scored worse than those who participated, and that this relationship is moderately strong, an exclusion rate on the order of 5% would likely lead to an overestimation of national mean scores of less than 5 score points on the PISA scale (where the standard deviation is 100 score points).11
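Note 11 spells out the model behind this claim: student performance and the latent propensity to be excluded are assumed to follow a bivariate normal distribution, with the lowest-propensity tail removed at the exclusion rate. Under that assumption, the upward bias in the observed mean is the inverse Mills ratio at the exclusion cut-off, scaled by the correlation and the standard deviation. A minimal sketch of this calculation (the function name is an assumption):

```python
from statistics import NormalDist

def exclusion_bias(rho, q, sd=100.0):
    """Upward bias in the observed mean score when a fraction q of
    students (the tail of a latent exclusion propensity correlated
    rho with performance) is excluded, under a bivariate normal model
    with performance standard deviation sd."""
    nd = NormalDist()
    cutoff = nd.inv_cdf(q)                    # propensity cut-off for exclusion rate q
    return rho * sd * nd.pdf(cutoff) / (1 - q)  # rho * sd * inverse Mills ratio
```

With a correlation of 0.3 and a 5% exclusion rate, this yields roughly 3 score points; with a correlation of 0.5, roughly 5 score points, matching the figures given in note 11.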
Definition of schools
In some countries, subunits within schools were sampled instead of schools, which may affect the estimate of the between-school variance. In the Netherlands, locations were listed as sampling units. In Australia, each campus of a multi-campus school was sampled independently. Some schools in Portugal were organised into clusters where all units in a cluster shared the same teachers and principal; each of these clusters constituted a single sampling unit.
Sampling for the financial literacy assessment
All countries and economies, regardless of their participation in the financial literacy assessment, selected schools in the manner described above. However, countries/economies that participated in the financial literacy assessment sampled a larger number of students in each selected school. In this way, some students in these schools were presented with test forms that involved financial literacy booklets (along with booklets in mathematics, reading or both), while other students were presented with test forms that involved only the core subjects (mathematics, reading and science). To increase the size of the financial literacy student sample, financial literacy scores were imputed for those students who were given forms involving only mathematics and reading (forms 1 to 12); these students were then included in the financial literacy sample.
Table IV.A2.1 presents the number of students who comprised the financial literacy sample in each country/economy, and the number of 15-year-old students in each country/economy that the sample represented.
Table IV.A2.1. Sample size for financial literacy
| | Number of participating students (unweighted) | Number of participating students (weighted) | Percentage of students sitting the une heure (UH) booklet (%) |
|---|---|---|---|
| **OECD** | | | |
| Australia | 9 411 | 256 109 | 0.00 |
| Canadian provinces | 7 762 | 207 800 | 3.40 |
| Chile | 4 485 | 211 928 | 0.00 |
| Estonia | 4 167 | 11 543 | 0.00 |
| Finland | 4 328 | 55 318 | 0.81 |
| Italy | 9 182 | 521 823 | 0.00 |
| Latvia | 3 151 | 15 979 | 0.00 |
| Lithuania | 4 076 | 24 405 | 0.00 |
| Netherlands | 3 042 | 163 127 | 14.11 |
| Poland | 4 295 | 312 844 | 0.00 |
| Portugal | 4 568 | 98 021 | 0.00 |
| Slovak Republic | 3 411 | 42 575 | 2.83 |
| Spain | 9 361 | 413 345 | 0.00 |
| United States | 3 738 | 3 543 521 | 0.65 |
| **Partners** | | | |
| Brazil | 8 311 | 2 045 364 | 0.00 |
| Bulgaria | 4 110 | 47 910 | 0.00 |
| Georgia | 4 321 | 38 431 | 0.00 |
| Indonesia | 7 133 | 3 741 920 | 0.00 |
| Peru | 4 734 | 425 561 | 0.00 |
| Russia | 4 520 | 1 257 204 | 0.00 |
| Serbia | 3 874 | 60 923 | 0.00 |
References
[1] OECD (2019), PISA 2018 Results (Volume I): What Students Know and Can Do, PISA, OECD Publishing, Paris, https://doi.org/10.1787/5f07c754-en.
[2] OECD (forthcoming), PISA 2018 Technical Report, PISA, OECD Publishing, Paris.
[3] OECD (2019), PISA 2018 Results (Volume II): Where All Students Can Succeed, PISA, OECD Publishing, Paris, https://doi.org/10.1787/b5fd1b8f-en.
Notes
← 1. More precisely, PISA assessed students who were at least 15 years and 3 complete months old and who were at most 16 years and 2 complete months old (i.e. younger than 16 years, 2 months and roughly 30 days old), with a tolerance of one month on each side of this age window. If the PISA assessment was conducted in April 2018, as was the case in most countries, all students born in 2002 would have been eligible.
← 2. Educational institutions are generally referred to as schools in this publication, although some educational institutions (in particular, some types of vocational education establishments) may not be referred to as schools in certain countries.
← 3. As might be expected from this definition, the average age of students across OECD countries was 15 years and 9 months. The range in country means was 2 months and 13 days (0.20 year), from the minimum country mean of 15 years and 8 months to the maximum country mean of 15 years and 10 months (OECD, 2019[3]).
← 4. Such a comparison is complicated by first-generation immigrant students, who received part of their education in a country other than the one in which they were assessed. Mean scores in any country/economy should be interpreted in the context of student demographics within that country/economy.
← 5. Details for countries that applied different sampling designs are documented in the PISA 2018 Technical Report (OECD, forthcoming[2]).
← 6. Due to the small size of these education systems, all schools and all eligible students within these schools were included in the samples of Brunei Darussalam, Iceland, Luxembourg, Macao (China), Malta, Montenegro and Qatar.
← 7. The threshold for an acceptable participation rate after replacement varies between 85% and 100%, depending on the participation rate before replacement.
← 8. In particular, in the case of the Netherlands, non-response bias analyses relied on direct measures of school performance external to PISA, typically from national assessments. More indirect correlates of school performance were analysed in the United States, due to the absence of national assessments.
← 9. Information on exclusions was available only for the entire country of Canada, not for the seven Canadian provinces that took part in the financial literacy assessment.
← 10. These exclusions refer only to those students with limited proficiency in the language of instruction/assessment. Exclusions related to the unavailability of test material in the language of instruction are not considered in this analysis.
← 11. If the correlation between the propensity of exclusions and student performance were 0.3, then resulting mean scores would likely have been overestimated by 1 score point if the exclusion rate were 1%; by 3 score points if the exclusion rate were 5%; and by 6 score points if the exclusion rate were 10%. If the correlation between the propensity of exclusions and student performance were 0.5, then resulting mean scores would likely have been overestimated by 1 score point if the exclusion rate were 1%; by 5 score points if the exclusion rate were 5%; and by 10 score points if the exclusion rate were 10%. For this calculation, a model was used that assumed a bivariate normal distribution for performance and the propensity to participate.