PISA 2018 Results (Volume V)
Chapter 1. How PISA examines effective policies and successful schools
Abstract
This chapter defines the four areas of school organisation that are examined in Volume V of the PISA 2018 Results: grouping and selecting students; resources invested in education; governance of education systems; and evaluation and assessment. It also discusses how much of the variation in student performance is related to system-, school- and student-level factors, and how to interpret the data presented.
Worldwide trends such as globalisation, technological change and growing inequality are posing new challenges to education systems and schools around the world (OECD, 2019[1]). School-management policies and practices play a key role in determining how education systems can respond to these challenges.
This volume describes school organisation – the policies and practices that define how education systems and schools work and change over time (Bidwell, 2001[2]) – in the 79 countries/economies that participated in PISA 2018. It examines ways in which school organisation is related to performance, equity in students’ learning outcomes and student well-being. The volume also analyses trends in school organisation to understand how schools and school systems have changed during the past decade, and how these changes are related to changes in performance and equity in students’ learning outcomes.
Building on the experience of prior PISA reports (OECD, 2016[3]; OECD, 2013[4]; OECD, 2016[5]), this volume focuses on four policy-relevant areas of school organisation (Figure V.1.1):
Grouping and selecting students – the structure of instructional grades and programmes that students must complete in order to graduate from schooling (i.e. vertical stratification), and how students are grouped and selected into different curricular programmes and ability groups (i.e. horizontal stratification)
Resources invested in education – the amount and kind of human resources (i.e. teacher and support staff) and material resources (i.e. physical infrastructure and pedagogical materials, including computers and other digital devices) available for schools, and how these resources are allocated and used; the amount of financial resources invested in education (i.e. expenditure per student over the theoretical duration of studies); the amount of students’ learning time that takes place during regular school hours for key subjects, such as language of instruction, mathematics and science; and the learning opportunities that schools offer to their students after regular school hours (e.g. additional lessons, support with homework, extracurricular activities)
Education system governance – how public and private organisations are involved in the administration and funding of schools, and the degree of school choice and school competition
Evaluation and assessment – the policies and practices through which education systems assess student learning and evaluate teacher practices and school outcomes.
For each of these policy areas of school organisation, the report explores three main questions:
1. What are the main cross-country differences in school-organisation policies and practices? And how does school organisation vary within countries according to school characteristics, such as the school’s socio-economic profile, location and public or private ownership (according to PISA 2018 data)?
2. How are school-organisation policies and practices changing over time (across PISA cycles)?
3. What is the relationship between these school-organisation policies and practices, and student achievement and equity? What is the relationship between changes in policies and practices over time and changes in education outcomes (performance and equity)?
Performance differences amongst school systems, schools and students
As discussed in Volume I of PISA 2018 Results, academic performance amongst 15-year-old students varies widely, and that variation can be broken down into differences at the student, school and school-system levels. In PISA 2018, across all countries and economies, about 23% of the variation in reading performance was attributable to differences in mean performance between the participating school systems (Figure V.1.2). Across OECD countries, 6% of the variation in reading performance lay between school systems. On average across all participating countries and economies, about 33% of the variation in reading performance within countries lay between schools and 67% lay within schools. Across OECD countries, 31% of the variation in reading performance within countries lay between schools and 69% lay within schools.
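One way to make this decomposition concrete is a null (intercept-only) multilevel model with a random intercept for schools, in which the share of variance lying between schools is the intraclass correlation. The sketch below is illustrative only: the column names (score, school_id) are assumptions, and it ignores the plausible values, final student weights and replicate weights used in the official PISA estimates.

```python
# Minimal sketch of a between-/within-school variance decomposition for one country.
# Assumed, illustrative column names: score (reading score), school_id.
# Official PISA estimates additionally use plausible values and BRR replicate weights.
import pandas as pd
import statsmodels.formula.api as smf

def school_variance_shares(df: pd.DataFrame):
    """Fit an intercept-only random-intercept model and return the shares of
    variance lying between and within schools."""
    model = smf.mixedlm("score ~ 1", df, groups=df["school_id"])
    result = model.fit()
    between = float(result.cov_re.iloc[0, 0])  # variance of school intercepts
    within = float(result.scale)               # residual (student-level) variance
    total = between + within
    return between / total, within / total

# Example (hypothetical file):
# df = pd.read_csv("reading_scores.csv")
# between_share, within_share = school_variance_shares(df)
```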
This chapter relates school organisation to student performance within and between countries/economies. It also analyses differences between countries and economies in the relationships amongst school organisation, performance in reading, and the level of equity in a school system. The cross-national analyses provide an overview of how system-level attributes and key organisational arrangements are related to student performance, equity in school systems and student well-being. As always, such relationships are associations; establishing causality would require further study and is beyond the scope of this report (Box V.1.1).
Box V.1.1. Interpreting the data from students and schools
PISA 2018 asked students and school principals to answer questions about the organisation of schools, and the social and economic contexts in which learning takes place. Information based on their responses was weighted so that it reflects the number of 15-year-old students enrolled in grade 7 or above. These are reports provided by principals and students themselves rather than external observations, and thus may be influenced by cultural differences in how individuals respond.1
In addition to the general constraints of self-reported data, there are other limitations, particularly those concerning the information collected from principals or the interpretation of school-level results, that should be taken into account when interpreting the data.
The learning environment examined by PISA may only partially reflect the one that shaped students’ experiences earlier in their school careers, particularly in school systems where students progress through different types of educational institutions at the pre-primary, primary, lower secondary and upper secondary levels. To the extent that students’ current learning environment differs from that of their earlier school years, the contextual data collected by PISA are an imperfect proxy for students’ cumulative learning environments, and the effects of those environments on learning outcomes are likely to be underestimated. In most cases, 15-year-old students have been in their current school for only two to three years, which means that much of their academic development took place earlier, in other schools that may have little or no connection with the school in which they were enrolled when they sat the PISA test.
In some countries and economies, the definition of the school in which students are taught is not straightforward because schools vary in the level and purpose of education. For example, in some countries and economies, subunits within schools (e.g. study programmes, shifts and campuses) were sampled instead of schools as administrative units (see Annex A2 for further information).
Although principals can provide information about their schools, generalising from a single source of information for each school and then matching that information with students’ reports is not straightforward. Also, principals’ perceptions may not be the most accurate source for some information related to teachers, such as teachers’ morale and commitment.
The age-based sampling followed in PISA means that, in some education systems, students are not always representative of their schools. Interpreting differences between schools appropriately therefore requires specific knowledge about how school systems are structured.
Despite these caveats, information from the school questionnaire provides unique insights into the ways in which national and subnational authorities seek to realise their education objectives.
Schooling and school effects
When using results from non-experimental data on school performance, such as the PISA database, it is important to bear in mind the distinction between school effects and the effects of schooling, particularly when interpreting the modest association between factors such as school resources, policies and institutional characteristics, on the one hand, and student performance, on the other. “School effects” is education researchers’ shorthand for the effect on academic performance of attending one school or another, usually schools that differ in resources, policies or institutional characteristics. Where schools and school systems do not vary in fundamental ways, the school effect can be modest. Nevertheless, a modest school effect should not be confused with the absence of an effect of schooling (the influence on performance of being schooled compared with not being schooled at all).
Interpreting correlations and changes over time
A correlation indicates the strength and direction of a linear relationship, either positive or negative, between two variables. A correlation is a simple statistic that measures the degree to which two variables are associated with each other, but does not prove causality between the two.
Comparisons over time of the relationships between resources, policies and practices, and reading performance (trend analyses) should also be interpreted with caution. Changes in the strength of the relationship between policies and practices, and reading performance cannot be considered causal, because they can arise for two different reasons. First, a particular set of resources, policies and practices might have been adopted by higher-performing students, schools or systems, while not being available to lower-performing students, schools or systems. Under this interpretation, the relationship between reading performance and resources, policies and practices appears stronger because those resources, policies and practices are concentrated amongst higher-performing students, schools or systems. Second, a particular set of resources, policies and practices may have been used more extensively in 2018 than earlier, and may have promoted student learning more in 2018 than before. PISA trend data indicate where changes have occurred; however, further analysis is needed to understand the nature of those changes.
Interpreting results before and after accounting for socio-economic status
When examining the relationship between education outcomes and resources, policies and practices within school systems, this volume takes into account socio-economic differences amongst students, schools and systems. The advantage of doing so lies in comparing similar entities, namely students, schools and systems with similar socio-economic profiles. At the same time, there is a risk that such adjusted comparisons understate the strength of the relationship between student performance and resources, policies and practices, since differences in performance are often attributable to both policies and socio-economic status.
Conversely, analyses that do not take socio-economic status into account can overstate the relationship between student performance and resources, policies and practices, as the level of resources and the kinds of policies adopted may also be related to the socio-economic profile of students, schools and systems. At the same time, analyses without adjustments may paint a more realistic picture of the schools that parents choose for their children. They may also provide more information for other stakeholders who are interested in the overall performance of students, schools and systems, including any effects that may be related to the socio-economic profile of schools and systems. For example, parents may be primarily interested in a school’s absolute performance standards, even if that school’s higher achievement record stems partially from the fact that the school has a larger proportion of advantaged students.
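As an illustration of the difference between unadjusted and adjusted estimates, the sketch below contrasts a regression of reading performance on a hypothetical school practice with and without a control for students’ socio-economic status. The variable names (score, practice, escs) and the input file are assumptions, and the sketch again ignores plausible values and replicate weights.

```python
# Minimal sketch: unadjusted vs socio-economically adjusted association between a
# school practice and reading performance. All names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # hypothetical data with score, practice, escs columns

unadjusted = smf.ols("score ~ practice", data=df).fit()
adjusted = smf.ols("score ~ practice + escs", data=df).fit()

# The unadjusted coefficient describes schools as parents encounter them; the
# adjusted coefficient compares students with similar socio-economic status.
print(unadjusted.params["practice"], adjusted.params["practice"])
```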
For the system-level analyses, in order to account for the extent to which the observed relationships are influenced by the level of economic development of countries and economies, correlations are examined before and after accounting for per capita GDP.
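One way to read “accounting for per capita GDP” is as a partial correlation, in which each variable is residualised on per capita GDP before the correlation is computed. The sketch below assumes one observation per country/economy and illustrative variable names (x for a system-level policy indicator, y for mean reading performance, gdp for per capita GDP); the official analyses may differ in detail.

```python
# Minimal sketch of a system-level correlation before and after accounting for
# per capita GDP, via residualisation. Variable names are illustrative assumptions.
import numpy as np

def correlation_before_after_gdp(x, y, gdp):
    x, y, gdp = (np.asarray(v, dtype=float) for v in (x, y, gdp))
    raw = np.corrcoef(x, y)[0, 1]

    def residualise(v):
        # Residuals of v after a simple linear regression on per capita GDP.
        slope, intercept = np.polyfit(gdp, v, deg=1)
        return v - (slope * gdp + intercept)

    adjusted = np.corrcoef(residualise(x), residualise(y))[0, 1]
    return raw, adjusted
```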
Interpreting the results by school characteristics
When presenting results by the socio-economic profile of schools, the location of schools, the type of school or the education level, the number of students and schools in each subsample has to meet the PISA reporting requirements of at least 30 students and 5 schools. Even when these requirements are met, readers should interpret the results cautiously when the number of students or schools is only just above the threshold. Tables in Annex A5, available online, show the unweighted number of students and schools, by school characteristics, in the PISA sample so that readers can interpret the results appropriately.
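A minimal sketch of how such a reporting threshold can be applied in practice is shown below; the column names (country, subgroup, student_id, school_id) are assumptions used only for illustration.

```python
# Minimal sketch: flag which subgroups meet the PISA reporting thresholds
# (at least 30 students and 5 schools). Column names are illustrative assumptions.
import pandas as pd

def reporting_flags(df: pd.DataFrame, min_students: int = 30, min_schools: int = 5) -> pd.DataFrame:
    counts = (df.groupby(["country", "subgroup"])
                .agg(n_students=("student_id", "size"),
                     n_schools=("school_id", "nunique"))
                .reset_index())
    counts["reportable"] = ((counts["n_students"] >= min_students) &
                            (counts["n_schools"] >= min_schools))
    return counts
```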
1. While PISA aims to maximise the cross-national and cross-cultural comparability of complex constructs, it must do so while keeping the questionnaires relatively short and minimising the perceived intrusiveness of the questions. Despite the extensive investments PISA makes in monitoring the process of translation, standardising the administration of the assessment, selecting questions and analysing the quality of the data, full comparability across countries and subpopulations cannot always be guaranteed.
This is the fifth of six volumes that present the results from PISA 2018. It begins, in this first chapter, by providing the rationale and analytical framework for the report. Chapters 2 and 3 explore policies and practices related to vertical and horizontal stratification. Chapter 4 discusses human resources and Chapter 5 examines material resources. Chapter 6 looks at student learning time. Chapter 7 discusses private schools and school competition. Chapter 8 analyses evaluation and assessment practices. The concluding chapter discusses the policy implications of the results.
References
[2] Bidwell, C. (2001), “Analyzing Schools as Organizations: Long-Term Permanence and Short-Term Change”, Sociology of Education, Vol. 74, pp. 100-114, http://dx.doi.org/10.2307/2673256.
[1] OECD (2019), Trends Shaping Education 2019, OECD Publishing, Paris, https://dx.doi.org/10.1787/trends_edu-2019-en.
[5] OECD (2016), Low-Performing Students: Why They Fall Behind and How To Help Them Succeed, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264250246-en.
[3] OECD (2016), PISA 2015 Results (Volume II): Policies and Practices for Successful Schools, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264267510-en.
[4] OECD (2013), PISA 2012 Results: What Makes Schools Successful (Volume IV): Resources, Policies and Practices, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264201156-en.