This chapter describes the range of science competences assessed in PISA 2018 and reports the proportion of students who performed at each level of proficiency.
PISA 2018 Results (Volume I)
Chapter 7. What can students do in science?
Abstract
The PISA assessment of science focuses on measuring students’ ability to engage with science-related issues and with the ideas of science, as reflective citizens. Engaging in reasoned discourse about science and science-based technology requires a sound knowledge of facts and theories to explain phenomena scientifically. It also requires knowledge of the standard methodological procedures used in science, and knowledge of the reasons and ideas used by scientists to justify their claims, in order to evaluate (or design) scientific enquiry and to interpret evidence scientifically.
In contemporary societies, an understanding of science and of science-based technology is necessary not only for those whose careers depend on it directly, but also for any citizen who wishes to make informed decisions related to the many controversial issues under debate today – from personal issues, such as maintaining a healthy diet, to local issues, such as how to manage waste in big cities, to global and far-reaching issues, such as the costs and benefits of genetically modified crops or how to prevent and mitigate the negative consequences of global warming on physical, ecological and social systems.
What the data tell us
- On average across OECD countries, 78 % of students attained Level 2 or higher in science. At a minimum, these students can recognise the correct explanation for familiar scientific phenomena and can use such knowledge to identify, in simple cases, whether a conclusion is valid based on the data provided. More than 90 % of students in Beijing, Shanghai, Jiangsu and Zhejiang (China) (97.9 %), Macao (China) (94.0 %), Estonia (91.2 %) and Singapore (91.0 %) achieved this benchmark.
- On average across OECD countries, 6.8 % of students were top performers in science in 2018, meaning that they were proficient at Level 5 or 6. Almost one in three (32 %) students in Beijing, Shanghai, Jiangsu and Zhejiang (China), and more than one in five students in Singapore (21 %) performed at this level. In addition to skills associated with lower proficiency levels, these students can creatively and autonomously apply their knowledge of and about science to a wide variety of situations, including unfamiliar ones.
Science was the major domain assessed in both 2006 and 2015. The PISA science test was significantly expanded in 2015 to make use of the capabilities of computers, the new mode of delivery used in most participating education systems. For example, through its interactive interface, PISA 2015 was able, for the first time, to assess students’ ability to conduct scientific enquiry by asking test-takers to design (simulated) experiments and interpret the resulting evidence. The main part of this chapter covers the range of science proficiency as assessed in the computer-based test of science.
The nine countries that participated in PISA 2018 using pen-and-paper tests continued to use tasks designed initially for the 2006 assessment. Because some of these tasks were adapted and also used in countries that delivered the science test on computer, results can be reported on the same numeric scale (something that is particularly important for assessing performance trends over time that start from earlier pen-and-paper assessments, including in countries that conducted the PISA 2018 science test on computer). However, strictly speaking, these scores should be interpreted according to different descriptors of proficiency. When describing the performance of students in these nine countries, this chapter therefore also highlights the most relevant distinctions between the range of proficiency assessed through the pen-and-paper test (which does not include the ability to carry out experiments and conduct scientific enquiry) and the wider range assessed through computer delivery of the test.
The range of proficiencies covered by the PISA science test
As discussed in Chapter 2, student performance in PISA is reported as a score on a scale. To help interpret what students’ scores mean in substantive terms, the scale is divided into levels of proficiency that indicate the kinds of tasks that students at those levels are capable of completing successfully. The seven proficiency levels used in the PISA 2018 science assessment were the same as those established for the PISA 2015 assessment.1 The process used to produce proficiency levels in science is described in Chapter 2. Table I.7.1 illustrates the range of science competences covered by the PISA test and describes the skills, knowledge and understanding that are required at each level of the science scale.
Since it is necessary to preserve the confidentiality of the test material in order to continue to monitor trends in science beyond 2018, the questions used in the PISA 2018 assessment of science cannot be presented in this report. Instead, it is possible to illustrate the proficiency levels with questions that were released after previous assessments. Sample items that illustrate the different levels of science proficiency can be found in Annex C of PISA 2015 Results (Volume I) (OECD, 2016, pp. 462-481[1]) and online at www.oecd.org/pisa/test/.
Table I.7.1. Summary description of the seven levels of science proficiency in PISA 2018
Level | Lower score limit | Percentage of students able to perform tasks at each level or above (OECD average) | Characteristics of tasks
---|---|---|---
6 | 708 | 0.8 % | At Level 6, students can draw on a range of interrelated scientific ideas and concepts from the physical, life, and earth and space sciences and use content, procedural and epistemic knowledge in order to offer explanatory hypotheses of novel scientific phenomena, events and processes or to make predictions. In interpreting data and evidence, they are able to discriminate between relevant and irrelevant information and can draw on knowledge external to the normal school curriculum. They can distinguish between arguments that are based on scientific evidence and theory and those based on other considerations. Level 6 students can evaluate competing designs of complex experiments, field studies or simulations and justify their choices.
5 | 633 | 6.8 % | At Level 5, students can use abstract scientific ideas or concepts to explain unfamiliar and more complex phenomena, events and processes involving multiple causal links. They are able to apply more sophisticated epistemic knowledge to evaluate alternative experimental designs and justify their choices, and use theoretical knowledge to interpret information or make predictions. Level 5 students can evaluate ways of exploring a given question scientifically and identify limitations in interpretations of data sets, including sources and the effects of uncertainty in scientific data.
4 | 559 | 24.9 % | At Level 4, students can use more complex or more abstract content knowledge, which is either provided or recalled, to construct explanations of more complex or less familiar events and processes. They can conduct experiments involving two or more independent variables in a constrained context. They are able to justify an experimental design by drawing on elements of procedural and epistemic knowledge. Level 4 students can interpret data drawn from a moderately complex data set or less familiar context, draw appropriate conclusions that go beyond the data and provide justifications for their choices.
3 | 484 | 52.3 % | At Level 3, students can draw upon moderately complex content knowledge to identify or construct explanations of familiar phenomena. In less familiar or more complex situations, they can construct explanations with relevant cueing or support. They can draw on elements of procedural or epistemic knowledge to carry out a simple experiment in a constrained context. Level 3 students are able to distinguish between scientific and non-scientific issues and identify the evidence supporting a scientific claim.
2 | 410 | 78.0 % | At Level 2, students are able to draw on everyday content knowledge and basic procedural knowledge to identify an appropriate scientific explanation, interpret data and identify the question being addressed in a simple experimental design. They can use basic or everyday scientific knowledge to identify a valid conclusion from a simple data set. Level 2 students demonstrate basic epistemic knowledge by being able to identify questions that can be investigated scientifically.
1a | 335 | 94.1 % | At Level 1a, students are able to use basic or everyday content and procedural knowledge to recognise or identify explanations of simple scientific phenomena. With support, they can undertake structured scientific enquiries with no more than two variables. They are able to identify simple causal or correlational relationships and interpret graphical and visual data that require a low level of cognitive demand. Level 1a students can select the best scientific explanation for given data in familiar personal, local and global contexts.
1b | 261 | 99.3 % | At Level 1b, students can use basic or everyday scientific knowledge to recognise aspects of familiar or simple phenomena. They are able to identify simple patterns in data, recognise basic scientific terms and follow explicit instructions to carry out a scientific procedure.
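The lower score limits in Table I.7.1 define the bands into which any PISA science score falls. As a minimal illustration (the function name and structure are this author's sketch, not part of the PISA methodology), a score can be mapped to its proficiency level as follows:

```python
# Lower score limits for the seven PISA 2018 science proficiency levels,
# taken from Table I.7.1; scores below 261 fall "below Level 1b".
LEVEL_CUTOFFS = [
    ("6", 708),
    ("5", 633),
    ("4", 559),
    ("3", 484),
    ("2", 410),
    ("1a", 335),
    ("1b", 261),
]

def proficiency_level(score: float) -> str:
    """Return the proficiency level whose band contains the given score."""
    for level, lower_limit in LEVEL_CUTOFFS:
        if score >= lower_limit:
            return level
    return "below 1b"
```

For example, a score of 500 points falls within Level 3 (whose band starts at 484 points), while 710 points falls within Level 6.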
Percentage of students at the different levels of science proficiency
Figure I.7.1 and, for countries that used paper test booklets, Figure I.7.2, show the distribution of students across the seven proficiency levels in each participating country and economy. Table I.B1.3 (in Annex B1) shows these same percentages of students at each proficiency level on the science scale, with standard errors.
Proficiency at or above Level 2
Level 2 in science is an important benchmark for student performance: it represents the level of achievement, on the PISA scale, at which students begin to demonstrate the science competences that will enable them to engage in reasoned discourse about science and technology (OECD, 2018, p. 72[2]). At Level 2, the attitudes and competences required to engage effectively with science-related issues are only just emerging. Students demonstrate basic or everyday scientific knowledge, and a basic understanding of scientific enquiry, which they can apply mostly in familiar contexts. Students’ skills progressively expand to less familiar contexts, and to more complex knowledge and understanding at higher levels of proficiency.
Level 2 does not establish a threshold for scientific illiteracy. PISA views science literacy not as an attribute that a student has or does not have, but as a set of skills that can be acquired to a greater or lesser extent. It also does not identify a “sufficient” level of science literacy, particularly not for those whose careers will directly depend on an understanding of science and of science-based technology. However, Level 2 does establish a baseline threshold below which students typically require some support to engage with science-related questions, even in familiar contexts. For this reason, this report describes students performing below Level 2 as “low-achieving students”.
Proficiency at Level 2
At Level 2, students can draw on everyday content knowledge and basic procedural knowledge to identify an appropriate scientific explanation, interpret data, and identify the question being addressed in a simple experimental design. They can use common scientific knowledge to identify a valid conclusion from a simple data set. Level 2 students demonstrate basic epistemic knowledge by being able to identify questions that could be investigated scientifically.
Level 2 can be considered as the level of science proficiency at which students begin to demonstrate the competences that will enable them to engage effectively and productively with issues related to science and technology. More than 90 % of students in Beijing, Shanghai, Jiangsu and Zhejiang (China) (hereafter “B-S-J-Z [China]”) (97.9 %), Macao (China) (94.0 %), Estonia (91.2 %) and Singapore (91.0 %) met this benchmark. Across OECD countries, an average of 78 % of students attained Level 2. Meanwhile, only about one in seven students in the Dominican Republic (15 %) attained this level of proficiency, as did fewer than half of students (though more than 20 %) in 15 other countries and economies (Figure I.7.1, Figure I.7.2 and Table I.B1.3).
Proficiency at Level 3
At Level 3, students can draw upon moderately complex content knowledge to identify or construct explanations of familiar phenomena. In less familiar or more complex situations, they can construct explanations with relevant cueing or support. They can draw on elements of procedural or epistemic knowledge to carry out a simple experiment in a constrained context (the ability to carry out experiments was not assessed in paper-based tests). Level 3 students can distinguish between scientific and non-scientific issues and identify the evidence supporting a scientific claim.
On average across OECD countries, more than half of all students (52 %) were proficient at Level 3 or higher (that is, at Level 3, 4, 5 or 6). The average median score across OECD countries, i.e. the score that divides the population into two equal halves (one half scoring above the median, and the other half below), fell within Level 3. Similarly, Level 3 corresponds to the median proficiency of students in 29 participating countries and economies. Across OECD countries, on average, 27 % of students scored at Level 3, the largest share amongst the seven proficiency levels described in PISA. Similarly, in 30 countries and economies, the largest share of students performed at Level 3 (Figure I.7.1, Figure I.7.2 and Table I.B1.3).
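The median described above can be illustrated with a small sketch. The scores below are invented for illustration only (actual PISA medians are estimated from weighted plausible values, not raw lists of scores):

```python
def median(scores):
    """Middle value of a sorted list (mean of the two middle
    values when the list has an even number of entries)."""
    s = sorted(scores)
    n = len(s)
    mid = n // 2
    if n % 2 == 1:
        return s[mid]
    return (s[mid - 1] + s[mid]) / 2

# Invented scores for illustration: the median, 500, falls within
# the Level 3 band, which runs from 484 to just under 559 points.
sample = [350, 420, 470, 500, 530, 580, 640]
print(median(sample))  # 500
```

With three scores below 500 and three above, 500 splits this (odd-length) list into two equal halves, which is exactly the property the text describes.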
Proficiency at Level 4
At Level 4, students can use more sophisticated content knowledge, which is either provided or recalled, to construct explanations of more complex or less familiar events and processes. They can conduct experiments involving two or more independent variables in a constrained context (the ability to conduct experiments was not assessed in paper-based tests). They can justify an experimental design, drawing on elements of procedural and epistemic knowledge. Level 4 students can interpret data drawn from a moderately complex data set or less familiar contexts and draw appropriate conclusions that go beyond the data and provide justifications for their choices.
On average across OECD countries, 25 % of students performed at Level 4 or above, i.e. scored higher than 559 points on the PISA science scale. In B-S-J-Z (China) and Singapore, the largest share of students performed at this level (the modal level); Level 4 was also the median level of performance in B-S-J-Z (China) and Singapore (Figure I.7.1, Figure I.7.2 and Table I.B1.3).
Proficiency at Level 5
At Level 5, students can use abstract scientific ideas or concepts to explain unfamiliar and more complex phenomena, events and processes. They can apply more sophisticated epistemic knowledge to evaluate alternative experimental designs, justify their choices and use theoretical knowledge to interpret information or make predictions. Students at this level can evaluate ways of exploring a given question scientifically and identify limitations in the interpretation of data sets, including sources and the effects of uncertainty in scientific data.
Level 5 on the science scale marks another qualitative difference. Students who can complete Level 5 tasks can be said to be top performers in science in that they are sufficiently skilled in and knowledgeable about science to be able to creatively and autonomously apply their knowledge and skills to a wide variety of situations, including unfamiliar ones.
On average across OECD countries, 6.8 % of students were top performers, meaning that they were proficient at Level 5 or 6. Almost one in three (32 %) students in B-S-J-Z (China), and more than one in five students in Singapore (21 %) performed at this level. In 9 countries/economies (Macao [China], Japan, Finland, Estonia, Korea, Chinese Taipei, Canada, New Zealand and the Netherlands, in descending order of the share of students), between 10 % and 14 % of all students performed at Level 5 or above. By contrast, in 27 countries/economies, including Colombia (0.5 %) and Mexico (0.3 %), fewer than one in 100 students was a top performer (Figure I.7.1, Figure I.7.2 and Table I.B1.3).
Countries and economies with similar mean performance may have significantly different shares of students who are able to perform at the highest levels in PISA. This is true, for example, in Hong Kong (China) (with a mean performance of 517 points and where 7.8 % of students were top performers) and Chinese Taipei (with a mean performance of 516 points and where 11.7 % of students were top performers). The smaller share of top-performing students in Hong Kong (China) compared to Chinese Taipei reflects a narrower variation of student performance around the mean.
Proficiency at Level 6
Students at Level 6 on the PISA science scale can successfully complete the most difficult items in the PISA science assessment. At Level 6, students can draw on a range of interrelated scientific ideas and concepts from the physical, life, and earth and space sciences. They can use procedural and epistemic knowledge to offer explanatory hypotheses of novel scientific phenomena, events and processes that require multiple steps or to make predictions. In interpreting data and evidence, they can discriminate between relevant and irrelevant information and can draw on knowledge external to the normal school curriculum. They can distinguish between arguments that are based on scientific evidence and theory, and those based on other considerations. Students at Level 6 can evaluate competing designs of complex experiments, field studies or simulations and justify their choices.
On average across OECD countries, 0.8 % of students (or about 1 in 120 students) attained Level 6. B-S-J-Z (China) had the largest proportion of students (7.3 %) who scored at this level in science, followed by Singapore (3.8 %). In 14 participating countries and economies, between 1 % and 2 % of students scored at this level, while in the remaining countries/economies, fewer than 1 in 100 students scored at the highest level (Figure I.7.1, Figure I.7.2 and Table I.B1.3).
Proficiency below Level 2
The PISA science assessment identified two proficiency levels below Level 2. Students who scored at or below these levels are considered low achievers in science.
Proficiency at Level 1a
At Level 1a, students can use common content and procedural knowledge to recognise or identify explanations of simple scientific phenomena. With support, they can undertake structured scientific enquiries with no more than two variables (the ability to undertake scientific enquiry was not assessed in the paper-based test of science). They can identify simple causal or correlational relationships, and interpret graphical and visual data that require a low level of cognitive demand. Students at Level 1a can select the best scientific explanation for given data in familiar personal, local and global contexts.
On average across OECD countries, 16.1 % of students performed at Level 1a and only 5.9 % of students performed below Level 1a. In the Dominican Republic, fewer than one in two students (about 47 %) attained Level 1a (or a higher level of performance). In 15 countries and economies (including some countries that used the paper-based test of science), the median proficiency level of the 15-year-old student population was within Level 1a (Figure I.7.1, Figure I.7.2 and Table I.B1.3).
Proficiency at Level 1b
At Level 1b, students can use common content knowledge to recognise aspects of simple scientific phenomena. They can identify simple patterns in data, recognise basic scientific terms and follow explicit instructions to carry out a scientific procedure.2
Across OECD countries, 5.2 % of students performed at Level 1b and 0.7 % performed below Level 1b. In 44 countries and economies, less than 10 % of students performed at or below Level 1b (Figure I.7.1, Figure I.7.2 and Table I.B1.3).
No item in the PISA assessment can indicate what students who perform below Level 1b can do. Students who scored below Level 1b may have acquired some elements of science knowledge and skills, but based on the tasks included in the PISA test, their ability can only be described in terms of what they cannot do – and they are unlikely to be able to solve, other than by guessing, any of the PISA tasks. In some countries, more than 1 in 20 students performed below Level 1b: 14 % in the Dominican Republic, 10 % in Panama, and between 9 % and 5 % in Lebanon, the Philippines, Georgia and Qatar (in descending order of that share).
References
[1] OECD (2016), PISA 2015 Results (Volume I): Excellence and Equity in Education, PISA, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264266490-en.
[2] OECD (2018), “PISA for Development Science Framework”, in PISA for Development Assessment and Analytical Framework: Reading, Mathematics and Science, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264305274-6-en.
[3] OECD (forthcoming), PISA 2018 Technical Report, OECD Publishing, Paris.
Notes
← 1. Six of the seven levels are aligned with the levels used in describing the outcomes of PISA 2006 (ranging from the highest, Level 6, to Level 1a, formerly known as Level 1). These levels and their respective descriptors are still applicable to the paper-based assessment of science.
← 2. Descriptions of what students can do at Level 1b are based on items included in the PISA 2015 science assessment. In 2018, only one item in the paper-based test of science was located at this level; the easiest tasks included in the PISA 2018 computer-based test of science were located at Level 1a. It is nevertheless possible to estimate for every student the likelihood of scoring at Level 1b, based on how he or she responded to Level 1a tasks (for a student whose proficiency lay below the level of difficulty of the task, the probability of a correct response varied between 0 and 62 %, depending on how far below the task’s difficulty the student’s proficiency lay) and on the science performance of students with similar response patterns in 2015. From these individual estimates of the (posterior) likelihood of performance, it is also possible to obtain country-level estimates of the share of students at each proficiency level. See the PISA 2018 Technical Report (OECD, forthcoming[3]) for details.