Rien Rouw
Ministry of Education, Culture and Science, the Netherlands
Quirine van der Hoeven
Ministry of Education, Culture and Science, the Netherlands
This chapter presents the use of knowledge within the Dutch Ministry of Education, Culture and Science based on a concrete policy case: the revision of the curriculum in primary and secondary education. Employing a framework for analysing “quality use of research evidence”, the chapter shows what types of knowledge were applied at different stages of the revision process. It also describes the knowledge infrastructure surrounding this case, the culture within the ministry and the composition of the policy team. The analysis examines whether the (research) knowledge provided was appropriate and whether policy makers engaged with the evidence thoroughly and systematically.
This chapter presents a reconstruction of how knowledge in general, and research knowledge in particular, was used in a specific policy case: the long-running policy initiative on curriculum revision. This initiative started in 2013 and was partly ignited by the reports of two strategic advisory councils that urged the government to assess and redesign the primary and secondary education curriculum. Both internal educational reasons and societal developments drove the request for curriculum renewal. The curriculum revision process is particularly interesting because it was, and still is, a long trajectory that developed in a non-linear and sometimes even bumpy way, like so many policy initiatives in a field as complex as education. As this chapter will show, a variety of knowledge sources played a role during the renewal process, and different sources came to the fore at different stages. Most of the time, however, a systematic appraisal of those sources was lacking. This chapter will argue that thorough engagement with evidence requires specific structures and mechanisms within the ministry.
For the analysis, we applied the Quality Use of Research Evidence (QURE) framework developed by Rickinson et al. (2020[1]) for investigating the use of knowledge in educational practice. The chapter will demonstrate that using this framework led to a more nuanced and richer perspective on the knowledge used during the policy process.
Before that, however, the chapter briefly describes the main characteristics of the Dutch education system and the most important elements of the knowledge infrastructure in education practice and policy.
One of the key features of the Dutch education system, guaranteed under Article 23 of the Constitution, is freedom of education, i.e. the right of any citizen to found schools and provide teaching based on religious, ideological or educational beliefs. The Dutch system also offers freedom of choice for students and their parents; they can choose the school that best fits their expectations and world views. The system contains both public schools and privately governed, publicly funded schools. Both types of schools are funded equally.
The Dutch education system is, therefore, a decentralised system with distributed responsibilities. The Ministry of Education, Culture and Science is responsible for national education policies. Boards are responsible for running the schools. “Freedom to organise teaching” means that schools are free to determine – within legal boundaries – what is taught and how. The Ministry of Education, Culture and Science, however, sets quality standards that apply to both public and publicly funded schools, since the ministry is ultimately responsible for the educational quality of schools. These standards prescribe the subjects to be studied; the attainment targets or examination syllabuses; the content of national examinations; the number of teaching periods per year; the qualifications teachers are required to have; the say parents and pupils have in school matters; and planning and reporting obligations.
A key actor in the knowledge infrastructure for education is the Netherlands Initiative for Education Research (NRO), a research funding organisation. The NRO funds fundamental, curiosity-driven research as well as research oriented towards policy and education practice. Beyond funding, the NRO is also tasked with disseminating and promoting the use of research. Among other activities, it establishes thematic web pages on research, provides guidance reports and maintains a so-called knowledge roundabout, an online platform where experts answer questions from practitioners and policy makers.
Other main parties include the Dutch Inspectorate of Education, which is responsible for the inspection and supervision of school governing boards and schools, and the Dutch Education Council, which advises the government and parliament on education policy and legislation. The council’s work culminates in evidence-based studies and advisory reports focused on offering solutions for the long term.
Three quasi-governmental agencies are tasked with specific functions in the research and development infrastructure around schools and the ministry: Stichting Leerplan Ontwikkeling (SLO), Centraal Instituut voor Toetsontwikkeling (CITO) and College voor Toetsen en Examens (CvTE). The SLO is the Netherlands Institute for Curriculum Development. The SLO not only develops curricula but also performs research into the realisation of curricula in classrooms. CITO, the National Institute for the Development of Tests, gathers data on student performance from the CITO tests and examinations. Research and development is a core activity. The CvTE is the National Board of Tests and Examinations, responsible for the quality and proper administration of national tests.
At the central level, the Ministry of Education, Culture and Science’s Knowledge Unit is responsible for the overall knowledge infrastructure. The knowledge advisors in this unit work as brokers between research and policy and play an active role in advising policy makers during the different phases of policy development, particularly in integrating evidence in all phases.
In addition to the Knowledge Unit, research co-ordinators within the various policy directorates play an important role, particularly in supporting policy makers with evaluations. The research co-ordinators form a network, which is co-ordinated by the Knowledge Unit. Additionally, policy officers are supported by information specialists in the use of data. Figure 4.1 depicts the knowledge landscape and the network of actors.
The story of curriculum revision in primary and secondary education illustrates the use and non-use of (research) knowledge in a concrete case. The case of curriculum renewal is interesting because different types of knowledge are involved in a highly politicised discussion about what students should learn. This section first briefly sketches the revision trajectory before turning to the “knowledge arena” around the revision process and analysing the ways in which policy makers used the knowledge.
Broadly speaking, the curriculum revision process can be divided into three phases (Figure 4.2). The first phase was characterised by a highly participatory trajectory and the subsequent publication of the advisory report Education 2032. In the second phase, the teacher-led development of so-called curriculum building blocks took centre stage, under the label Curriculum.nu. After a rather critical assessment of the building blocks in parliament, and a political standstill caused by elections and the formation of a new government, the process entered the current, third phase: a step-by-step approach to the revision.
The revision process started in 2014, partly based on the reports of two advisory councils. In response to these recommendations, the then State Secretary Sander Dekker gave the go-ahead to revise the basic education curriculum in a letter to parliament on Future-oriented Basic Education. He established the so-called Education 2032 platform, led by a highly esteemed government advisor and senator. After a broad social dialogue, extensive stakeholder consultation and a study of scientific insights and international comparisons, the committee published its report.
The report, however, drew several criticisms in parliament (Box 4.1). Confronted with parliament’s critical position, and after consulting teacher organisations and organisations of school boards, the State Secretary initiated a new approach: the teacher-led development of “building blocks” for the curriculum. The project was renamed Curriculum.nu.
Members of parliament expressed criticisms concerning the curriculum revision process at various stages. Following the committee’s publication of a first proposal, some felt that teachers did not sufficiently support this initial report and that the report lacked scientific evidence. This criticism was partly inspired by teachers who were active on social media and were also invited to hearings in parliament.
In late 2019, several other points were raised in the debate. There was still concern regarding a perceived lack of scientific underpinning to the curriculum reform proposal and the issue of missing support among teachers. Additionally, parliament was worried about the necessary preconditions (time, space, resources) for implementation in a time of teacher shortages and high work pressure.
In 2018 and 2019, so-called development groups, made up of teachers and school leaders from primary and secondary education, worked on formulating learning goals, consulting the field and the broader public at regular intervals. The results were presented to the Minister of Education at the end of 2019, and the minister sent the building blocks to parliament in December of that year.
Again, members of parliament were critical of the proposals (see Box 4.1). Several members insisted on establishing a Temporary Scientific Curriculum Committee to validate the proposals of the teacher-led teams. This committee was formed in September 2020 and published several reports in 2021 and 2022, including one on the usefulness of the proposals mentioned above and another on the main lines of the revision of the curriculum goals.
In the meantime, the cabinet resigned and elections were held. As a consequence, parliament declared the curriculum revision “controversial”, and the development of the new curriculum came virtually to a halt. Nevertheless, several discussions were held in parliament, with members continuing to voice concerns about the over-ambition of a comprehensive reform and the level of support among teachers.
Only in 2022 did the newly appointed minister propose to parliament to revise the curriculum step by step, starting with the attainment targets for numeracy, literacy, digital skills and citizenship. Furthermore, the minister announced that he would develop a system of periodic curriculum maintenance to ensure that future curriculum reform takes place when necessary and more independently of political whims. Based on advice from the Scientific Curriculum Committee, this system is currently being developed.
This section briefly describes how various forms of knowledge were used by the policy teams that managed the curriculum revision process. We applied the QURE framework (Figure 4.3), developed by the Australian researcher Mark Rickinson and his team (2020[1]). Although the framework was developed to assess the use of research knowledge in education practice, in our experience, it also proved highly valuable for self-assessment in policy processes.
We will start our description with the outer ring. Several developments were relevant in the case of curriculum revision. At the beginning of the process, around 2013, the discourse on what could be called “future skills” was dominant. This discourse focuses on the changing society, economy and labour market, and the skills needed to flourish in a dynamic world. In the course of the revision process, however, the dominant discourse changed to one of lacking basic skills, particularly after the publication of the 2018 results of the Programme for International Student Assessment (PISA). These showed a sharp decline in reading skills in the Netherlands, which caused considerable concern among politicians and influential opinion makers. In addition, the inspectorate repeatedly reported on declining basic skills, for example in its influential yearly State of Education report (Inspectorate of Education, 2022[2]). The issue of teacher shortages also became ever more urgent between 2013 and 2020. Combined with discussions about heavy workloads and teachers’ salaries, this led to a waning appetite for a large-scale curriculum revision and paved the way for the step-by-step approach that is currently applied: rather than a comprehensive revision all at once, the curriculum is renewed subject by subject, starting with the so-called basic skills of reading, mathematics, digital skills and citizenship.
We observed a strong political-administrative orientation among directors-general, directors and heads of division within the ministry. As the Advisory Council for Science, Technology and Innovation (AWTI) stated (2021[3]), much time and energy in the civil service is spent on incidents and short-term issues. According to the AWTI, this leads to a culture in which political sensitivity is valued more than content expertise and knowledge. Human resource policies focus on recruiting and developing generalists and process specialists rather than content experts.
However, this is only one part of the story. There is also an undercurrent of strengthening the evidence base and investing in the knowledge infrastructure for both policy and practice. For example, in the case of curriculum revision, a research programme was developed together with the NRO, among others, to monitor and evaluate the introduction of the new curriculum.
In terms of leadership in the broader knowledge infrastructure, the ministry plays a leading role in arranging the system, enabling organisations to play their roles within it and regulating responsibilities. Specifically for the knowledge infrastructure in the domain of the curriculum, the Scientific Curriculum Committee stated that the ministry should more strongly co-ordinate and structure the “curriculum chain”. According to the committee, co-ordination of the knowledge ecosystem was weak: there are several strong nodes of knowledge, like the SLO, the NRO and the inspectorate, but the links between those nodes are weak, resulting in a lack of knowledge circulation (SCC, 2022, pp. 17-18[4]). This analysis led the minister to announce the establishment of a new co-ordination body to organise the information flows in the curriculum system.
In terms of individual policy makers’ skillsets and mindsets, we noted that the policy team within the ministry leading the curriculum revision had a good mix of skills, including communication, education practice, research literacy, and programme and process management. Team members came from both policy and staff departments, i.e. the legal, communications and knowledge departments. Several team members had extensive experience in policy making and had developed strong political-administrative intuition. Compared to other teams, this team was highly experienced and had a diverse range of expertise.
The mindsets within the team were a mixture of political-administrative orientation, sensitivity to stakeholder perspectives, sensitivity to public and media discourse, and inquisitiveness about what works. To illustrate this last point, the team was very interested in knowledge about the implementation and realisation of curricula in schools and wanted to learn from other countries’ experiences. Research was thus commissioned into lessons from large-scale curriculum revisions across the globe (Nieveen et al., 2022[5]), and study visits were made to several countries.
We now turn to the centre of the framework. Could the team benefit from appropriate research evidence? It turns out that several types of knowledge were available to inform the policy, as shown in Table 4.1. Some of them are rather straightforward, such as scientific research, curriculum research, inspection research and international comparative research. This research provides insights into the implementation and realisation of the curriculum in schools and classrooms. Knowledge and experience drawn from practice can deepen these insights by adding a much more fine-grained picture of what happens in daily practice. Some academics and teachers also turned into advocates or activists, voicing their views in the (social) media, which also shaped the context for politics and policy.
Table 4.1. Types of knowledge and their providers

| Type of knowledge | Provider |
|---|---|
| Trends/explorative/foresight | (Scientific) strategic advisory councils: Scientific Council for Government Policy, Social-Economic Council, Education Council |
| High-level committee advice (mixture of practice expertise, scientific research, public consultation, political-administrative knowledge) | High-level committee, e.g. Education 2032 |
| Scientific research and advice (educational sciences, political sciences); “shadow advice”/academic advocacy | Academics, i.e. the Temporary Scientific Curriculum Committee; “activist academics” |
| Curriculum research, e.g. implementation research | Academics and centres of expertise, i.e. the SLO |
| Practice expertise; advocacy | Teachers, school leaders; “activist teachers” |
| Inspection research | Inspectorate |
| International comparative research | OECD (PISA), IEA (PIRLS, TIMSS) |
Notes: OECD: Organisation for Economic Co-operation and Development; IEA: International Association for the Evaluation of Educational Achievement; PIRLS: Progress in International Reading Literacy Study; TIMSS: Trends in International Mathematics and Science Study; PISA: Programme for International Student Assessment.
What is much harder to define is the knowledge produced by high-level committees and strategic advisory councils. Both are a mixture of scientific research, practice expertise, knowledge produced in public consultation meetings and inside political-administrative knowledge, processed into “advice”. Strictly speaking, this is not research evidence. However, for the development of policy, certainly in the Netherlands, this type of knowledge often proves crucial for moving forward. As Geoff Mulgan argues, policy making is highly dependent on policy makers’ capacity for synthesis (Mulgan, 2021[6]). In a way, advisory committees and advisory councils do part of the synthesising. To be clear, this is not a pure research evidence synthesis but an exercise in which different kinds of knowledge are combined and complemented with more value-driven advice.
Turning back to the question of appropriate research: in a critical appraisal of the knowledge system around the curriculum, the Scientific Curriculum Committee identified several gaps. According to the committee, the evaluation of both the intended and the realised curriculum was limited, i.e. little is known about the way teachers implement curriculum goals and materials in classrooms. The inspectorate provides some insights, but there is room for improvement. Furthermore, the committee observed that teachers’ experiences with the curriculum are not gathered and analysed systematically (SCC, 2022[4]).
Most interestingly, the use of the different types of knowledge shifted over time. As described above, this dynamic was mainly caused by a shift in the political-public discourse on education: roughly speaking, a shift from 21st century skills to basic skills. In the first stage of the curriculum revision process, explorative and foresight types of knowledge prevailed, while in the later stages, inspection research and international comparative research dominated the discussions.
How did the curriculum team engage with knowledge from these different sources? Four observations can be made.
The first is that (research) knowledge was often used pragmatically, with a sharp eye for the political-administrative context and for the situation and opinions of schools and teachers. Politics and (research) knowledge are closely intertwined, and it is hard to say what comes first, the policy or the knowledge. A clear, linear interaction between knowledge and policy is non-existent.
Second, specific pieces of analysis were applied more often than assembled bodies of research.
Third, as a consequence, the use of knowledge was quite fragmented, both at any given moment and over time. As argued above, different types of knowledge dominated policy preparations and political and public discussions at different stages of the curriculum revision process, and mostly single pieces of evidence were referred to rather than assemblies of various research pieces. And although one of the team members came from the Knowledge Unit, and there were occasional presentations on research, overall the use of research was more intuitive and implicit, dependent on and mingled with the (tacit) knowledge of individual team members.
Fourth, from 2020, the Temporary Scientific Curriculum Committee was explicitly tasked with advising the government on a scientific basis. The committee produced several thematic reports that were used in policy development, among other things in developing the overarching framework for curriculum development. As described above, it could be argued that the synthesis of research was outsourced to the committee: within the ministry, the committee’s analysis and recommendations were discussed intensively and incorporated into policy. Both the advice and its use in policy were reported to parliament.
What have we learnt from reconstructing the use of evidence in the curriculum revision process? First, using the QURE framework deepened our insight and led to a more nuanced view of how different types of knowledge were applied during the process. Reflecting on the questions of whether the knowledge was appropriate and whether the team engaged thoroughly with the evidence proved particularly valuable. The reconstruction made us even more aware of the dynamics over time: different types, and even more specifically different pieces, of knowledge were prominent in different phases of the revision process.
Additionally, the framework showed us that a structure for collating and appraising research and knowledge was lacking. The Scientific Curriculum Committee observed this as well in its advice on a system for periodic curriculum maintenance. The committee concluded that while various players in the curriculum system carry out relevant research, a “standardised working process with regular intervals of monitoring, analysis, evaluation, and decision making” is missing (SCC, 2022, p. 18[4]). This leads to a lack of “effective knowledge circulation and collaborative use of information” (SCC, 2022, p. 17[4]). There was no structure or mechanism in the ministry for systematically gathering, accumulating and weighing all the relevant pieces of knowledge and explicitly judging what they mean for policy.
Recently, this lack of structure within the ministry has been taken up in several policy projects. For example, in the National Programme for Education, a large COVID-19 recovery programme, progress reports are sent to parliament twice a year. To prepare each report, all the policy makers involved gather and present the state of play on their topic based on the most recent research. Furthermore, the Knowledge Unit is aiming for a series of collaborative evidence appraisal meetings to promote the uptake of policy evaluation and research. Evidence appraisal meetings should combine two functions: 1) producing an evidence synthesis; and 2) holding a well-structured dialogue about the implications for policy. Most important is that these meetings lead to a “genuine exchange” with evidence and with researchers (Knight and Lyall, 2013, p. 310[7]). “Having the opportunity to discuss research helps practitioners (in the context of policy, this would be policy makers) gain a deeper understanding and sense of ownership of the findings, and in doing so, enables evidence to be integrated more relevantly and sensitively in professional settings”, as Sharples observes in his account of what it takes to make evidence relevant for the frontline of social services (Sharples, 2013, p. 18[8]). What should happen at these meetings might be called a “percolation” of evidence into policy making: the use of evidence is not to be understood mechanically, as the consequence of a single study or even a body of knowledge, but more indirectly, as a shift in thinking about issues based on “lengthy interaction rather than one-way conversation” (Brown, 2012, p. 457[9]).
The reconstruction of evidence use also underlined the need to invest in strategic human resources policies. In the curriculum revision process, the team was highly experienced and combined different types of expertise and specialist knowledge. But this is not always the case, and due to high turnover, it is hard to build and preserve deep knowledge about policies, which complicates the composition of well-balanced teams. Moreover, as argued above, human resource policy in the Dutch civil service has been characterised by a focus on generalists for several decades. To promote the use of research, a shift is needed towards recruiting and developing more content specialists and knowledge workers. According to a recent report from the European Commission’s Joint Research Centre, a well-rounded policy maker is capable of innovative policy making; the researchers identify working with evidence as one of seven clusters of competences, alongside, for example, futures literacy and engagement with citizens and stakeholders (Schwendinger, Topp and Kovacs, 2022[10]). The Knowledge Unit is currently exploring, together with the Human Resources Management Department, how these insights can feed into training for policy makers, as well as into the ministry’s strategic human resources policy, e.g. the recruitment, development and career paths of policy makers.
Lastly, the reconstruction of evidence use in such a long policy process led us to the metaphor of the journey to describe and analyse the application of knowledge in policies. Evidence travels through policies and politics, so to speak, most of the time not in a very linear way but with many detours and hilly tracks. Following the journey of knowledge through policies yields more insights into how evidence is actually being used, and under which conditions, rather than only whether it is being used. This kind of research relies on “in-depth descriptions”, answering “questions about when, why, how and who finds what type of knowledge sound, timely, and relevant at different stages of the policy cycle” (Oliver, Lorenc and Innvaer, 2014, p. 8[11]). This chapter reported only on a first, cursory exploration of such an approach. We are planning to use it more often to enrich our understanding of how to strengthen the impact of research in education policy.
[3] AWTI (2021), State of Knowledge, Advisory Council for Science, Technology and Innovation, The Hague.
[9] Brown, C. (2012), “The ‘policy-preferences model’: A new perspective on how researchers can facilitate the take-up of evidence by educational policy makers”, Evidence & Policy: A Journal of Research, Debate and Practice, Vol. 8/4, pp. 455-472, https://doi.org/10.1332/174426412X660106.
[2] Inspectorate of Education (2022), The State of Education 2022, Ministry of Education, Culture and Science, The Hague, https://english.onderwijsinspectie.nl/documents/annual-reports/2022/04/28/state-of-education-2022.
[7] Knight, C. and C. Lyall (2013), “Knowledge brokers: The role of intermediaries in producing research impact”, Evidence & Policy: A Journal of Research, Debate and Practice, Vol. 9/3, pp. 309-316, https://doi.org/10.1332/174426413X671941.
[6] Mulgan, G. (2021), “The Synthesis Gap: Reducing the imbalance between advice and absorption in handling big challenges”, https://www.geoffmulgan.com/post/the-synthesis-gap-reducing-the-imbalance-between-advice-and-absorption-in-handling-big-challenges.
[5] Nieveen, N. et al. (2022), Perspectives on Curriculum Change: An overview study to provide insights for the Dutch context, https://www.nro.nl/ (accessed on 12 June 2023).
[11] Oliver, K., T. Lorenc and S. Innvaer (2014), “New directions in evidence-based policy research: A critical analysis of the literature”, Health Research Policy and Systems, Vol. 12/34, https://doi.org/10.1186/1478-4505-12-34.
[1] Rickinson, M. et al. (2020), Using Evidence Better: Quality Use of Research Evidence Framework, Monash University, Victoria, Australia, http://monash.edu/education/research/projects/qproject (accessed on 25 October 2022).
[4] SCC (2022), Structure and Regularity, Scientific Curriculum Committee, The Hague.
[10] Schwendinger, F., L. Topp and V. Kovacs (2022), Competences for Policymaking – Competence Frameworks for Policymakers and Researchers Working on Public Policy, JRC Science for Policy Report, Publications Office of the European Union, Luxembourg, https://doi.org/10.2760/642121.
[8] Sharples, J. (2013), Evidence for the Frontline: A Report for the Alliance for Useful Evidence, Alliance for Useful Evidence, London.