Nóra Révai
OECD
Who Really Cares about Using Education Research in Policy and Practice?
6. Engaging with research to understand research use: The value of evidence use journeys
Abstract
This chapter shows how the analysis of evidence use journeys can bring us closer to a deep understanding of systematic and high-quality evidence use and support us in identifying factors that enable the development of a culture of research engagement in policy organisations and processes. It does so by providing a comparative analysis of the Dutch and Flemish evidence use journeys presented in Chapters 4 and 5. The chapter explores how the two frameworks and corresponding analyses enable understanding evidence use. It then describes the Dutch and Flemish analyses as policy action research and concludes with reflections on the exercise of conducting self-reflective evidence use journeys.
Introduction
Developing a culture of research engagement requires understanding what it means to use evidence “systematically and well”. A group of civil servants from the Netherlands and Flanders1 (Belgium) decided to explore this by investigating the use of evidence in two policy processes: curriculum revision and introducing standardised testing, respectively. The two systems chose to engage in an honest self-inspection and – accepting their vulnerability – to ask peers from three countries (Finland, Ireland and Norway) to help them reflect. They each hosted a learning seminar co-organised with the OECD as a platform for reflection.
The two systems also drew on research to guide their reflections, each applying a conceptual framework to analyse its evidence use. The Netherlands used the Quality Use of Research Evidence (QURE) framework developed by Mark Rickinson and his team at Monash University in Australia to understand how research can be used well in schools (Rickinson et al., 2022[1]). Flanders used the OECD/CERI Strategic Education Governance (SEG) framework developed by Claire Shewbridge and Florian Köster (Shewbridge and Köster, 2019[2]), building on Tracey Burns’ earlier work on governing complex education systems (Burns and Köster, 2016[3]). The term “evidence use journeys” was coined by Rien Rouw for these self-reflective analyses, which draw on the peer reflections generated by participants in the two learning seminars. The metaphor reflects the fact that using evidence in policy is not a linear and straightforward process.
This chapter compares the two analyses and explores the affordances of each framework: what kinds of reflections do they enable with respect to understanding evidence use and what are their limitations? It aims to show how the analysis of evidence use journeys can bring us closer to a deep understanding of systematic and high-quality evidence use and support us in identifying factors that enable developing a culture of research engagement in policy organisations and processes. The chapter also reflects on the exercise of conducting self-reflective evidence use journeys.
Context and landscape
The overview of the Dutch and Flemish education systems reveals many similarities, which make the two evidence use journeys somewhat more comparable. The descriptions of the context in which each reform took place highlight the strongly politicised nature of policy making. In both systems, reform was motivated by strong signals that standards were dropping (e.g. as measured by international student assessments).
However, several differences can be observed in the way the two cases present the context and the landscape. First, the timelines and processes of the respective reforms were very different. In the Netherlands, the process has been under way for almost a decade (since 2014), albeit with irregular intensity and a changing focus, and it started with a participatory approach. In contrast, the starting point of the Flemish case description is the governmental decision to introduce standardised tests in 2019.
Second, the presentation of the landscape of actors differs. The Dutch case describes a wide scope of actors, including central government, government-funded independent bodies, research institutions and practice-oriented organisations. The Flemish case focuses on central actors and presents the relevant units of the ministry; other actors, such as research centres, umbrella organisations of schools, and other bodies and organisations, appear later in the analysis.
Finally, the description of the reform context has different emphases. The Dutch case places a greater emphasis on the political process, referencing major political decisions and decision makers’ names. The Flemish description centres more on the context of education policy – the drivers of the reform and the main determining contextual elements, such as the culture of using tests and accountability – while the political process is present to a lesser extent.
Two frameworks, two tales
To identify the respective benefits and limitations of each conceptual framework in better understanding evidence use in policy, this section compares the two analyses with respect to key components of both frameworks.
The similarities and differences between the two frameworks are mirrored in the respective analyses. First, each framework has a clear purpose, which provided the authors with a clear structure for conducting their analysis of evidence use. However, the different purposes of the two frameworks resulted in different sorts of analyses. The QURE framework is the better fit, given that it aims to support reflections on the quality of research use, which is precisely the purpose of the evidence use journey analysis. The Dutch analysis could rely on the descriptions of its dimensions and apply them directly. The SEG framework has a broader scope to support education governance, with knowledge governance being the dimension that directly relates to the purpose of this analysis. The Flemish analysis adapted three additional dimensions – capacity building, whole-of-system perspective and stakeholder involvement – by narrowing their broad focus to evidence use. This adaptation appeared to be easy for capacity building but perhaps slightly less straightforward for the other two dimensions. Accountability, a key piece of the governance perspective on introducing standardised tests, was addressed in the analysis as a factor influencing research use; in the QURE framework, it would have been labelled a systemic influence. Strategic thinking remains implicit in the analysis, but its descriptors are present: for example, crafting and consolidating a system vision, and balancing short- and long-term priorities, both influenced decisions and the use of research knowledge.
Second, the context for which the frameworks were originally developed influenced the required extent and nature of adaptation. The QURE framework was developed for the context of practice (schools), while the SEG framework was developed for policy. Adapting the former for policy in the Dutch analysis seemed straightforward: the framework’s key dimensions and elements could be easily interpreted for a policy process and the ministry. However, the SEG framework captures the policy context better than the QURE framework does. This resulted in more concrete examples of the challenges of balancing knowledge of context with research evidence in the Flemish evidence use journey. Third, the overlaps and differences in the content of the frameworks are reflected in the structure and focus of the two analyses. The sections below elaborate on the key aspects of content.
Evidence and other types of knowledge: Determinants of thoughtful engagement
Both accounts include a reflection on the evidence itself, the types of knowledge used and how knowledge influences decisions. In the Dutch case, this is prompted by the QURE framework’s two core elements: appropriate research evidence and thoughtful engagement with it. In the Flemish case, it is stimulated by the SEG framework’s knowledge governance dimension.
The analyses are in many ways similar and, unsurprisingly, very much in line with recent research on evidence use in policy. They present how different sources of knowledge interact in complex ways in policy making. According to both accounts, policy decisions often follow a pragmatic approach in which evidence, context, interests and values are combined.
However, they each highlight a few distinct features of evidence journeys. The Dutch case emphasises the difficulty of tracing the use of research evidence: what reaches policy makers and politicians are often reports that combine different sources of knowledge, making it hard to know if and to what extent a particular decision drew on research evidence. In addition, the Dutch analysis notes the varying uses of different sources of knowledge over time. This is more an original idea of the authors than a direct result of the QURE framework, which does not naturally lend itself to analysing change over time. The Flemish case demonstrates the balancing act between research and contextual knowledge through a few concrete examples. It also points to the fact that research itself is often value- or even belief‑driven (Cairney, 2019[4]).
While both frameworks stimulated a rich discussion on the complexity of knowledge use, the QURE framework’s concept of thoughtful engagement also generated a reflection on the quality of evidence use. A common conclusion was the need for better evidence synthesis.
Capacity building
Capacity building was a common element explored in both analyses with respect to the skills policy makers need to engage thoughtfully with research evidence. Both analyses reflect on the collective skills of teams rather than just the skills of individual policy makers. In the Dutch analysis, this is an addition to what the QURE framework explicitly says, as skills and mindset are there labelled as “individual enablers” (Rickinson et al., 2022[1]). Owing to its system focus, the SEG framework takes a broader perspective, in which capacity relates not only to individuals but also to organisations and the system, and it specifically calls for horizontal capacity building. This prompted the Flemish case to also discuss the skills of researchers and other stakeholders. It is also worth noting that both analyses go beyond research literacy skills and point to more complex competences of policy makers: sensitivity to different perspectives and inquisitiveness in the Dutch case; communication, boundary spanning and political advisory skills in the Flemish case. Importantly, both descriptions refer to a new civil service competence framework recently developed by the European Commission’s Joint Research Centre.
Stakeholder engagement
Both accounts describe stakeholder engagement as a key component. In the Flemish case, this is a direct consequence of the SEG framework’s corresponding dimension and is discussed with respect to the intensity and nature of involvement in the different stages of the policy process, along with the associated tensions and challenges. The Dutch analysis discusses it in less detail and in a cross-cutting manner: as part of the context, skills and conclusion. Both cases dedicate a section of the conclusion to the collective appraisal of evidence by stakeholders. This method, drawn from the health sector literature on evidence-informed deliberative dialogues, was presented in the learning seminars to explore its feasibility for education policy making (Box 6.1). Flanders concluded that stakeholders’ involvement could benefit from a more structured approach throughout the whole policy process: one which allows them to engage with evidence, reflect on their own preconceptions and contribute their consolidated deliberations. The Netherlands has so far focused more on the dialogue between researchers and policy makers and set out a plan for a series of collaborative evidence appraisal meetings on the theme of policy evaluation.
Box 6.1. Collective evidence appraisal: An example of deliberative stakeholder engagement from the health sector
Evidence-informed deliberative dialogues are a stakeholder platform used mainly in the healthcare sector. They are a group of approaches for structuring conversations between stakeholders to discuss the best available evidence on an identified topic and inform policy making in a controlled way. They share conceptual foundations with other deliberative democratic methods (e.g. community panels, citizen assemblies) where “a broadly representative body of people weighs evidence, deliberates to find common ground and develops detailed recommendations on policy issues for public authorities” (OECD, 2020, p. 195[5]).
These approaches have been used in a variety of healthcare systems to: determine funding and eligibility for treatments, identify and assess new technology, optimise service provision, design and implement visions and strategies, inform the content of professional development and modernise workforce training, oversight and planning (The ASTUTE Health study group, 2014[6]; Oortwijn, Jansen and Baltussen, 2021[7]; McMaster Health Forum, 2021[8]).
In deliberative approaches, the definition of evidence is broader than just research evidence. Culyer and Lomas (2006[9]) classify evidence into three types: 1) context-free scientific; 2) context-sensitive scientific; and 3) colloquial. Colloquial evidence is informal evidence from experts, professionals, lobby groups, etc. and provides context to scientific evidence in healthcare policy making (Sharma et al., 2015[10]).
How do they work?
To maximise their effectiveness, deliberative dialogues should:
be informed by pre-circulated packaged evidence summaries
ensure fair representation among policy makers, stakeholders who could be affected by the outcome and researchers
engage one or more skilled facilitators to assist with the deliberations
allow for frank, off-the-record deliberations by following the Chatham House rule
not aim for consensus (Boyko, Lavis and Dobbins, 2014[11]).
Impact on decision making and participants
With respect to their impact on decision making, deliberative processes have been found to:
enhance the legitimacy of policy design based on deliberation between stakeholders to identify how values can be combined with evidence to arrive at a decision (Oortwijn, Jansen and Baltussen, 2021[7])
facilitate discussions of evidence between stakeholders on high-stakes topics
inform ethical, accountable policy decisions in highly emotive or politicised policy areas (The ASTUTE Health study group, 2014[6]).
Positive impacts on participants include:
the acquisition of new knowledge by participants, a stronger culture of research use within stakeholders’ organisations and concrete actions aimed at implementing recommendations emerging from deliberative dialogues (Moat et al., 2014[12]; Ridde and Dagenais, 2017[13])
a greater self-reported likelihood of using research in their own organisations immediately following such dialogues (Lavis, Boyko and Gauvin, 2014[14])
improved stakeholder involvement in, and satisfaction with, strategic planning processes (Moat et al., 2014[12]).
Culture and infrastructure
As a result of the QURE framework’s explicit focus on culture, the Dutch analysis included an appraisal of the existing culture of evidence use within the ministry. Beyond the actual analysis, it is worth noting that terms such as “knowledge infrastructure” and “knowledge ecosystem”, as well as the generally strongly self-reflective nature of the Dutch case, suggest that the Netherlands has been investing in the development of a research use culture. In comparison, the Flemish case does not have an explicit discussion on culture, leadership or infrastructure, which may be the result of the different focus of the SEG framework.
Systems perspectives
Evidence use journeys are influenced by systemic factors and need to be interpreted as part of the education system. A systems perspective is explicit in both frameworks and is thus included in both accounts. In the QURE framework, it is present as system-level influences, and the Dutch case analyses the ways in which the political discourse influenced the curriculum revision process – an interesting adaptation of the QURE framework for the context of policy. The whole-of-system dimension of the SEG framework calls for developing synergies across the system and overcoming systemic inertia; accordingly, the Flemish case discusses the ways in which different policy processes are interconnected. Beyond the frameworks’ explicit dimensions, a systems perspective is also present in the description of the respective contexts in both analyses, including political developments that posed challenges to the reform process, influential political leaders, the political discourse and the influence of the media.
Policy action research
The two evidence use journeys demonstrate the way in which research can directly assist policy making. More specifically, in these cases, conceptual research – two frameworks – combined with policy makers’ reflection on policy making contributed to improving “the practice of policy making”. This is apparent in the Dutch case, where the ministry has started to put the conclusions of the analysis into action: it has been exploring human resource policies in the ministry and designing structures and processes to support a more systematic and higher quality use of research.
In fact, we could call these analyses policy action research or collaborative enquiry in policy. Action research can be defined as a “systematic process of practitioner problem posing and problem solving” (Kuhne and Quigley, 1997, p. 23[15]). In the context of teaching practice, its main goal is to better understand teaching and learning-related problems and improve practice. This was exactly the driver of the evidence use journeys for policy: the problem being the fragmented use of research evidence and the desire to improve it. A more recent adaptation of action research is collaborative enquiry, in which teams of teachers explore and answer questions about their professional practice (Townsend and Adams, 2014[16]). It has been described as “a process of knowledge generation, occurring when researcher and practitioner knowledge meet in particular sites, aimed at producing new knowledge about ways in which broad values might better be realised in future practice” (Ainscow et al., 2016, p. 10[17]). Again, something very similar happened in these cases: a team of policy makers, mostly (although not exclusively) from the respective knowledge units of ministries, explored a question about their own policy-making practice. They mobilised research knowledge and their knowledge of their own context and created new knowledge: specifically about ways in which evidence can be used more systematically and better.
Action research has been extensively criticised for a lack of rigour and labelled by many as low-quality research. However, action research can also be considered as engagement with research rather than as research production. In this sense, the primary goal is to improve practice – which in this case is policy – not necessarily to add to the body of academic research knowledge. Collaborative enquiry may then be a better term to avoid associations with academic research production. Research engagement is apparent in both cases and goes beyond using the two frameworks; examples include using the literature on knowledge mobilisation and complexity science as well as recent developments, such as the application of the Joint Research Centre’s competence frameworks. Furlong and Oancea (2005[18]) underline that the quality of applied and practice-based research should be assessed with respect to social and economic robustness in addition to scientific robustness (Table 6.1).
Table 6.1. Dimensions and sub-dimensions of applied and practice-based research quality
| Scientific robustness | Social and economic robustness | | |
|---|---|---|---|
| Epistemic: Methodological and theoretical robustness | Technological | Capacity development and value for people | Economic |
| Trustworthiness | Purposefulness | Plausibility | Marketability and competitiveness |
| Builds on what is known + contributes to knowledge | Salience/timeliness | Partnership, collaboration and engagement | Cost-effectiveness |
| Explicitness | Specificity and accessibility | Reflexivity, deliberation and criticism | Auditability |
| Propriety | Concern for enabling impact | Receptiveness | Feasibility |
| Paradigm-dependent criteria | Flexibility and operationalisability | Transformation and personal growth | Originality |
Source: Adapted from Furlong and Oancea (2005, p. 15[18]), Assessing Quality in Applied and Practice-based Educational Research.
Clearly, many of the criteria for social and economic robustness are met by the two evidence use journeys. They have a clear purpose and are salient, timely, specific and accessible. They enable capacity development and are of clear value to policy makers and policy organisations, and possibly also to researchers and other actors. They seem to indicate personal growth (of the colleagues involved in the analysis as well as of the participants of the learning seminars) and a potential for transformation. The economic criteria may be more difficult to assess in the short term.
In addition to their social value and quality, the two evidence use journeys also have the potential to contribute to research itself. The application of the respective frameworks highlighted different affordances and limitations. These could drive further refinement of each framework and possibly the development of a QURE framework for policy. For instance, the use of different types of knowledge over time and throughout the stages of a policy process in the Dutch analysis may prompt the development of a more dynamic tool than a static framework. Both analyses highlighted nuances that may motivate further specification of some of the dimensions and descriptors. This demonstrates that while the primary goal of applied research is improving practice, it can also contribute to developing theoretical knowledge (Furlong and Oancea, 2005[18]).
Conclusions: The value of self-reflective evidence use journeys
This chapter is a meta-analysis, not in the research sense but in the everyday one: it provides a comparative analysis of two analyses. In addition, it reflects on the process and value of countries’ self‑analysis of evidence use journeys.
Evidence use in policy making is complex, but there are concrete ways to improve it
Both analyses demonstrated the complexity of knowledge governance and evidence use, showcasing the ways in which different sources of knowledge, political values and stakeholders’ interests interact (see also Garrison, 2005[19]). Nevertheless, they brought greater clarity to what thoughtful engagement with research evidence may mean in policy making. They also proposed concrete ways in which this can be improved:
They both call for more evidence synthesis, noting that this was largely missing in both contexts. Evidence synthesis requires adequate processes that provide relevant insights in a timely manner.
They suggest that, similarly to the health and other sectors, a collective appraisal of evidence by stakeholders can increase quality research use in education policy. It can also enable a more meaningful and systematic integration of stakeholders’ professional and contextual knowledge in policy decisions.
Collective capacity development within policy organisations to build research engagement skills and other competences that are necessary to bridge evidence and policy is key. Such competences should be identified and integrated into human resources policies (e.g. recruitment, development, building policy teams for a specific process).
Guided self-reflection is a valuable complement to policy advice
Finally, we would like to reflect on the nature of this guided, self-reflective analytical exercise as opposed to more standard policy advice. Consultancies and international organisations generally support countries and specific policies through policy reviews and advice. This typically includes reviewing the evidence, investigating the country’s context and policy, conducting desk research, collecting data (e.g. through surveys, interviews), and analysing all of this to draw recommendations.
In this case, however, the process, which originated in the learning seminars, was country-driven. The OECD provided insights from research, facilitated discussions at the learning seminars to enrich the Dutch and Flemish self-reflection with peer feedback, and supported the framing of the analysis. To mirror and further strengthen the countries’ self-reflective journeys, instead of giving them direct advice, we asked the Netherlands and Flanders to write their own analyses, and both countries came up with their own recommendations. As opposed to policy advice, this sort of “policy coaching” builds on countries’ willingness to change and supports policy action research as a tool for improvement. While a neutral, external perspective through traditional policy review and advice remains important to countries, it could be complemented with guided self-reflection (or policy coaching) approaches.
References
[17] Ainscow, M. et al. (2016), “Using collaborative inquiry to foster equity within school systems: opportunities and barriers”, School Effectiveness and School Improvement, Vol. 27/1, pp. 7-23, https://doi.org/10.1080/09243453.2014.939591.
[11] Boyko, J., J. Lavis and M. Dobbins (2014), “Deliberative dialogues as a strategy for system-level knowledge translation and exchange”, Healthcare Policy, Vol. 9/4, pp. 122-131, https://doi.org/10.12927/hcpol.2014.23808.
[3] Burns, T. and F. Köster (eds.) (2016), Governing Education in a Complex World, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/9789264255364-en.
[4] Cairney, P. (2019), “Evidence and policy making”, in Boaz, A. et al. (eds.), What Works Now? Evidence-informed Policy and Practice, Policy Press.
[9] Culyer, A. and J. Lomas (2006), “Deliberative processes and evidence-informed decision making in healthcare: Do they work and how might we know?”, Evidence & Policy, Vol. 2/3, pp. 357-371, https://doi.org/10.1332/174426406778023658.
[18] Furlong, J. and A. Oancea (2005), Assessing Quality in Applied and Practice-based Educational Research: A Framework for Discussion, Department of Educational Studies, Oxford University.
[19] Garrison, M. (2005), Follow that Egg!, https://www.southparkstudios.com/video-clips/8bncpu/south-park-scientific-unbiased-study.
[15] Kuhne, G. and B. Quigley (1997), “Understanding and using action research in practice settings”, in Creating Practical Knowledge Through Action Research: Posing Problems, Solving Problems, and Improving Daily Practice, Jossey-Bass, https://doi.org/10.1002/ace.7302.
[14] Lavis, J., J. Boyko and F. Gauvin (2014), “Evaluating deliberative dialogues focussed on healthy public policy”, BMC Public Health, Vol. 14/1, https://doi.org/10.1186/1471-2458-14-1287.
[8] McMaster Health Forum (2021), “Products of stakeholder dialogues and citizen panels”, McMaster Health Forum, Ontario, https://www.mcmasterforum.org/about-us/products (accessed on 15 November 2022).
[12] Moat, K. et al. (2014), “Evidence briefs and deliberative dialogues: Perceptions and intentions to act on what was learnt”, Bulletin of the World Health Organization, Vol. 92/1, pp. 20-28, https://doi.org/10.2471/BLT.12.116806.
[5] OECD (2020), Innovative Citizen Participation and New Democratic Institutions: Catching the Deliberative Wave, OECD Publishing, Paris, https://doi.org/10.1787/339306da-en.
[7] Oortwijn, W., M. Jansen and R. Baltussen (2021), “Evidence-informed deliberative processes for health benefit package design – Part II: A practical guide”, International Journal of Health Policy and Management, Vol. 11/10, pp. 2327-2336, https://doi.org/10.34172/ijhpm.2021.159.
[1] Rickinson, M. et al. (2022), “A framework for understanding the quality of evidence use in education”, Educational Research, Vol. 64/2, pp. 133-158, https://doi.org/10.1080/00131881.2022.2054452.
[13] Ridde, V. and C. Dagenais (2017), “What we have learnt (so far) about deliberative dialogue for evidence-based policymaking in West Africa”, BMJ Global Health, Vol. 2/4, p. e000432, https://doi.org/10.1136/bmjgh-2017-000432.
[10] Sharma, T. et al. (2015), “Evidence informed decision making: The use of ‘colloquial evidence’ at NICE”, International Journal of Technology Assessment in Health Care, Vol. 31/3, pp. 138-146, https://doi.org/10.1017/s0266462314000749.
[2] Shewbridge, C. and F. Köster (2019), “Strategic education governance – Project plan and Organisational framework”, OECD, Paris, https://www.oecd.org/education/ceri/SEG-Project-Plan-org-framework.pdf.
[6] The ASTUTE Health study group (2014), “Disinvestment policy and the public funding of assisted reproductive technologies: Outcomes of deliberative engagements with three key stakeholder groups”, BMC Health Services Research, Vol. 14/1, https://doi.org/10.1186/1472-6963-14-204.
[16] Townsend, D. and P. Adams (2014), “From action research to collaborative inquiry”, Education Canada, https://www.edcan.ca/articles/from-action-research-to-collaborative-inquiry/ (accessed on 30 November 2019).
Note
1. In this chapter, the Flemish community of Belgium will be referred to as Flanders for simplicity.