Jordan Hill
OECD
José Manuel Torres
OECD
This chapter describes the characteristics of a culture of research engagement in policy and practice across OECD countries. Based on a review of the available literature and an analysis of the results of the OECD’s Strengthening the Impact of Education Research policy survey, the chapter proposes a set of dimensions for analysing the reported culture across respondent systems and the skills that support it. The chapter then assesses the resources and learning opportunities that can promote research engagement within each participating system. It illustrates promising practices through several country cases and concludes with recommendations for education systems seeking to strengthen their research engagement culture.
Work on evidence-informed policy and practice over the past two decades has increasingly recognised the importance of cultural factors in mobilising knowledge (Powell, Davies and Nutley, 2017[1]; Haynes et al., 2020[2]). This is no coincidence. Growing decentralisation and school autonomy mean there are now more actors with decision-making power in OECD education systems, and increasing numbers of relationships and interactions (Golden, 2020[3]). These relationships and interactions are shaped by a diverse array of organisational cultures. These cultures shape, and are shaped by, the skills that individuals have, which can be strengthened through individual and collective professional learning opportunities. Professional learning that is informed by internal and external evidence can promote a decision-making culture that is more likely to improve education quality and, ultimately, student learning. Despite this strong basis for developing a culture of research engagement, data from the Strengthening the Impact of Education Research policy survey suggest that key systemic enablers are still missing in many education systems.
Almost a quarter of a century ago, Davenport and Prusak (1998[4]) considered developing a culture of research engagement as one of the main objectives of knowledge mobilisation within an organisation. Yet today, the relational approaches and systematic supports required to build and maintain such a culture remain elusive in policy making (Oliver et al., 2022[5]). Similarly, in education practice, meaningful capacity for research engagement depends on the consistent presence of a culture supporting and incentivising research-informed practice (Godfrey and Brown, 2018[6]). Identifying opportunities for change can be challenging, since people shape organisational culture, but this same culture affects, in turn, people’s behaviours, attitudes and beliefs (Belkhodja et al., 2007[7]).
Social interactions, processes and contexts (henceforth “social processes”) influence individual behaviours related to research engagement. They help organisations integrate evidence into their activities, stimulate professional learning, and are an important building block of a research engagement culture for both practice and policy. As Oliver and Boaz (2021[8]) argue: “mobilisation takes people, not papers”. Connections between individuals, bodies of knowledge and experience are often forged through people coming together, or the work that results from this. OECD work has found that giving decision makers time to discuss the implications of research cultivates a shared understanding of what constitutes fit-for-purpose evidence (Köster, Shewbridge and Krämer, 2020[9]). Interventions that systematically promote interactions between decision makers and researchers have been found to be among the most effective for increasing evidence use (Langer, Tripney and Gough, 2016[10]).
When it comes to engaging with research in education practice, Cain (2019[11]) argues that, because learning itself is a social process, research use is too. A strong culture of research engagement is one in which people engage with research but also help others to engage with it. Social contexts promoting collective discussion, observation, reading and thinking about research can contextualise evidence within a professional’s experience. Recent work in the Austrian context found that nine in ten school leaders discussed with colleagues how to use evidence to enhance education quality, often through informal collegial exchanges (Köster, Shewbridge and Krämer, 2020[9]). Cultivating these social processes requires systematic enablers in the form of dedicated resources and learning opportunities.
A culture of thoughtful engagement with research is closely connected to organisational learning, which is stimulated through opportunities that encourage evaluative thinking and systematic attention to building individual skills, knowledge and attitudes (OECD, 2017[12]). Developing systematic learning opportunities that enable practitioners and policy makers to use evidence requires a well-developed vision of the school or ministry as a learning organisation. Kools and Stoll (2016[13]) define a learning organisation in education as one that places student learning at the centre and offers continuous learning opportunities for all staff. A learning organisation mobilises knowledge through supportive leadership, a culture of inquiry, innovation and exploration, and engagement with the wider education system. This definition mirrors research on organisational learning in healthcare, where there is consistent evidence that providing a safe space for risk-taking and experimentation is crucial for effective and safe healthcare delivery and for promoting organisational learning (Grailey et al., 2021[14]).
Allocating adequate human and financial resources that provide dedicated time and space for social processes to flourish is crucial for developing a research engagement culture. Resources underpin collaborative learning and allow for needs-based knowledge creation linked to professional ideas and targeted outcomes (Godfrey, 2017[15]). But resources need to be tied together by a structured approach: one taken intentionally, deploying systematic mechanisms to consciously achieve clear goals (Langer, Tripney and Gough, 2016[10]). Structural factors such as these can make the difference between using research and using research well (Rickinson et al., 2022[16]).
This chapter will present data on culture, skills, resources and learning opportunities from the Strengthening the Impact of Education Research policy survey1. The survey – conducted from June to September 2021 – collected data at the national or sub-national (state, province, canton, etc.) level from 37 education systems in 29 countries. It focused on the actors, mechanisms and relationships that facilitate the use of research in policy making and in practice, as well as the production of research (see more information in Box 1.1, in Chapter 1). Its first results and analyses were presented in our previous report (OECD, 2022[17]). The survey gathered the perceptions of ministries of education about policy makers’ and practitioners’ individual attributes generally. Naturally, this general impression of individuals most likely hides a significant degree of individual heterogeneity within systems.
This chapter structures the analysis along three questions:
What are the characteristics of a culture of research engagement in education systems?
To what extent are systemic enablers of a culture of research engagement present in education systems?
In what ways are culture, skills and learning opportunities connected?
The first part of the analysis is anchored in two questions asked in the Strengthening the Impact of Education Research policy survey. The first question asked ministries of education about the culture and mindset (hereafter referred to as “culture”) of research engagement among policy makers and practitioners. The second asked about policy makers’ and practitioners’ skills and capacity (hereafter referred to as “skills”) to use research.
The second part of the analysis is centred on additional survey questions on two enablers of culture and skills of practitioners and policy makers in education systems: 1) dedicated resources; and 2) policy makers’ and teachers’ learning opportunities.
The third and final section looks at the statistical relationships between these factors, to get a fuller picture of the perceptions of respondent ministries. Additional data are also presented throughout. These relate to relevant barriers, mechanisms and levels of ministries’ satisfaction with the extent to which policy makers and practitioners engage with research.
The policy survey asked to what extent ministries agreed with nine statements explicitly linked to the theme of research engagement culture, summarised in Figure 3.1. The survey measured these perceptions on a 5-point Likert scale, ranging from “Strongly disagree” to “Strongly agree”.
Drawing on the knowledge mobilisation literature, each of the statements has been grouped under the following dimensions:2
Motivation: Motivation is regarded as a key component of behaviour change, alongside capability and opportunity (Michie, van Stralen and West, 2011[18]). Organisational norms and ethos form an important social influence that affects an individual’s extrinsic motivation to use research (Rickinson et al., 2020[19]). These include both social (expectations of others) and contextual (political will) pressures. In addition to extrinsic motivation, the extent to which an individual values research is an intrinsic motivation. Intrinsic motivation is an important driver of professional improvement and skill development (Dysvik and Kuvaas, 2010[20]). When it comes to education practice, intrinsic motivation is associated with involvement in professional development, decision making and the quality of instructional practices (Guerriero, 2017[21]).
Willingness: Disposition to implement changes based on research indicates a positive attitude towards thoughtful engagement with research evidence, influenced by values and beliefs (Rickinson et al., 2020[19]). Without a willingness to solve problems and collaborate, even the most elaborate administrative solutions will fail to bring any lasting change (Sitra, 2018[22]). Of course, willingness can come from an intrinsic belief that using research is the “right” approach, or from extrinsic incentives to behave in a certain way. It is beyond the scope of this analysis to discuss in detail the authenticity of reported willingness, but it remains an important area for further study.
Relationships: Trust and mutual understanding between and within different communities are important outcomes of relationships that promote quality research engagement (Rickinson et al., 2020[19]). More broadly, trust in research itself enhances the credibility of the findings, which can be both a product and a cause of interpersonal relationships between the research community and policy makers/practitioners (Gu et al., 2021[23]). This has long been recognised as crucial for implementing evidence-based interventions in the health sector (Lanham et al., 2009[24]; Albers et al., 2021[25]). Interactions between decision makers are an important route to achieving a shared understanding of which evidence is most appropriate and can contribute to building professional standards of what fit-for-purpose evidence looks like (Köster, Shewbridge and Krämer, 2020[9]). The components of this dimension speak to the quality of relationships between policy makers/practitioners and researchers.
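To make the grouping concrete, the sketch below shows how statement-level Likert ratings could be averaged into scores for these three dimensions. It is a minimal illustration: the item names and ratings are hypothetical, not actual survey data, and the survey’s own aggregation may differ.

```python
# Minimal sketch: aggregating statement-level Likert ratings
# (1 = "Strongly disagree" to 5 = "Strongly agree") into dimension
# scores. Item names and ratings are hypothetical, for illustration only.
from statistics import mean

DIMENSIONS = {
    "motivation": ["research_is_valued", "expectation_to_use", "political_will"],
    "willingness": ["willing_to_learn", "willing_to_challenge_notions"],
    "relationships": ["trust_in_research", "mutual_understanding"],
}

# One (fictional) system's rating for each statement.
responses = {
    "research_is_valued": 4, "expectation_to_use": 3, "political_will": 4,
    "willing_to_learn": 4, "willing_to_challenge_notions": 2,
    "trust_in_research": 3, "mutual_understanding": 2,
}

# A dimension score is the mean of the ratings of its statements.
dimension_scores = {
    dimension: round(mean(responses[item] for item in items), 2)
    for dimension, items in DIMENSIONS.items()
}
print(dimension_scores)  # {'motivation': 3.67, 'willingness': 3.0, 'relationships': 2.5}
```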
Overall, motivation is the strongest dimension of research engagement culture and relationships are the weakest across respondent systems. At the level of individual systems, ministries of education often reported policy makers’ and practitioners’ relationships and willingness being at similar levels. However, when it comes to motivation, ministries more often reported these to be different for policy makers and practitioners.
Ministries of education most commonly agreed with statements related to the motivation to use research among both policy makers and practitioners. It is particularly encouraging that most respondent systems reported that using education research is important for both policy makers and practitioners. In some systems, however, this indicator of intrinsic motivation is not accompanied by an expectation (i.e. political or social pressure) to use education research. For example, a diverse group of six systems (Belgium [Flemish Community], Colombia, Hungary, Lithuania, the Netherlands and South Africa) reported that, despite research engagement being important for policy makers, the systems lack a strong expectation to actually use research in policy making. A mix of intrinsic and extrinsic motivating factors is more likely to lead to greater engagement with research. This could mirror findings on the decision to enter the teaching profession, teachers’ general pedagogical knowledge and their professional development, where both intrinsic and extrinsic factors play a role (König, 2017[26]; Lauermann, 2017[27]). However, ascertaining such relationships would require a more complex design of indicators.
Five respondent systems agreed that there was an expectation to use research in policy making but did not agree that there was political will to do so. These were New Zealand, the Slovak Republic, Slovenia and Switzerland (Appenzell Ausserrhoden and Zurich). Interestingly, the systems that exhibited this distinction were entirely different for practice (Belgium [Flemish Community], Costa Rica, South Africa, Switzerland [Nidwalden] and the Republic of Türkiye). This suggests that the extent of political will to use research is not always the same in the policy and practice contexts within a system, providing evidence to support the hypothesis of Burns and Schuller (2022[28]) about differing sources of motivation for policy makers and practitioners to engage with research depending on the system context.
The extent of willingness to use research can make the difference between low- and high-quality use, as working with evidence requires a disposition to be open to a range of interpretations [Earl and Timperley (2009[29]) in Rickinson et al. (2020[19])]. Willingness to use research to challenge preconceived notions is a form of introspection and self-criticism, closely aligned with individual and organisational learning. More than half the ministries reported that practitioners were willing to learn and try new things when it comes to research use. However, some of those systems did not agree that practitioners were also willing to use research to challenge preconceived notions. This was the case in Colombia, Hungary, Portugal, Switzerland (Zurich) and Türkiye. Colombia and Portugal reported the same for policy makers. This may mean that some ministries of education perceive that willingness to learn new skills and methods exists on the condition that this new knowledge does not fundamentally challenge preconceived notions of what education practice or policy should be. It has long been recognised that a culture closed to challenging the status quo cannot promote the open-ended inquiry needed to use research (Sharp et al., 2006[30]).
Despite the importance attached to trusting relationships when it comes to engaging with research (OECD, 2007[31]; Mitton et al., 2007[32]; Ward, House and Hamer, 2009[33]; Rickinson et al., 2022[16]), levels of trust are low according to the survey responses. Less than half of the respondent systems agreed that both researchers and policy makers/practitioners have a shared understanding of education research and its use. Sharing a common understanding of research means developing agreement around which evidence is fit-for-purpose for which tasks and how it is best used in concrete situations (for example, designing professional development) (Köster, Shewbridge and Krämer, 2020[9]). Good relationships are an important way of encouraging a shared understanding of research, promoting the production of relevant research and ensuring that there are realistic expectations for its use. Respondent systems that did report mutual understanding generally also reported high levels of trust in other areas.
Implementing interventions targeting one or several dimensions of a culture of research engagement necessitates focusing on the resources and knowledge needed to support such a culture. In this vein, the city of Amsterdam recently launched an initiative supporting the relationships between researchers and practitioners and the motivation and knowledge of these actors to achieve an evidence-informed improvement of the local educational system. Box 3.1 outlines this initiative.
EducationLab is a Dutch research consortium launched in 2020 which aims to build infrastructure for experimental research and foster the use of scientific evidence in educational policy and practice. EducationLab supports two labs for policy (Teacher Lab and Language Lab) and the Knowledge Infrastructure for Primary Schools in Amsterdam (ONA) project for practice.
ONA kicked off in September 2022, supported by the municipality of Amsterdam, the association of school boards of primary and special education in Amsterdam (BBO), two of the city’s largest publicly funded research universities (VU Amsterdam and the University of Amsterdam) and Maastricht University. It aims to tackle two main challenges faced by primary education in Amsterdam:
1. declining performance and increasing illiteracy, especially among students with low-educated parents;
2. a qualitative and quantitative shortage of teachers and school leaders.
The project focuses on setting up a knowledge infrastructure for primary education that will enable scientists, primary teachers and school leaders, and (municipal) administrators to generate sound scientific knowledge and develop evidence-informed practices for the sustainable improvement and innovation of primary education in Amsterdam.
To achieve this, ONA focuses on:
Knowledge culture: Strengthening a learning culture in schools, with growing support and enthusiasm for evidence-informed working.
Knowledge use: Providing accessible and relevant knowledge about proven effective approaches in education, and supporting its sharing and use through professional development on evidence-informed approaches.
Knowledge creation: Promoting research into solutions for the biggest challenges in Amsterdam’s context.
In addition, ONA is setting up a replicable prototype of the knowledge infrastructure for primary education that can also be informative for other regions and education sectors with similar objectives.
Source: ONA (n.d.[34]), Educational Knowledge Network Amsterdam (ONA), https://ona.amsterdam.
The policy survey also asked ministries about a handful of barriers to using research in educational policy and practice. The core analysis of these featured in our first volume (OECD, 2022[17]), which identified three cultural barriers:
1. “lack of broader political will to use research” (policy only);
2. “lack of openness to new ideas from research”;
3. “lack of willingness to use research”.
Interestingly, these cultural barriers were the least commonly reported category of barrier, whereas organisational and structural barriers, such as a lack of time and a lack of dedicated mechanisms, were the most commonly reported in both policy and practice. In policy making, three-quarters of the cultural barriers reported in the whole sample came from ministries that also reported the lowest levels of satisfaction with the extent of research use (see Table 3.1). The remaining quarter came from ministries reporting the highest levels of satisfaction.
This clustering of cultural barriers at the extremes of both high and low satisfaction within the sample might indicate a heightened perception of the importance of culture in respondent systems reporting them.
Table 3.1. Rank assigned to cultural barriers to research use by systems and their average satisfaction with research use in policy, 2021

| Cultural barrier | Ranks reported by individual systems (1st–6th) |
|---|---|
| Lack of broader political will to use research | 1, 2, 2, 3, 6, 6, 3, 1, 3, 3, 1, 1 |
| Lack of organisational openness to new ideas from research | 2, 5, 4, 6, 3 |
| Policy makers’ lack of willingness to use research | 3, 2, 2 |

| Country/system | Average level of satisfaction with research use in policy |
|---|---|
| Finland | 5 |
| Chile | 5 |
| United States (Illinois) | 4 |
| Switzerland (Appenzell A.) | 4 |
| Canada (Saskatchewan) | 4 |
| Switzerland (Nidwalden) | 4 |
| Hungary | 4 |
| Switzerland (St. Gallen) | 4 |
| Canada (Quebec) | 4 |
| Austria | 3.8 |
| Norway | 3.8 |
| Türkiye | 3.8 |
| Switzerland (Lucerne) | 3.5 |
| Portugal | 3.5 |
| Switzerland (Obwalden) | 3.3 |
| Iceland | 3 |
| Sweden | 3 |
| Switzerland (Zurich) | 3 |
| Estonia | 3 |
| Colombia | 3 |
| New Zealand | 3 |
| Lithuania | 3 |
| United Kingdom (England) | 3 |
| Japan | 3 |
| Costa Rica | 3 |
| Denmark | 2.8 |
| Belgium (French Comm.) | 2.8 |
| Spain | 2.5 |
| Latvia | 2.5 |
| Slovenia | 2.5 |
| South Africa | 2.5 |
| Czech Republic | 2.5 |
| Belgium (Flemish Comm.) | 2.3 |
| Slovak Republic | 2.3 |
| Netherlands | 2 |
| Switzerland (Uri) | 1.5 |

Notes: The first panel lists the ranks (from 1st to 6th) that respondent systems assigned to each cultural barrier to increasing and improving the use of education research in policy making; each value corresponds to one respondent system. The second panel shows the average level of satisfaction of respondent systems with four aspects of research use in policy making ("The extent to which policy makers use research in policy processes", "The ways in which policy makers use research", "The ways in which policy makers access research", and "The extent to which policy makers evaluate the quality of research they use"), each measured on a 5-point Likert scale (from 1. "Not at all satisfied" to 5. "Highly satisfied"). Data collected at the national and sub-national levels. “Appenzell A.” refers to the Swiss canton of Appenzell Ausserrhoden. "Flemish Comm." and "French Comm." refer to the Flemish and French Communities of Belgium, respectively. N = 36.
Systems are ranked in descending order of average level of satisfaction with research use in policy making.
Source: OECD Strengthening the Impact of Education Research policy survey data.
A culture of quality research engagement cannot exist without adequate skills. The policy survey asked about policy makers’ and practitioners’ skills to engage with research. The survey measured these perceptions on a 5-point Likert scale, ranging from “Strongly disagree” to “Strongly agree”, in response to eight statements (Figure 3.2).
This chapter uses the concept of research engagement defined in OECD (2022[17]) to group the statements into three dimensions of research engagement:
1. Research literacy: This dimension draws on definitions established in the healthcare and education sectors, which describe an individual’s competence in finding, accessing, understanding and critically evaluating research (Jakubec and Astle, 2021[35]; BERA, 2014[36]). This includes familiarity with research methods, the latest research findings, and implications for both policy and practice. These skills allow an individual to make sense of research findings to inform, develop and translate ideas into practice in a way that is meaningful to their context.
2. Research use: The core components of research use are broadly defined, since the use of research can have a wide variety of purposes, including instrumental, conceptual, strategic and symbolic (Weiss, 1979[37]). However, research use is always directed at solving problems (Backer, 1991[38]). An important, and often forgotten, part of solving problems is understanding the nature of the problem in the first place. There is often a temptation in policy and practice to look for immediate (and perhaps poorly chosen) solutions before a proper diagnosis of the problem has been performed (Van Klaveren and Cornelisz, 2023[39]). Quality research use also requires understanding research’s relevance for decision making through translation, as well as how it can be shared with decision makers with sensitivity to how individuals might respond to research findings (Farley-Ripple, Oliver and Boaz, 2020[40]). Although research use has been seen as a part of research literacy (Evans, Waring and Christodoulou, 2017[41]), it is more often conceptualised as a distinct process, centred on translating and applying research to specific questions as well as communicating research to colleagues.
3. Research production: This dimension encompasses statements relating to how research is produced. It captures aspects of production that are most relevant for practitioners’ and policy makers’ daily work (i.e. formulating research needs, commissioning, supervising, co‑designing or co-conducting research). As discussed in Chapter 2, more evidence needs to be gathered on the feasibility and usefulness of policy makers’ and practitioners’ widespread involvement in research.
Overall, more than one-quarter of the responding ministries reported that policy makers do not have sufficient skills and capacity across all three dimensions of research engagement. Almost half of the respondent systems reported this for practitioners. Both practitioners’ and policy makers’ skills in co-designing and co-conducting research with researchers were lacking in the largest number of respondent systems. Interestingly, compared to culture, there are greater differences between policy and practice when it comes to each of the dimensions of skills, with ministries reporting noticeably lower skill levels for practitioners. In particular, policy makers and practitioners were often regarded as having very different levels of skills when it comes to translating and applying relevant research and formulating research needs. This may be because the data were reported by ministries, which may either have a more lenient perception of policy makers than of practitioners or a more accurate perception of policy makers’ skills than of practitioners’.
Ministries perceived that practitioners’ and policy makers’ research literacy skills were the strongest of all the dimensions. A large majority agreed that policy makers had adequate research literacy skills across all three statements. However, this was not the case for Colombia and Latvia, which reported that policy makers were able to understand and evaluate research but lacked the skills to find and access it, and Switzerland (St. Gallen), which responded that the opposite was true in its context. When it comes to practice, there was greater variation in responses, and over one-third of ministries did not agree with all three research literacy statements. Two systems perceived that practitioners did not have sufficient skills to understand and/or evaluate research, despite being able to find and access it (the Netherlands, and Switzerland [Zurich]). However, Iceland and the United Kingdom (England) reported the opposite: practitioners’ low level of skills to find and access research was the challenge, rather than their ability to understand and evaluate it.
When it comes to research use, most ministries reported that policy makers have the skills to translate, apply and communicate research. A small group (Colombia, the Slovak Republic, Switzerland [Nidwalden] and Türkiye) felt that policy makers did not have adequate skills to communicate research but did have adequate skills to translate and apply it to their contexts. The opposite was true in Switzerland (Zurich). When it comes to practitioners, fewer ministries overall agreed that practitioners had adequate research use skills, and most did not agree with both statements in this dimension. For example, Hungary and Iceland both felt practitioners were able to translate and apply findings but not to communicate research to their peers. Although respondent systems generally felt that research use skills were stronger among policy makers than practitioners, only skills relating to translating and applying education research results to solve problems in their context showed statistically significant differences (see Table 3.A.1 in Annex 3.A).
Finally, only around one-third of the ministries agreed with all three statements related to research production in policy. Just under a third agreed with only two of the statements. For instance, the Netherlands, New Zealand and the United Kingdom (England) only reported policy makers to be able to formulate their research needs and supervise research production. However, they disagreed that policy makers were able to participate in designing and/or conducting research. Involving policy makers and practitioners in research production can be beneficial in terms of increasing the relevance of the research produced. However, there are unanswered questions about how desirable and feasible it is for these skills to be widespread among individuals working in policy and practice (for a deeper discussion, see Chapter 2). A minority of the ministries agreed with one statement, and just under one-third did not agree with any of the statements relating to policy makers. For example, Switzerland (Zurich) agreed that policy makers can formulate their research needs but strongly disagreed that they have the skills to supervise, co-design and co-conduct research. When it comes to practice, just over a quarter of the ministries agreed with both statements and half did not agree with either statement. The remainder agreed with only one statement. For example, Colombia and Latvia reported practitioners to be able to formulate their research needs but disagreed that they have the skills and capacity to participate in co-designing or co-producing research. Indeed, the survey data suggest it is far more common for policy makers and practitioners to have the skills to engage with research in ways other than its production. Although respondent systems generally thought that research production skills were stronger among policy makers than practitioners, skills relating to formulating research needs showed the only statistically significant difference (see Table 3.A.1 in Annex 3.A).
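Annex 3.A describes the statistical tests actually used; they are not reproduced in this chapter. Purely as an illustration, the sketch below shows one common way such policy-versus-practice comparisons could be run on system-level Likert ratings: a paired nonparametric test (the Wilcoxon signed-rank test), which suits ordinal data. The ratings are fictional, not survey results.

```python
# Illustrative sketch: do ministries rate policy makers' skills on a
# given statement (e.g. formulating research needs) systematically
# higher than practitioners'? Paired ratings per system, fictional data.
from scipy.stats import wilcoxon

policy_ratings       = [4, 4, 3, 5, 4, 3, 4, 2, 4, 3, 5, 4]  # one rating per system
practitioner_ratings = [3, 3, 3, 4, 2, 3, 3, 2, 3, 2, 4, 3]

# H0: the paired ratings do not differ systematically.
stat, p_value = wilcoxon(policy_ratings, practitioner_ratings)
print(f"Wilcoxon statistic = {stat}, p = {p_value:.3f}")
```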
Box 3.2 outlines two competence frameworks developed by the European Commission’s Joint Research Centre to increase research engagement in policy. The frameworks take a comprehensive view and target multiple facets of policy makers’ and researchers’ skills and capacities.
The Joint Research Centre (JRC) is a department of the European Commission that provides independent, evidence-based knowledge to support European Union policies to positively impact society.
The JRC developed two competence frameworks tailored to decision makers’ and researchers’ needs to increase capacities for effective policy making. The first is designed to guide policy organisations on the relevant competences for innovative, effective and evidence-informed policy making. The second is aimed at research organisations contributing to policy making with evidence and advice.
The first framework describes the level of competence expected of a generalist policy maker, rather than a professional working in a specialised role (e.g. knowledge broker, data analyst).
It considers 36 competences divided into 7 clusters that enable innovative policy making and constitute a collective set of competences that are relevant for policy making and the different roles within the profession (e.g. a team or unit composed of different roles).
For instance, the cluster “Work with evidence” considers the competences of scientific and data literacy; identifying evidence needs; connecting to experts; gathering evidence; assessing evidence; and working with data and models.
Formal university education and doctoral programmes rarely cover the competences required to achieve a meaningful policy impact. Possessing these competences can empower researchers to ensure that the most robust evidence is provided and understood in good time by policy makers to be considered during the policy cycle.
The JRC mapped these competences in the “Science4Policy” Competence Framework. It consists of five clusters, each of which is made up of three to seven competences. The five clusters are: 1) understand policy; 2) participate in policy making; 3) communicate; 4) engage with citizens and stakeholders; and 5) collaborate.
While the two competence frameworks differ in terms of audience and scope, they are both interdependent and overlapping. This complementarity reflects the competences needed by both researchers and decision makers to interact and develop effective evidence-informed policy making. Competences in collaboration, communication and stakeholder engagement are pertinent for both contexts and thus featured in both frameworks.
Sources: European Commission (2023[42]), “Joint Research Centre”, web page, https://commission.europa.eu/about-european-commission/departments-and-executive-agencies/joint-research-centre_en; European Commission (2023[43]), “Supporting policy with scientific evidence”, web page, https://knowledge4policy.ec.europa.eu/projects-activities/competence-frameworks-policymakers-researchers_en; Schwendinger, Topp and Kovacs (2022[44]), Competences for Policymaking: Competence Frameworks for Policymakers and Researchers Working on Public Policy, https://doi.org/10.2760/642121.
Integrating research into practice relies on the interplay between culture, skills, mindsets, relationships and structures within schools (Maxwell, Sharples and Coldwell, 2022[45]). A research use culture is underpinned by appropriate organisational structures, systems and resources (Brown and Greany, 2018[46]). The policy survey responses show that these enabling factors are overall lacking.
A culture of research engagement requires sufficient human and financial resources, including organisational leadership, structures, tools, resources and dedicated time. For example, supportive leadership has been found to be a major prerequisite to a change in research use culture in schools (Gu et al., 2021[23]). Leaders can be important role models for a research engagement culture while also impacting the availability of other resources. For instance, leaders have been reported to have the authority to carve out dedicated time and space where organisational routines and collaborative work around research can take place in schools (Godfrey, 2017[15]). Moreover, the combination of role models, structures and resources that prioritise open and transparent discussions and encourage experimentation can be a powerful learning tool (Burns and Köster, 2016[47]).
Interestingly, more than twice as many respondent systems perceived soft infrastructure to be sufficient as perceived time to engage with research to be adequate. In the healthcare sector, well-resourced networks, databases, journal subscriptions and collaborative forums have enabled healthcare professionals to use time more efficiently (NHS Health Education England, 2020[48]). However, these require a minimum level of financial and human resources. Some education ministries clearly reported that the current level of soft infrastructure, although adequate, does not actually reduce the time burden for policy makers or practitioners.
A lack of mechanisms to support practitioners’ research engagement was reported by over 60% of the ministries. Our last report provided a detailed look at the prevalence of individual mechanisms (OECD, 2022[17]). However, of direct relevance to this chapter is the marked difference in terms of culture and skills between systems that reported mechanisms offering resources to support practitioners’ research use and those that did not. Systems that reported the presence of these mechanisms also reported higher levels of the culture and skills dimensions within practice (Figure 3.4). All these differences were statistically significant, except for motivation and research production. Of course, the mix of resources and mechanisms needed to support research engagement among policy makers and practitioners will likely vary and depend on the system context. This reality is also frequently acknowledged in the health sector (Freebairn et al., 2017[49]).
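The tests behind these comparisons are described in Annex 3.A. As a minimal sketch only, a difference of this kind (Figure 3.4) could be probed with a rank-based test for two independent groups, such as the Mann-Whitney U test; the groups and scores below are fictional.

```python
# Illustrative sketch: comparing a dimension score (e.g. practitioners'
# research literacy) between systems reporting mechanisms that offer
# resources for research use and systems that do not. Fictional data.
from scipy.stats import mannwhitneyu

with_mechanisms    = [4.0, 3.7, 4.3, 3.3, 4.0, 3.7, 4.3]
without_mechanisms = [3.0, 2.7, 3.3, 2.3, 3.0, 2.7]

stat, p_value = mannwhitneyu(with_mechanisms, without_mechanisms,
                             alternative="two-sided")
print(f"Mann-Whitney U = {stat}, p = {p_value:.3f}")
```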
Learning opportunities are crucial to empower individuals with the skills and knowledge to engage with research. Although learning opportunities are a challenge for both policy makers and practitioners in many systems, a lack of learning opportunities is more often perceived for practitioners than for policy makers.
The policy survey asked about the extent to which policy makers had access to learning opportunities to develop their research knowledge and skills. Almost 40% of the ministries agreed or strongly agreed that there were extensive learning opportunities. This leaves well over half of the ministries either disagreeing or providing a neutral response, indicating significant scope for at least increasing the visibility of any learning opportunities that may exist for policy makers in most respondent systems. However, it is not just the quantity of learning opportunities that is relevant. Poor quality learning opportunities can be a barrier to individuals acquiring the skills to effectively engage with research (Humphries et al., 2014[50]). Regarding their quality, the nature and focus of these learning opportunities are crucial, as complementary cross‑cutting competences are required for both policy makers and researchers.
Several initiatives have focused their efforts on improving policy makers’ learning opportunities to address their research use skills and capacity in different education systems. Examples from New Zealand and the United Kingdom illustrate this (Box 3.3).
Launched in 1998, Nesta was the first-ever publicly supported national endowment in the United Kingdom and is currently an independent innovation agency. It proposes a set of programmes, policies and research to promote research engagement in policy making. Some of these are discussed below.
The Capabilities in Academic Policy Engagement (CAPE) is a research project that aims to improve the use of evidence and academic expertise when policies are created to ensure effective policy outcomes that generate meaningful social impact. Launched in February 2022, the project designs, tests and evaluates several interventions (e.g. events, funding, fellowships, training) to improve engagement between researchers and policy makers. It promotes the use of an embedded evaluation team that focuses on the structures, mechanisms and incentives needed to best deliver what policy stakeholders want, embracing the opportunities of research-policy partnerships.
The Evidence Masterclass is a range of training courses focused on developing policy makers’ capacity to use research. It has been delivered since 2017 to public sector leaders, managers and civil servants from both local and national government departments. Participants learn how to find research relevant to their policy question, develop their ability to assess the quality of research, practise their new skills through simulations, and strengthen their confidence in using evidence.
Engaging with Evidence is an open access toolkit for policy makers which aims to build capacity to harness data, information and evidence to inform real problems and recognise good-quality evidence for policy. The toolkit provides a range of interactive activities that will determine what type of evidence and expertise is needed for each specific purpose, and the potential processes and methodologies that might support this work.
The New Zealand Policy Project aims to improve the quality of policy advice by ensuring it is based on the best available evidence and insights. The Policy Project builds and maintains an active policy community and equips policy makers with analytical tools, frameworks and information. Three of these tools are discussed below.
The Policy Quality framework presents a set of standards that specify what quality policy advice means. The standards increase accountability when developing new advice or reviewing existing advice.
The Policy Skills framework sets out the knowledge, applied skills and behaviours expected from policy makers to deliver quality policy advice.
The Policy Project’s methods toolbox acts as a repository of policy development methods, helping policy makers identify the right approach for their policy initiative. It recommends different themes, such as behavioural insights, community engagement and design thinking, depending on the stage and focus of the initiative.
Sources: Nesta and Alliance for Useful Evidence (2015[51]), Using Research Evidence: A Practice Guide, https://media.nesta.org.uk/documents/Using_Research_Evidence_for_Success_-_A_Practice_Guide.pdf; Alliance for Useful Evidence (2018[52]), The Evidence Masterclass: Course Description, https://wtgrantfoundation.org/wp-content/uploads/2018/07/Evidence-Masterclass-Course-Description.pdf; DPMC (2022[53]), “The Policy Project”, web page, https://www.dpmc.govt.nz/our-programmes/policy-project; Tennant and Morgan (2022[54]), “Using evidence to make policy more effective”, web page, https://www.nesta.org.uk/project-updates/using-evidence-make-policy-more-effective; Morgan et al. (2023[55]), Engaging With Evidence Toolkit, https://www.nesta.org.uk/toolkit/engaging-with-evidence-toolkit.
When it comes to practitioners, the survey asked about the extent to which skills related to research engagement were taught in initial teacher education (ITE) and continuing professional development (CPD) (Figure 3.5). Only around one-third of the ministries reported that training future teachers to understand and interpret research findings is required in all ITE programmes, and less so in CPD. Furthermore, skills related to collaborating in research production and use are not required to be taught in ITE in four out of five respondent systems.
It is worth noting that systems reporting skills as required or mostly covered in ITE tended to report that this was the case for all the skills. This suggests that when systems do integrate research engagement into teacher training, they tend to do so quite comprehensively. However, this is still rare. Revising curricula to include skills related to research engagement can be challenging but has been undertaken by some education systems. One case is the Estonian teacher standards reform and the University of Tartu teacher education curriculum revision, described in Box 3.4.
First introduced in 2005, Estonia’s national standards for the teaching profession were updated in 2013, following a development process that involved a diverse range of stakeholders (e.g. teacher educators, teachers, school leaders, local and national decision makers). The standards describe the professional activities involved in a teaching career and set out the knowledge, skills and attitudes necessary to be successful in the role. They inform curriculum design, assessment, recruitment and professional training.
The application of the teaching standards is guaranteed through accreditation processes that award higher education institutions the right to issue a teaching licence. To obtain these licensing rights, higher education institutions must demonstrate that their teaching curriculum and programmes prepare teacher students for all the requirements set out in the standards. Regarding research engagement, the Estonian teaching standards require teachers to use and conduct research and to reflect on their practice.
In parallel to the development of the new professional standards, the University of Tartu – alongside Tallinn University, one of the two leading research universities providing initial teacher education in Estonia – revised its teacher education curriculum in 2012-13. The review was motivated by previous accreditation reports, which identified the lack of a general module of teachers’ professional studies as a key concern. Some capacities gained importance in the programme, such as teacher candidates’ research capacity, built for example through involvement in research projects. Teacher students are required to use and conduct research to improve their own teaching, suggesting that research engagement is, in fact, an active part of teacher training and activity.
The University of Tartu Institute of Education provides research-based pre-service training for teachers, special education teachers, speech therapists and social pedagogues. There is also an extensive in‑service training programme for teachers, heads of schools and university teaching staff.
Estonian professional standards for teachers were updated again in 2019-20, aiming for a more competence-based career structure. They place greater focus on digital pedagogy and inclusive education and prioritise the continuous education of teachers. Furthermore, the reforms raised the requirements for entry into teacher preparation, as well as teacher salaries.
Sources: University of Tartu (n.d.[56]), “Institute of Education in the University of Tartu”, https://haridus.ut.ee/en; Révai (2018[57]), “What difference do standards make to educating teachers?: A review with case studies on Australia, Estonia and Singapore”, https://doi.org/10.1787/f1cb24d5-en; OECD (2020[58]), Education Policy Outlook Estonia, http://www.oecd.org/edu/policyoutlook.htm; NCEE (2021[59]), “Top-performing countries: Estonia”, https://ncee.org/country/estonia.
This section looks at relationships between the different themes related to culture and skills in the survey responses. See Annex 3.A for a description of the statistical tests and the results.
The various aspects of culture and skills in policy seem to be strongly related.3 For example, the higher the perceived levels of policy makers’ research literacy, the more positive their relationships are with research and researchers (Figure 3.6). However, it is important to point out that, in this case, two-thirds of the systems still do not agree that policy makers’ relationships with researchers are characterised by high levels of trust in research and mutual understanding (scores of less than four points).
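As a minimal sketch of how such a relationship could be quantified (the chapter’s own tests are detailed in Annex 3.A), a rank correlation such as Spearman’s rho is a common choice for ordinal Likert-type scores; the values below are fictional.

```python
# Illustrative sketch: rank correlation between two perception scores
# across systems, e.g. policy makers' research literacy and the quality
# of their relationships with researchers (cf. Figure 3.6). Fictional data.
from scipy.stats import spearmanr

research_literacy = [4.3, 3.7, 4.0, 2.7, 3.3, 4.7, 2.3, 3.0, 4.0, 3.7]
relationships     = [4.0, 3.0, 3.7, 2.3, 3.0, 4.3, 2.7, 2.7, 3.7, 3.3]

rho, p_value = spearmanr(research_literacy, relationships)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```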
Respondent systems that perceive more adequate soft infrastructure also report that practitioners had greater levels of research literacy. This suggests that systems do recognise the role played by databases and journal subscriptions in providing access to research knowledge. It also indicates that networks and collaborative forums may play a valuable role, in line with the extensive literature on social processes and research engagement for practitioners. In this dataset, there were no statistically significant correlations between other resources and culture and skills. This may be puzzling, given the importance attached to resources in both the policy and practice literature and the differences in terms of culture and skills of the systems with mechanisms offering resources to practitioners. While this may simply be a limitation of the dataset, it may also suggest that current resources are not well-oriented towards fostering research engagement.
This chapter explored the characteristics of research engagement culture and skills for policy makers and practitioners based on survey data from 37 education systems. A culture of research engagement was characterised by the three dimensions of motivation to use research, willingness to use research, and relationships with research and researchers. Skills were also characterised along three dimensions, related to research literacy, research use and research production. This chapter also sought to understand the connections between these dimensions and two systemic enablers, resources (human, financial, strategic and infrastructure) and learning opportunities for policy makers and practitioners.
Policy makers and practitioners are generally motivated to engage with research; however, many systems lack quality relationships to thoughtfully do so.
Despite the extensive literature emphasising the importance of viewing research engagement as a social process, practitioners, in particular, appear to lack the quality relationships needed. Systemic mechanisms promoting and supporting collaboration between policy makers, practitioners and researchers are crucial. However, they must be focused on promoting an environment where individuals have the time and space to come together and develop the trusting relationships needed to critically appraise research evidence and understand how it can be useful to them.
Research literacy skills are the most common skills among practitioners and policy makers; however, many other skills are still lacking, as well as learning opportunities.
Although most ministries report that practitioners’ and policy makers’ research literacy skills are sufficient, a significant number still report them to be lacking. These skills are not necessarily intuitive and their development does not come naturally; they must be systematically taught and practised. To support a culture of research engagement, explicit, specific and adequate interventions must address education systems’ learning needs. Evidence-informed frameworks can help policy makers understand, track and tailor training in the research engagement skills of both policy makers and practitioners. These frameworks can then serve as tools for human resource strategies, including recruitment, and for individual and collective learning opportunities, including professional development.
The lack of adequate resources may be hindering the development of a research engagement culture and skills.
Respondent systems that indicated the presence of resources for engaging with research reported noticeably stronger culture and skills. However, many systems reported that current resources are insufficient. Soft infrastructure is the most adequate resource according to ministries, and the data show a link to practitioners’ skill levels, but its impact may be limited by a lack of other resources. Human, financial and strategic resources are all interrelated and should work together to tackle major barriers, notably a lack of time. More research is needed to understand these interconnections. Greater effort should go into dedicating time and space for the social processes behind research engagement to take root. Providing soft infrastructure should not be a box-ticking exercise; rather, it should support structured discussions among colleagues to promote quality relationships built around concrete challenges.
Policy and practice seem to face different challenges related to learning opportunities. Yet, more data are needed to demonstrate a connection between learning opportunities, skills and culture.
While policy makers appear better served than practitioners in terms of availability of learning opportunities, questions remain around the precise nature of learning opportunities and whether or not they are actually making a difference to culture and skills. Better understanding the content and quality of these programmes and how they are connected to skill needs would allow the improvement of intermediary activities, such as providing training in research. Further research should explore the nature of existing learning opportunities for both practitioners and policy makers to better map their current focus and intended impact with a view to increasing their overall quality.
[25] Albers, B. et al. (2021), “The mechanisms of implementation support: Findings from a systematic integrative review”, Research on Social Work Practice, Vol. 32/3, pp. 259-280, https://doi.org/10.1177/10497315211042375.
[52] Alliance for Useful Evidence (2018), The Evidence Masterclass: Course Description, https://wtgrantfoundation.org/wp-content/uploads/2018/07/Evidence-Masterclass-Course-Description.pdf.
[38] Backer, T. (1991), “Knowledge utilization”, Knowledge, Vol. 12/3, pp. 225-240, https://doi.org/10.1177/107554709101200303.
[7] Belkhodja, O. et al. (2007), “The extent and organizational determinants of research utilization in Canadian health services organizations”, Science Communication, Vol. 28/3, pp. 377-417, https://doi.org/10.1177/1075547006298486.
[36] BERA (2014), Research and the Teaching Profession: Building the Capacity for a Self-improving Education System, British Educational Research Association, https://www.bera.ac.uk/wp-content/uploads/2013/12/BERA-RSA-Research-Teaching-Profession-FULL-REPORT-for-web.pdf.
[46] Brown, C. and T. Greany (2018), “The evidence-informed school system in England: Where should school leaders be focusing their efforts?”, Leadership and Policy in Schools, Vol. 17/1, pp. 115-137, https://doi.org/10.1080/15700763.2016.1270330.
[47] Burns, T. and F. Köster (eds.) (2016), Governing Education in a Complex World, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/9789264255364-en.
[28] Burns, T. and T. Schuller (2022), “History and evolution of brokerage agencies in education”, in Who Cares about Using Education Research in Policy and Practice?: Strengthening Research Engagement, OECD Publishing, Paris, https://doi.org/10.1787/b2d2c2fc-en.
[11] Cain, T. (2019), Becoming a Research-Informed School: Why? What? How?, Routledge, London, https://doi.org/10.4324/9781315143033.
[39] Choi, A. (ed.) (2023), Purposes of Education: The Iterative 5D Model for Sustainable Improvement of Educational Quality, Bloomsbury Publishing Plc.
[4] Davenport, T. and L. Prusak (1998), Working Knowledge: How Organizations Manage What They Know, Harvard Business School Press.
[53] DPMC (2022), “The Policy Project”, web page, https://www.dpmc.govt.nz/our-programmes/policy-project.
[20] Dysvik, A. and B. Kuvaas (2010), “Exploring the relative and combined influence of mastery‐approach goals and work intrinsic motivation on employee turnover intention”, Personnel Review, Vol. 39/5, pp. 622-638, https://doi.org/10.1108/00483481011064172.
[29] Earl, L. and H. Timperley (2009), “Understanding how evidence and learning conversations work”, in Earl, L. and H. Timperley (eds.), Professional Learning Conversations: Challenges in Using Evidence for Improvement, Springer, https://doi.org/10.1007/978-1-4020-6917-8_1.
[42] European Commission (2023), “Joint Research Centre”, web page, https://commission.europa.eu/about-european-commission/departments-and-executive-agencies/joint-research-centre_en (accessed on 2 May 2023).
[43] European Commission (2023), “Supporting policy with scientific evidence”, web page, https://knowledge4policy.ec.europa.eu/projects-activities/competence-frameworks-policymakers-researchers_en (accessed on 2 May 2023).
[41] Evans, C., M. Waring and A. Christodoulou (2017), “Building teachers’ research literacy: Integrating practice and research”, Research Papers in Education, Vol. 32/4, pp. 403-423, https://doi.org/10.1080/02671522.2017.1322357.
[40] Farley-Ripple, E., K. Oliver and A. Boaz (2020), “Mapping the community: Use of research evidence in policy and practice”, Humanities and Social Sciences Communications, Vol. 7/1, https://doi.org/10.1057/s41599-020-00571-2.
[49] Freebairn, L. et al. (2017), “Knowledge mobilisation for policy development: Implementing systems approaches through participatory dynamic simulation modelling”, Health Research Policy and Systems, Vol. 15/1, https://doi.org/10.1186/s12961-017-0245-1.
[15] Godfrey, D. (2017), “What is the proposed role of research evidence in England’s ‘self-improving’ school system?”, Oxford Review of Education, Vol. 43/4, pp. 433-446, https://doi.org/10.1080/03054985.2017.1329718.
[6] Godfrey, D. and C. Brown (2018), “How effective is the research and development ecosystem for England’s schools?”, London Review of Education, Vol. 16/1, pp. 136-151, https://doi.org/10.18546/LRE.16.1.12.
[3] Golden, G. (2020), “Education policy evaluation: Surveying the OECD landscape”, OECD Education Working Papers, No. 236, OECD Publishing, Paris, https://doi.org/10.1787/9f127490-en.
[61] Gorard, S., B. See and N. Siddiqui (2020), “What is the evidence on the best way to get evidence into use in education?”, Review of Education, Vol. 8/2, pp. 570-610, https://doi.org/10.1002/rev3.3200.
[14] Grailey, K. et al. (2021), “The presence and potential impact of psychological safety in the healthcare setting: An evidence synthesis”, BMC Health Services Research, Vol. 21/1, https://doi.org/10.1186/s12913-021-06740-6.
[23] Gu, Q. et al. (2021), The Research Schools Network: An Evaluation Report, Education Endowment Foundation, London.
[21] Guerriero, S. (ed.) (2017), Pedagogical Knowledge and the Changing Nature of the Teaching Profession, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/9789264270695-en.
[2] Haynes, A. et al. (2020), “Applying systems thinking to knowledge mobilisation in public health”, Health Research Policy and Systems, Vol. 18/1, https://doi.org/10.1186/s12961-020-00600-1.
[50] Humphries, S. et al. (2014), “Barriers and facilitators to evidence-use in program management: A systematic review of the literature”, BMC Health Services Research, Vol. 14/1, https://doi.org/10.1186/1472-6963-14-171.
[35] Jakubec, S. and B. Astle (2021), Research Literacy for Health and Community Practice, Canadian Scholars’ Press, Toronto.
[26] König, J. (2017), “Motivations for teaching and relationship to general pedagogical knowledge”, in Pedagogical Knowledge and the Changing Nature of the Teaching Profession, OECD Publishing, Paris, https://doi.org/10.1787/9789264270695-9-en.
[13] Kools, M. and L. Stoll (2016), “What makes a school a learning organisation?”, OECD Education Working Papers, No. 137, OECD Publishing, Paris, https://doi.org/10.1787/5jlwm62b3bvh-en.
[9] Köster, F., C. Shewbridge and C. Krämer (2020), Promoting Education Decision Makers’ Use of Evidence in Austria, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/0ac0181e-en.
[10] Langer, L., J. Tripney and D. Gough (2016), The Science of Using Science: Researching the Use of Research Evidence in Decision Making, technical report, EPPI-Centre, University College London, London, http://eppi.ioe.ac.uk/cms/Portals/0/PDF%20reviews%20and%20summaries/Science%20Technical%20report%202016%20Langer.pdf?ver=2016-04-18-142648-770 (accessed on 20 January 2023).
[24] Lanham, H. et al. (2009), “How improving practice relationships among clinicians and nonclinicians can improve quality in primary care”, The Joint Commission Journal on Quality and Patient Safety, Vol. 35/9, pp. 457-AP2, https://doi.org/10.1016/s1553-7250(09)35064-3.
[27] Lauermann, F. (2017), “Teacher motivation, responsibility, pedagogical knowledge and professionalism: A new era for research”, in Pedagogical Knowledge and the Changing Nature of the Teaching Profession, OECD Publishing, Paris, https://doi.org/10.1787/9789264270695-10-en.
[45] Maxwell, B., J. Sharples and M. Coldwell (2022), “Developing a systems‐based approach to research use in education”, Review of Education, Vol. 10/3, p. e3368, https://doi.org/10.1002/rev3.3368.
[18] Michie, S., M. van Stralen and R. West (2011), “The behaviour change wheel: A new method for characterising and designing behaviour change interventions”, Implementation Science, Vol. 6/1, https://doi.org/10.1186/1748-5908-6-42.
[32] Mitton, C. et al. (2007), “Knowledge transfer and exchange: Review and synthesis of the literature”, Milbank Quarterly, Vol. 85/4, pp. 729-768, https://doi.org/10.1111/j.1468-0009.2007.00506.x.
[55] Morgan, K. et al. (2023), Engaging With Evidence Toolkit, Nesta, https://www.nesta.org.uk/toolkit/engaging-with-evidence-toolkit (accessed on 27 March 2023).
[59] NCEE (2021), “Top-performing countries: Estonia”, web page, https://ncee.org/country/estonia (accessed on 15 May 2023).
[51] Nesta and Alliance for Useful Evidence (2015), Using Research Evidence: A Practice Guide, https://media.nesta.org.uk/documents/Using_Research_Evidence_for_Success_-_A_Practice_Guide.pdf.
[48] NHS Health Education England (2020), NHS Funded Library and Knowledge Services in England, EconomicsByDesign, https://www.hee.nhs.uk/sites/default/files/HEE%20-%20Library%20and%20Knowledge%20Services%20Value%20Proposition%20The%20Gift%20of%20Time%20FINAL%20Nov2020_0.pdf.
[17] OECD (2022), Who Cares about Using Education Research in Policy and Practice?: Strengthening Research Engagement, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/d7ff793d-en.
[60] OECD (2021), Strengthening the Impact of Education Research policy survey, OECD, Paris.
[58] OECD (2020), Education Policy Outlook Estonia, OECD Publishing, Paris, http://www.oecd.org/edu/policyoutlook.htm.
[12] OECD (2017), The OECD Handbook for Innovative Learning Environments, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/9789264277274-en.
[31] OECD (2007), Evidence in Education: Linking Research and Policy, OECD Publishing, Paris, https://doi.org/10.1787/9789264033672-en.
[8] Oliver, K. and A. Boaz (2021), “The hard labour of connecting research to policy during COVID-19”, LSE Blog, https://blogs.lse.ac.uk/impactofsocialsciences/2021/03/09/the-hard-labour-of-connecting-research-to-policy-during-covid-19 (accessed on 24 January 2023).
[5] Oliver, K. et al. (2022), “What works to promote research-policy engagement?”, Evidence & Policy, Vol. 18/4, pp. 691-713, https://doi.org/10.1332/174426421x16420918447616.
[34] ONA (n.d.), Educational Knowledge Network Amsterdam (ONA), https://ona.amsterdam.
[1] Powell, A., H. Davies and S. Nutley (2017), “Facing the challenges of research-informed knowledge mobilization: ‘Practising what we preach’?”, Public Administration, Vol. 96/1, pp. 36-52, https://doi.org/10.1111/padm.12365.
[57] Révai, N. (2018), “What difference do standards make to educating teachers?: A review with case studies on Australia, Estonia and Singapore”, OECD Education Working Papers, No. 174, OECD Publishing, Paris, https://doi.org/10.1787/f1cb24d5-en.
[19] Rickinson, M. et al. (2020), Using Evidence Better: Quality Use of Research Evidence Framework, Monash University, Victoria, https://bridges.monash.edu/articles/report/Quality_Use_Of_Research_Evidence_QURE_Framework_Report/14071508/2 (accessed on 1 September 2021).
[16] Rickinson, M. et al. (2022), “Using research well in educational practice”, in Who Cares about Using Education Research in Policy and Practice?: Strengthening Research Engagement, OECD Publishing, Paris, https://doi.org/10.1787/65aac033-en.
[44] Schwendinger, F., L. Topp and V. Kovacs (2022), Competences for Policymaking: Competence Frameworks for Policymakers and Researchers Working on Public Policy, Publications Office of the European Union, Luxembourg, https://doi.org/10.2760/642121.
[30] Sharp, C. et al. (2006), Advising Research-engaged Schools: A Role for Local Authorities, National Foundation for Educational Research, Berkshire.
[22] Sitra (2018), “Phenomenon-based public administration”, Sitra Working Papers, Finnish Innovation Fund Sitra, Helsinki, https://www.sitra.fi/en/publications/phenomenonbased-public-administration.
[54] Tennant, G. and K. Morgan (2022), “Using evidence to make policy more effective”, web page, https://www.nesta.org.uk/project-updates/using-evidence-make-policy-more-effective (accessed on 27 March 2023).
[56] University of Tartu (n.d.), “Institute of Education in the University of Tartu”, https://haridus.ut.ee/en (accessed on 3 May 2023).
[33] Ward, V., A. House and S. Hamer (2009), “Knowledge brokering: The missing link in the evidence to action chain?”, Evidence & Policy, Vol. 5/3, pp. 267-279, https://doi.org/10.1332/174426409x463811.
[37] Weiss, C. (1979), “The many meanings of research utilization”, Public Administration Review, Vol. 39/5, pp. 426-431, https://doi.org/10.2307/3109916.
T-tests were calculated to test whether the differences between policy and practice responses on each item of research engagement culture and skills were statistically significant (a minimal computational sketch follows the tables below).
| Dimension | Item | Number of respondent systems (policy) | Number of respondent systems (practice) | Average (policy) | Average (practice) | t-test (policy vs. practice) |
|---|---|---|---|---|---|---|
| Motivation | Using education research in the policy process is important for policy makers/practitioners | 26 | 20 | 4.46 | 4.30 | 0.31 |
| Motivation | Policy makers/practitioners are expected to use education research in the policy process/their practice | 26 | 20 | 4.04 | 3.95 | 0.39 |
| Motivation | There is strong political will to use education research in policy/practice | 26 | 20 | 3.73 | 3.75 | 0.73 |
| Willingness | Policy makers/practitioners are willing to use education research to question their ideas and preconceptions | 26 | 20 | 3.50 | 3.50 | 0.95 |
| Willingness | Policy makers/practitioners are willing to learn new skills for using education research | 26 | 20 | 3.69 | 3.65 | 1.00 |
| Willingness | Policy makers/practitioners are willing to try new ways to integrate education research into policy making | 26 | 19 | 3.73 | 3.58 | 0.85 |
| Relationships | There is a high level of trust between policy makers/practitioners and researchers | 25 | 20 | 3.52 | 3.40 | 0.56 |
| Relationships | Policy makers/practitioners and researchers have a shared understanding of education research and its use | 26 | 20 | 3.27 | 3.10 | 0.68 |
| Relationships | There is a high level of trust in research amongst policy makers/practitioners | 26 | 20 | 3.65 | 3.40 | 0.57 |
Note: The final column shows the p-values of t-tests comparing policy and practice averages for each item, with the following levels of significance: * p < 0.1; ** p < 0.05; *** p < 0.01.
Source: OECD (2021[60]), Strengthening the Impact of Education Research policy survey.
| Dimension | Item | Number of respondent systems (policy) | Number of respondent systems (practice) | Average (policy) | Average (practice) | t-test (policy vs. practice) |
|---|---|---|---|---|---|---|
| Research Literacy | To understand and interpret education research | 25 | 20 | 3.76 | 3.50 | 0.31 |
| Research Literacy | To evaluate the quality of education research | 25 | 20 | 3.64 | 3.25 | 0.21 |
| Research Literacy | To find and access research that is relevant for their needs | 25 | 20 | 3.76 | 3.55 | 0.40 |
| Research Use | To translate and apply education research results to solve problems in their context | 25 | 20 | 3.72 | 3.25 | 0.06* |
| Research Use | To communicate research for decision making | 25 | 20 | 3.56 | 3.35 | 0.43 |
| Research Production | To co-design and co-conduct research with researchers | 25 | 20 | 3.08 | 3.15 | 0.81 |
| Research Production | To formulate research needs and commission research based on needs | 25 | 20 | 3.88 | 3.40 | 0.07* |
Note: The final column shows the p-values of t-tests comparing policy and practice averages for each item, with the following levels of significance: * p < 0.1; ** p < 0.05; *** p < 0.01.
Source: OECD (2021[60]), Strengthening the Impact of Education Research policy survey.
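The policy-practice comparisons reported in the tables can, in principle, be reproduced with a standard two-sample t-test on the item-level ratings. The sketch below is a minimal illustration only, not the survey's actual computation: the response vectors, the five-point scale and the use of Welch's variant (`equal_var=False`) are all assumptions introduced for the example.

```python
# Minimal sketch of a policy-vs-practice comparison for one survey item.
# The ratings below are hypothetical placeholders, not survey data.
from scipy import stats

policy_ratings = [5, 4, 5, 4, 4, 5, 4, 3, 5, 4]    # hypothetical 1-5 ratings, policy systems
practice_ratings = [4, 4, 5, 4, 3, 5, 4, 4, 4, 4]  # hypothetical 1-5 ratings, practice systems

# Welch's t-test (equal_var=False) does not assume equal group variances;
# the survey analysis may have used a different variant.
t_stat, p_value = stats.ttest_ind(policy_ratings, practice_ratings, equal_var=False)

# Flag significance at the thresholds used in the table notes.
stars = "***" if p_value < 0.01 else "**" if p_value < 0.05 else "*" if p_value < 0.1 else ""
print(f"t = {t_stat:.2f}, p = {p_value:.2f}{stars}")
```

With group sizes of around 20-26 systems per item, as in the tables above, such tests have limited statistical power, which is consistent with most reported p-values falling well above conventional significance thresholds.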
1. Many responses to the survey were optional; this analysis reports only on the answers given and does not make inferences about the number or composition of unanswered questions.
2. These dimensions were constructed conceptually. The sample size did not allow for conducting factor analysis to confirm the constructs statistically.
3. Note, however, that the sample size did not allow for conducting factor analysis; these findings are therefore only indications for further research.
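As notes 2 and 3 indicate, the dimensions were constructed conceptually because the sample was too small for factor analysis. Purely as an illustration of what such a check could look like with a larger sample, the sketch below runs an exploratory factor analysis on simulated data; the simulated responses, the three-factor choice and the use of scikit-learn are all assumptions, and a confirmatory factor model would be the stricter test of the constructs.

```python
# Illustrative sketch only: an exploratory factor analysis on simulated data,
# standing in for the item-level responses the survey sample could not support.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(seed=0)
# 200 simulated respondents rating the nine culture items on a 1-5 scale.
responses = rng.integers(low=1, high=6, size=(200, 9)).astype(float)

# Extract three factors, mirroring the three conceptual dimensions
# (motivation, willingness, relationships).
fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(responses)

# Loadings show how strongly each item relates to each factor; with real data,
# items should load most heavily on their hypothesised dimension.
loadings = fa.components_.T  # rows = items, columns = factors
print(np.round(loadings, 2))
```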