Education Policy Outlook 2018
Chapter 6. Policy implementation and evaluation: Learning from experience and evidence
Abstract
This chapter examines the emerging policy evaluation culture across OECD countries through the lens of countries’ experiences in implementing key policies and evidence from OECD country reviews, as well as from the perspective of policy evaluators reflecting on policy implementation processes ex post. It identifies four key principles that can assist countries in promoting policy improvement: 1) involve all stakeholders, including students, and ensure their continued engagement; 2) elevate evidence and use data strategically in the policy implementation and evaluation process; 3) develop a common understanding of concepts and shared goals and standards; and 4) ensure a fair distribution of resources and equal capacity to use them on the ground. The chapter also highlights the vital role that strengthening evaluation capacity can play in improving implementation.
Highlights
The story of a successful policy goes beyond design and specific features. To promote true change and learning, it is crucial to fully understand the complexity of the surrounding policy ecosystem when developing implementation strategies.
Policy evaluation is an emerging instrument across the OECD to promote policy success. Evaluations can offer both summative and formative perspectives of the implementation of specific reforms and can help us understand some of the factors that promote success in policy implementation.
According to evidence from the Education Policy Outlook National Survey for Comparative Policy Analysis 2016-17, as well as national policy evaluations, OECD countries are working to improve reform success by promoting greater inclusion of stakeholders, elevating the role of evidence in the reform process and developing a clear strategic vision for education systems and associated policies.
Introduction
This chapter explores policy implementation and evaluation in education systems and the relationship between them. It identifies an incipient policy evaluation culture in some education systems and provides examples of how education systems are building policy implementation and evaluation capacity to support policy improvement and success.
There is an emerging climate in which more is being demanded of the public sector, and the public sector is moving towards a culture of greater openness, innovation and transparency. In that climate, it is increasingly challenging for governments to spend large sums of public funds to implement policies if they are not demonstrably based on sound analysis of evidence and a clear understanding of the policy context. New modalities of getting better value for money, such as the social investment model (Morel, Palier and Palme, 2012), which has already been adopted in some OECD countries (Hemerijck, 2017), also imply taking a much more structured approach to evaluating different investment alternatives to achieve the goal of social cohesion, in which education plays a definitive role (Gradstein and Justman, 2002). Evaluation and performance measurement have also become integrated into the budget allocation process and public finances of OECD countries (OECD, 2015a).
Improving policy implementation has become an increasingly important topic in the work of the OECD, for example in the 2007 horizontal project, Making Reform Happen (OECD, 2010), and initiatives such as the Education Policy Outlook, Improving Schools and the National Skills Strategies. OECD Country Reviews and Thematic Reviews also often make recommendations to countries on implementation processes. The Education Policy Outlook National Survey for Comparative Policy Analysis 2016-17 (EPO Policy Survey 2016-17) was the first systematic collection of countries’ perspectives on implementing prominent recent education policies, and the OECD has recently proposed a framework to analyse implementation of education policy (Viennet and Pont, 2017).
The importance of policy evaluation is also increasingly recognised in OECD countries, as regulatory pressures and constrained resources in recent years have made it imperative to ensure that policies are delivering as intended. An analysis of more than 80 recent policy evaluations compiled within the Education Policy Outlook evidence base provides a vision of the education policy evaluation landscape in OECD countries. Policy evaluations are defined as assessments undertaken as specific initiatives or projects, as opposed to ongoing informal monitoring. This database provides insights into how many times key policies are evaluated, who is evaluating them, when in the policy cycle they tend to be evaluated, how they have been evaluated and for what purpose these evaluations are used (Golden, forthcoming).
In order to be included in the database, the evaluation must meet the following conditions: 1) it must have been in relation to a specific new policy or reform implementation in the country; 2) it must have been reported to the Education Policy Outlook (EPO) team in the EPO Surveys carried out in 2013 and 2016-17 or included in an EPO country profile; and 3) there must be at least one published report of the evaluation. Each evaluation in the database contains key features of the policies along with additional variables describing evaluation methodology, outcomes and implementation perspectives.
International organisations such as the OECD can help countries to improve implementation and to embed policy evaluation in education by providing data, sharing knowledge and creating international benchmarks that can channel peer pressure (OECD, 2010). Given the failure of many reforms to take hold in the classroom, OECD education systems have highlighted their need for deeper understanding and guidance on education policy implementation and evaluation. This can help them to increase the number and quality of reform evaluations and to generate the vital insights and evidence needed on whether, how and why reforms are succeeding or failing (OECD, 2015b). This chapter therefore also outlines some of the key principles that can encourage successful implementation in the field of reforms targeted at students.
The key principles presented are based on analysis of:
recommendations made in previous OECD country-based work related to implementation processes (see Annex A, Table A A.3);
responses of education systems to the EPO Survey 2016-17, which sought specific perspectives on implementation; and
findings from analysis of a database of more than 80 policy evaluations carried out recently in OECD countries, in which evaluators identified the factors possibly influencing policy success.
Implementation success in the education policy ecosystem
Policy implementation can be defined as “a specific set of strategies to put into practice an activity or programme of known dimensions” (NIRN, n.d.). There are many different explanatory models for how policy is implemented on the ground and how it can become embedded successfully (Spillane, Reiser and Reimer, 2002). However, even the definition of success in policy implementation can often be contested (Marsh and McConnell, 2010), and most policy implementation processes in truth neither completely succeed nor completely fail, but instead fall somewhere along a success-failure spectrum (McConnell, 2010). Classifying a policy as successful can, therefore, depend on the criteria established to define or monitor a policy, or even the particular perspectives that different education stakeholders may adopt on it. A recent OECD literature review (Viennet and Pont, 2017) characterises policy implementation as:
Purposeful, to the extent that the process is supposed to change education, according to some policy objectives.
Multidirectional, because it can be influenced by actors at various points in the education system.
Contextualised, as institutions and societal shocks and trends (in culture, demography, politics and economy) affect education systems and the ways in which policies are shaped and translated in the education sector.
It follows that while it is important to design education reforms to properly meet key challenges, the cornerstone of creating greater policy success is careful implementation, based on thoughtful evaluation of context and evidence. Policies that are well designed and well resourced can still fail, due to poor execution on the ground, insufficient consideration of context and the loss of vital opportunities to learn when policies are not evaluated or evaluations are not well designed (OECD, 2015b). Much evidence highlights the importance of individual contextual factors and the relationships and dependencies between them in influencing policy development and implementation (Howlett, 2004; Butler and Allen, 2008). In other words, as discussed in Chapter 1, the whole policy ecosystem – not just the immediate issue at hand – must be considered when developing education reforms. This implies careful analysis of the ways in which a policy can interact with its ecosystem before, during and after reform implementation.
It is clear that effective policy implementation requires access to knowledge about the ecosystem and the capacity to interpret and apply it. Useful knowledge for policy implementation is increasingly accessible from many sources, such as analysis by international organisations, best practices and benchmarks, and research and statistics produced by government bodies and academia, as well as consultations with experts and stakeholders’ evidence through dialogue (Burns and Köster, 2016).
However, this increased access to information also brings the challenge of ensuring that the new information available can be analysed and translated into useful policy lessons. These policy lessons should not only help the system know what it needs to improve, but also offer pathways that are feasible and cost-effective. As discussed in the next section, the policy evaluation process can act as the lynchpin for creating and using knowledge to make key decisions on assessing the value of policy reforms, monitoring their implementation or even indicating when they have failed and need to be abandoned or retooled. Policy evaluations often create valuable insights into the implementation process of the reform and offer an opportunity to learn lessons that apply beyond the immediate context of the reform at hand. Thus, the policy evaluation process can support implementation of education reforms in general, as well as acting as a means to judge the implementation of a specific policy.
Policy evaluation is an emerging instrument which can promote more successful system reforms
Evaluation is relevant at all stages of the policy process (Fischer, Miller and Sidney, 2007; Stufflebeam and Coryn, 2014). It is the mechanism by which countries can look forward to recognise a challenge and weigh response options, and look backward to identify what worked to ameliorate the situation. A well-established culture of policy evaluation can, therefore, act as the bridge between evolving challenges in education systems and how countries eventually respond to them.
OECD education systems have significantly expanded their evaluation and assessment capacities in recent years, with new mechanisms for evaluating schools, teachers and students introduced in many countries. Evaluations can take many different forms and use many different methods. Diverse examples from the Education Policy Outlook database of recent evaluations include:
Longitudinal tracking of implementation and outcomes, such as the phased monitoring programme for the Action Plan of the Indigenous Student Success Programme (2013-17) in Australia, which included a survey of school leaders, case studies in schools, interviews with key stakeholders and a review of available outcomes data in each evaluation phase.
Evaluating externally shortly after implementation, such as the Organic Law for Improvement of Education (2014) in Spain, whose implementation schedule includes an external evaluation after the first year the law is implemented in each grade.
Ex ante reviews to support decision making, such as the ex ante study carried out in the Slovak Republic to underpin the decision process for a reform to expand childcare facilities, which constructed a number of efficiency and characteristic indices to allow for a structured decision on where and how childcare provision should be expanded, using a comprehensive evidence base.
Some countries have started to lay the groundwork for building a stronger education policy evaluation system, through legislation, new evaluation frameworks or new institutional arrangements. Recent developments at the country level include the following:
New legislation
Reform of the Republic’s Schools (Refondation de l'école de la République, 2013) in France makes provision for improving evaluation of the education system.
In Mexico, the Law of the National Institute for Education Evaluation (Instituto Nacional para la Evaluación de la Educación, 2013) granted independence to the national educational evaluation institute.
New evaluation frameworks
In 2010, Korea began broadening its evaluation and assessment framework to encompass the whole education system (student assessment, school evaluation, teacher appraisal, evaluation of principals, evaluation of local education authorities, evaluation of research institutes and evaluation of education policies).
In 2016, Norway introduced a revised set of instructions for official reports and studies which sets the standards for how official studies must be carried out and reported. It specifies minimum requirements for evaluations in a clear and concrete manner, as well as setting out the requirements for early stakeholder involvement and proportionality of the review process to the size or scope of the initiative under consideration.
In 2017, Slovenia adopted a National Framework for Quality Assessment and Assurance in Education and Training as a basis for setting up a coherent system of quality assessment and assurance in this area. The framework aims to link different evaluation activities on the level of educational institutions (kindergartens, basic and upper secondary schools) and the system. It provides for the creation of special professional bodies (strokovna jedra) to support kindergartens and schools in the processes of assessment (monitoring and evaluation) and quality assurance (planning and implementation of policies). An analytical centre is being set up in the Directorate for Development and Quality in Education of the Ministry of Education, Science and Sport to assure overall co‑ordination of the related activities.
New institutional arrangements
In 2013, Portugal created the Educational Evaluation Institute (Instituto de Avaliação Educativa), an independent autonomous institute specialising in external evaluation. It replaced the Office for Educational Evaluation (Gabinete de Avaliação Educacional), which reported directly to the minister.
In Finland, the Finnish Education Evaluation Centre (FINEEC) started operations on 1 May 2014. It was formed by combining the evaluation activities of the Finnish Higher Education Evaluation Council, the Finnish Education Evaluation Council and the Finnish National Board of Education. Since the beginning of 2018, FINEEC has been operating as an independent unit under the Finnish National Agency for Education.
Despite these recent developments, regular evaluation of all policies is not yet the norm across the OECD. To improve policy success, there is a need to enhance the robustness of evaluations and demonstrate causal links between reforms and outcomes (Cook, 2002), and also to understand why and how the impacts have occurred. The following section outlines some fundamental concepts for countries to consider in building a strong culture of policy evaluation.
Broad and robust methodologies and evaluative thinking
Many of the reports in the Education Policy Outlook evaluations database highlight changes in outcomes which occurred over the period of policy implementation, with implied links to reform efficacy. However, the vast majority of educational evaluations are not designed to directly attribute changes in outcomes to the policy itself. This makes it difficult to disentangle the impacts of a policy from those of other policies operating in parallel, or from changes which might have occurred as a natural consequence of the context and situation even without the policy implementation.
Demonstrating a causal relationship empirically requires robust methodologies using experimental approaches, such as randomised controlled trials. In recent years, many have argued in favour of institutionalising such approaches in public policy, and they are considered best practice for causally evaluating the outcomes of policy initiatives (Cook, 2002; Haynes, Goldacre and Torgerson, 2012). As ethical considerations can often prohibit or at least discourage randomised controlled trials, public sectors in the OECD are increasingly developing accessible tools and environments to allow such testing to be carried out on a quasi-experimental basis or to mine for other contextual factors. For example, administrative microdata, and the specific models and tools constructed around them, are increasingly used to generate evidence relevant for policy evaluation. Many OECD countries have already developed such initiatives for simulating policy experiments and quasi-causal education policy evaluation. These include Denmark’s DREAM model for educational forecasting, New Zealand’s Integrated Data Infrastructure and Canada’s longitudinal LifePaths database, which includes transitions between education levels and sectors and the labour market.
Box 6.1. Evaluations of the Programme for Reinforcement, Guidance and Support in Spain
Spain’s Programme for Reinforcement, Guidance and Support (PROA) is a school support programme directed at educational centres with students of lower socio-economic status, which includes additional tutoring, support, mentoring and programmes to change school culture and expectations. Since its inception in 2006, PROA has been evaluated regularly according to a CIPO (Context, Inputs, Process, Outcomes) framework. This has provided a range of evidence on the success of the programme, based on questionnaires aimed at students, their families and practitioners that gathered both quantitative and qualitative information. Evaluations show generally positive perceptions of the outcomes of the policy (Manzanares Moya and Ulla Diez, 2012).
Evidence of programme efficacy was further strengthened by a 2014 causal evaluation by Universidad Pablo de Olavide (Seville), which matched the centres and students of the treatment group with counterparts in a control group similar in observable characteristics. Individual characteristics considered included gender, immigrant status and whether or not the student had repeated a grade, as well as parental education and occupation. Centre variables included whether the centre was public or private and the percentage of students whose parents had tertiary education. Results showed that the effect of PROA is positive, with a particularly strong effect on reading. Furthermore, the evaluation showed that the effects of the programme were cumulative and significant in both the short and long term.
On their own, outcomes of randomised controlled trials and other experimentally-focused evaluations are not considered sufficient evidence for judging policy efficacy in education. This is because they cannot always take into account other important or valuable contextual information, cannot build into their design the viewpoint of education as a complex system and can generally only answer very specific research questions (Morrison, 2001). In terms of policy learning, the results of experiments can determine if a reform has had impact, but not necessarily why or how. The depth of understanding required for such insights often can come from qualitative evidence (Dumas and Anderson, 2014). Ideally, education systems would have the capacity to conduct experimentally-focused evaluations but would also take into account other types of evaluation which give weight to the explanatory power of qualitative analysis, which is particularly relevant in the context of education.
In addition to ensuring the capability to carry out broad-ranging and robust evaluation, there must be capacity to ensure that evaluation results meet the needs of policy makers for insight into their system and reform context, and that the results of policy evaluations are fed back effectively into the system for future learning. Having the ability to judge the quality of evidence from policy evaluations and ensure its application implies the development of evaluative thinking across education systems. Buckley et al. (2015) define evaluative thinking as:
… critical thinking applied in the context of evaluation, motivated by an attitude of inquisitiveness and a belief in the value of evidence, that involves identifying assumptions, posing thoughtful questions, pursuing deeper understanding through reflection and perspective taking, and informing decisions in preparation for action.
While evaluation can be considered as a process or activity, evaluative thinking is a “way of doing business”, and embedding such thinking across an organisation is the opposite of treating evaluation as a simple box-ticking exercise (CLEAR, 2013). Instead, it implies an approach of constant search for improvement through questioning and analysing the assumptions, theories and evidence surrounding an educational reform. Developing effective evaluation capacity, therefore, goes beyond integrating evaluation as a process into more stages of the policy cycle; it means making evaluative thinking a default setting at all stages of the policy cycle, helping to turn the information and feedback emerging from the system into usable knowledge that can continuously improve interventions (Earl and Timperley, 2015).
Box 6.2. Moving towards evaluative thinking in the European Commission
In 2013, the European Commission developed a new approach to policy evaluation that aims to link ex ante and summative evaluation in a more structured way, to close the policy cycle. “Strengthening the foundations of Smart Regulation: Improving evaluation” (EC, 2013) called for a number of measures to improve the policy evaluation framework across the policy cycle, including the “evaluate first” principle: before moving forward with a new policy, making it standard practice where possible to first ensure that previous actions and current policies have been thoroughly evaluated and lessons learned. There should be a continuous loop between ex ante impact assessments and summative evaluations, with the summative evaluation influenced by the framework set out in ex ante evaluation, and ex ante evaluations for new policies fed with strong evidence from previous summative evaluations.
The European Commission further strengthened its commitment to the “evaluate first” principle in revised regulation guidelines published in 2015 (EC, 2015) that apply the principle to both spending and non-spending EU activities with impact on society and the economy.
Promoting policy improvement and success: Principles derived from OECD and national evidence
As noted earlier, lessons to promote policy improvement come from both thematic and country-specific work that the OECD has carried out with different education systems. Countries also often identify issues that can positively influence policy success as they conduct policy evaluations or reflect on implementation experiences. This section brings together the most prominent of these lessons, organised under four key principles for policy success. Table 6.1 summarises these principles and identifies the countries the OECD advised to adopt them. The subsequent sections examine each principle in detail and describe recent reflections and practices by countries that have taken these principles into account in their policy development processes.
Table 6.1. Key principles for successful education policy implementation
| | Key principles | Countries |
|---|---|---|
| 1 | Involve all stakeholders (including students) and ensure their continued engagement. | CAN, CHL, IRL, NZL, PRT, SVN, SWE, GBR, GBR (Wales) |
| 2 | Elevate evidence and use data strategically in the policy implementation and evaluation process. | AUS, CZE, FRA, GRC, LUX, NOR, SVK, GBR (Wales) |
| 3 | Develop a common understanding of concepts and shared goals and standards. | AUS, MEX, NOR, SWE |
| 4 | Ensure a fair distribution of resources and equal capacity to use them on the ground. | AUS, CAN, TUR, NOR, ESP, GBR (Scotland) |
Note: See Table A A.3 in Annex A for the list of OECD publications consulted.
Source: OECD reviews of countries’ education policies (2000-17)
Key principle 1. Involve all stakeholders (including students) and ensure their continued engagement
Involving all stakeholders (including students) and ensuring their continued engagement can pay off for policy success, as suggested in comments by evaluators in Canada (Ontario), Ireland, New Zealand and the United Kingdom.
| Evaluation | Evaluators’ comments |
|---|---|
| Canada (Ontario): Student Success/Learning to 18 Evaluations | “Although most student respondents were familiar with at least one of the components or elements of the Strategy, many remain unaware of the various programs and supports available to them.” (Canadian Council on Learning, 2008) |
| Ireland: Review of Delivering Equality of Opportunity in Schools | “Stakeholder consultations have identified the role of Home School Community Liaison in the Department of Education and Skills as being particularly significant and effective. Schools report better relations with parents and greater involvement by them in school life and in the education of their children.” (Department of Education and Skills, 2017) |
| New Zealand: Check and Connect evaluation report | “Student accounts and comments provide some vivid testimony to the ‘life-changing’ difference [participation in the programme] could make to their engagement and achievement in school, and their capacity to live more purposefully, confidently, and contentedly in and out of school.” (Wylie and Felgate, 2016) |
| United Kingdom: Mapping user experiences of the Education, Health and Care process | “There has to be a genuine desire to listen to the views of parents – good and bad – and a culture of critical appraisal/self-evaluation for services to use the evidence from parental feedback to really influence improvements in service delivery. It should not be consultation for consultation’s sake.” (Skipp and Hopwood, 2016) |
Three main factors underlie the increased attention to stakeholder engagement in implementation of education reform: 1) societies today are more democratic and participative; 2) there is a greater awareness of the importance of education quality for the future of a country; and 3) evolving technologies now allow populations to be more vocal about policy matters (OECD, 2016). As a result, stakeholder engagement has become an integral part of education processes in many countries, with stakeholder consultation now commonplace at the inception of new reforms. From an implementation perspective, the involvement of key stakeholders in education policy development can help to cultivate a sense of joint ownership over policies and hence build more effective and relevant reforms (Finlay, 1998).
Education stakeholders need adequate knowledge of educational policy goals and consequences, and the tools to implement a reform as planned (Hooge, Burns and Wilkoszewski, 2012; OECD, 2015b). Continuous reference to and integration of evidence as part of the dialogue between stakeholders during policy design and implementation can help to build a strong and informed consensus on the path forward. This is particularly vital in situations where stakeholders may have strong a priori beliefs tied to both their identities and experiences (Burns and Köster, 2016).
Education systems in many countries have involved a significant number of actors at different levels of governance in recent policy development and implementation processes, either by engaging them in shaping policy or by developing policy specifically in response to stakeholder feedback. Some examples:
Mexico’s most recent pedagogical plan, the New Education Model (2017), was elaborated according to suggestions and comments from more than 300 000 stakeholders, feeding into the new model as well as its implementation plan.
Portugal’s Promote Educational Success programme (2016), which aims to increase retention rates, was also developed using a bottom-up strategy. Schools were invited to present individual strategic action plans listing measures ranked in priority order, allowing for identification and consideration of each school’s specific needs during the policy design process.
The Slovak Republic’s New State Curriculum (2015) and Dual VET System (2015) were both adopted as a direct response to reports from employers of insufficient levels of labour supply.
In 2016, Latvia formed an inter-institutional working group to develop proposals for complex solutions and an “ideal” mapping of school networks that promotes education quality. Ongoing discussions involving various stakeholders will inform a new arrangement of the network of general schools in Latvia. An independent research report, Development of an Optimal Model of General Education Institutions Network (2017), is also used for discussions with municipalities regarding the optimisation of the school network. It includes analysis from a geospatial planning platform on demographic and migration tendencies and forecasts, availability and quality of education institutions, social and economic profiles of municipalities and transport conditions.
Of course, students themselves are the key stakeholders in reforms aimed at improving their educational outcomes. As education systems move towards putting students and their learning at the centre of policy development (OECD, 2015b), one way for policy reforms to promote student learning is to give students a voice and ensure that reforms meet their needs and enhance their well-being (Mitra, 2007; Simmons, Graham and Thomas, 2015). Students have unique perspectives of their learning environments, and their views should be sought and considered, recognising that students themselves have a diversity of views on requirements of the education system, based on their background and experiences (Cook-Sather, 2006).
In the Education Policy Outlook evaluations database, a majority of reported policy evaluations did not directly attempt to evaluate reforms from the point of view of students. Most qualitative reform evaluations focused on practitioners and, to a lesser extent, on parental perspectives on implementation. The literature on the effect of parental involvement on children’s education outcomes in general does not always show positive impact (Fan and Chen, 2001), but there is evidence suggesting that well-designed home-school interventions can lead to improved outcomes (Cox, 2005). Evidence also suggests that fostering and motivating parental engagement and partnerships between home, school and community can be an effective component of successful educational reform (Barton et al., 2004; Bryan, 2005).
The attitudes and beliefs of students, as well as their home lives, form a crucial part of the education policy ecosystem. Deeper analysis of the relationships between students’ life satisfaction, their home context, their sense of belonging at school, their classroom climates and their performance and outcomes is becoming increasingly available (e.g. OECD, 2017a). In the context of implementing successful reforms in increasingly complex systems, it is likely to become more important that policy evaluations related to student outcomes include consideration of the direct experience of students, both in and out of school, to ensure that the policies implemented can adequately meet their needs.
Key principle 2. Elevate evidence and use data strategically in the policy implementation and evaluation process
As the comments of evaluators in Canada, Norway and the United Kingdom (Scotland) demonstrate, elevating evidence and using data strategically in the policy implementation and evaluation process can improve results.
Canada Summative Evaluation of the Budget 2008 Canada Student Loans Programme Enhancements |
“The purpose of the literature and file review was to provide an overview of the information already available in the public domain and internal Employment and Social Development Canada documents, completed surveys and research papers, and then compile the information (from over 40 documents) in order to provide evidence for a series of evaluation questions. This review demonstrated that much information was already available in the literature.” (Employment and Social Development Canada, 2016) |
Norway Evaluation of upper secondary changes as part of the Knowledge Promotion Reform |
“We are able to comfortably conclude that the structural measures implemented have not led to fundamental changes in the programmes offered, and have not changed students' preferences or completion in any decisive way… We cannot exclude the possibility that the financial crisis that occurred while the Knowledge Promotion Reform was being implemented may have made the transition from school to apprenticeship harder than it would otherwise have been, thus masking the possible positive effects of broader courses.” (NIFU, 2012) |
United Kingdom (Scotland) Improving Schools in Scotland: An OECD Perspective |
“Evaluative evidence did need to be gathered that would serve to inform future direction…The proposed National Improvement Framework has the potential to provide robust evaluative evidence to complement the inspection reports and other forms of evaluation.” (OECD, 2015c) |
Along with the increased importance that societies are conferring on education, there is also a heightened awareness of the importance of developing education policies and practices through evidence, as well as a greater call globally for all education policies and practices to be evidence-based (Slavin, 2002; Wiseman, 2010; Aarons, Hurlburt and Horwitz, 2011). Robust evidence helps to convince voters and stakeholders that a reform is needed and allows for well-informed decisions based on the best available evidence (Davies, 1999). At the same time, there is a greater public demand for evidence on the effectiveness of education systems. This draws on growing concern for student outcomes, the increase of available data (thanks to greater use of testing and assessment) and wider access to information via new technologies (OECD, 2007). As reported in the Education Policy Outlook Survey 2016-17, countries have worked to strengthen the role of evidence in reform processes:
By developing or using national capacities for designing evidence-informed policies…
Japan’s formulation of its Third Basic Plan for the Promotion of Education (2018) was based on a comparative analysis of Japan’s strengths and weaknesses from an international perspective. The plan maintains continuity with current principles while addressing outstanding issues, informed by progress to date and by anticipated social changes beyond 2030.
Latvia has committed to the promotion of evidence-informed policy planning and implementation. While the country is working to develop its national data capacities, the use of data from previous cycles of international studies is also becoming more widespread, which helps to identify trends and areas to improve.
Spain’s Programme to Reduce Early Dropout in Education and Training (2008) was built on the study of data comparing Spain with other European Union and OECD countries, aiming to establish an unbiased overview of where Spain stands.
The Updating of Programmes in Turkey in response to changing needs in the labour market was based on a series of questionnaires published on the website of the Ministry of National Education; face-to-face interviews with principals, teachers and students; and articles, papers, reports, brochures and Internet sites. Opinions and recommendations on the weekly course schedule and curriculum were also requested from all education faculties in the country and carefully analysed by the government.
By drawing on other international experiences…
Portugal’s Sistema Nacional de Creditos (2017) established credit systems based on the experiences of other countries. Thanks to several exchanges between the Finnish and Portuguese governments, the Finnish model was chosen as the main example to define the Portuguese model.
Anti-Segregation Measures (2016) were adopted in the Slovak Republic following a United Nations Development Programme survey on the living conditions of the Roma people, which found that students from the marginalised Roma community were over-represented in the system of special education.
However, the growing role of evidence in the education policy process also presents difficulties. Ever-increasing volumes of often conflicting data and information pose challenges in assessing relevance and quality and convincing stakeholders of the legitimacy of an evidence-informed approach. Education systems need to increase systemic capacity to curate and synthesise evidence effectively for different audiences in order to properly integrate it into the policy implementation process (Slavin, 2002). This also covers evidence from policy evaluations. There is a need to adequately mine the evaluative information available and make judgements as to the quality and meaning of evaluation results. As the volume of policy evaluations carried out continues to grow, this “meta-evaluation imperative” is also likely to become stronger (Stufflebeam, 2001; Stufflebeam, 2010).
Such institutionalised educational evidence evaluation and dissemination infrastructures do not yet appear to be the norm across the OECD area, but some have been developed in recent years. Examples include the Education Endowment Foundation, part of the UK Government “What Works” network, which conducts research and extracts information to present as toolkits. These toolkits summarise evidence in a dashboard style, showing the comparative cost, evidence strength and measured impact on a visual scale, covering a wide range of policy reform options and initiatives. The Danish Clearinghouse for Educational Research analyses educational research and attempts to identify meaningful lessons, through its systematic mappings (which aim to compile relevant research for a particular policy area) and its systematic reviews (which compile, analyse and synthesise relevant evidence to tackle a specific research question). The Norwegian Knowledge Centre for Education conducts systematic evidence reviews and analyses, as well as state-of-the-field reviews, which summarise major international developments in a given educational field since the beginning of the century. Through its web portal, the Centre publishes summary overviews of its research that explicitly state who the research is primarily aimed at (policy makers, practitioners, etc.).
Finally, the acknowledged requirement for greater productivity and efficiency across education systems, along with the need to ensure that education offerings remain relevant in a rapidly changing society, creates a strong innovation imperative for education policy. As the emphasis on evidence-informed culture grows, it is important to ensure that there is still room for innovation to flourish. Evaluation practices need to be synergistic with innovation, while also striving to be robust enough to underpin the decision process (Earl and Timperley, 2015).
In support of this notion, there are nascent indications that OECD countries are developing mechanisms for balancing funding and support for education policy according to the level of evidence available. For example, in the United States, many federal grants for educational programmes are awarded according to a tiered-evidence system which awards funding based on the levels of evidence of efficacy of the programme. Untested programmes for which there is little evidence are funded on a smaller scale, with increasing levels of funding made available according to the strength of the supporting evidence, thus balancing between supporting evidence-informed policy and allowing for innovation. In Norway, a dedicated Programme for Research and Innovation in the Educational Sector (FINNUT) awards funding for projects focused on compiling knowledge and evidence on current systemic contexts and practices and those that propose innovations in key identified priority areas of education.
Key principle 3. Develop a common understanding of concepts and shared goals and standards
As the comments of evaluators in Australia, Iceland and the United Kingdom (England) demonstrate, developing a common understanding of concepts and shared goals and standards can help navigate bumpy implementation processes.
Australia Final report of the evaluation of the Aboriginal and Torres Strait Islander Education Action Plan 2010-2014 |
“Stakeholders consistently reported that the Action Plan has helped to create a common language and understanding across school sectors about needs and activities to support Aboriginal and Torres Strait Islander students… The common language has also aided the sharing of practice within and across school sectors in each of the domains.” (ACIL Allen Consulting, 2014) |
Iceland External Audit of Inclusive Education |
“A key finding was that there appeared to be many different understandings of inclusive education – hence the urgent need for further clarification of this policy.” (European Agency for Special Needs and Inclusive Education, 2017) |
United Kingdom (England) Pupil Premium Evaluation Report |
“There is a tension between the criteria that are used to allocate Pupil Premium funding and the criteria that have been used by schools to define and respond to educational disadvantage more generally… schools could be given clearer messages about the distinction between the two.” (Carpenter et al., 2013) |
Effective policy implementation requires a whole-system approach with aligned roles, and having shared values and a shared mission can foster the collaborative processes essential for success (Huffman, 2003; Innes and Booher, 2010). In many school systems, this may require a greater focus on long-term goals in order to meet the immediate challenges a reform may bring (Duckworth, Quinn and Seligman, 2009). In addition, regardless of the level of decentralisation of a system, national leadership to “co-ordinate through partnership”, by developing clear guidelines and goals and providing feedback on progress, remains very important to support stakeholders in implementation processes (Burns and Köster, 2016). A recipe of clear leadership forged in a context of strong relationships with stakeholders, along with a shared and articulated vision for the future of the system, can help organisations become truly committed to consistent improvement and learning (Huffman and Hipp, 2001).
As discussed earlier, there is extensive evidence that consensus between stakeholders is an important factor for successful implementation of policy reforms (Corrales, 1999; Connell and Klem, 2000). Enabling this consensus to extend to a sense of shared values and shared mission can improve educational outcomes. An additional point highlighted in many reviewed policy evaluation reports and supported by evidence (Kania and Kramer, 2011; Penuel et al., 2011) is the need for stakeholders in education reform to also have shared knowledge and understanding of the challenges they are seeking to address and the meaning of the different facets or tools of the reform.
Evaluation reports highlighted a number of instances where definitions and understandings differed across the system. Even well-recognised key terms are not always understood in the same way. The Pupil Premium evaluation in the United Kingdom (England) noted that each school worked according to its own definition of educational disadvantage. In some cases, allowing for autonomy of interpretation can be beneficial in adapting educational offerings to individual students, but competing understandings of a reform’s purpose and intended operation can lead to dilemmas. For example, the evaluation of the introduction of Individual Student Plans in Denmark found that while differentiation of plans for each student was the desirable ideal, differences still arose between the expectations of teachers and parents as to what should be covered in the student plans. Developing modalities for ensuring that policies are well understood and not taking for granted that understanding of phenomena and specific challenges will be the same across the system can help avoid problems in implementation processes.
Countries’ reports on recent policies show many examples of shared engagement by governments across the OECD to build a common, coherent strategy at the different levels of governance in order to optimise implementation processes. Actions focusing on promoting consistency of vision in policy implementation are evident in many countries’ policy reports:
By seeking and adapting to feedback from different levels of the system…
When Germany’s Educational Chains project (2012) was first implemented, with the goal of connecting the various steps in a person’s education career and providing a structured approach to transition, the multiplicity of actors (including the federal government, the Federal Employment Agency and the Länder) and the various types of funding and structures did not initially lead to coherent operation. When this dysfunction in the implementation process was identified, the different stakeholders combined their efforts to develop specific contracts for each Land.
Estonia is implementing a new approach to learning focusing more on 21st century skills and life skills. The main tools are providing training to teachers, applying a variety of assessment methods (including formative assessment), and introducing new concepts of learning to local communities and parents. This requires more complex policy efforts, involving different stakeholders, and new initiatives are added over time as the system learns more from feedback about the actual needs of education institutions.
Ireland has established regional clusters of higher education institutions to set joint priorities, support increased collaboration and co-ordination between higher education institutions and also to support increased engagement with stakeholders. The first priorities set for the clusters are improved academic planning and student pathways. A network of nine regional Skills Fora, which involve further education and training and higher education providers as well as government departments and agencies and employer representatives, has also been established to increase engagement with employers in each region.
By setting common standards and targets…
Japan’s Go Global project (2012), which aimed to raise the competitiveness of selected Japanese universities, set common quantitative and qualitative targets, and programmes were implemented to reach them.
Mexico’s New Education Model (2017) defines the learning objectives and outcomes expected at the end of each education level, as stakeholders demanded a basic quality threshold that education authorities should enforce in all schools and better alignment across education levels in terms of definition of competencies, skills and outcomes.
In Spain, the Organic Law for Improvement of Education Quality (2014) aims to tackle the large differences in students’ outcomes across regions by defining core common basic education throughout the country, while taking into account the special requirements of regional governments.
Key principle 4. Ensure a fair distribution of resources and equal capacity to use them on the ground
As the comments of evaluators for Australia, Turkey and the United Kingdom (Scotland) demonstrate, ensuring a fair distribution of resources and equal capacity to use them on the ground can help to fully engage all targeted students and practitioners.
Australia Review of the National Partnership System for Early Childhood Education |
“While future funding should be governed by who should pay rather than who can pay, the latter cannot be ignored and fiscal capacity will be an important consideration.” (Deloitte Access Economics, 2015) |
Turkey World Bank evaluation of the Secondary Education Project |
“The specific needs for improving learning conditions vary greatly between schools and can best be identified at the local level… After finding in the course of implementation that real priorities varied greatly between schools, more [expenditure] discretion was given to the school in the revised Operational Manual for subsequent projects.” (World Bank, 2012) |
United Kingdom (Scotland) Review of Further Education Governance |
“…some Colleges serve areas of considerably greater social disadvantage than others. While this would probably always be the case no matter how the sector was constructed, the way in which these differences are dealt with by individual Colleges varies greatly.” (Griggs, 2012) |
Education policy makers can often focus more on design aspects and give less consideration to the capacity for implementation at the level of individual institutions (OECD, 2010; Viennet and Pont, 2017). Regional and local education authorities and education institutions can differ in their capacity to implement policy reforms if they do not have access to the same resources or cannot use those resources effectively. In schools, the capacity to implement reforms can depend on the funding base of the school, which can differ due to its location and socio-economic profile (Roscigno, Tomaskovic-Devey and Crowley, 2006; Rubenstein et al., 2007), and on the ability of the school to convert funding into productive resources (Grubb, 2009; OECD, 2017c). Similarly, funding difficulties and uneven capacity across tertiary education institutions in many OECD countries have led to initiatives to raise additional private-sector funding and enhance performance and resource efficiency in individual institutions.
Differences in baseline levels of productive resources across the system can lead to differences in implementation, as evaluation reports often show. For example:
Norway’s evaluation of free kindergarten in targeted areas noted that different resource levels among childcare centres led to different approaches by the centres in childcare provision.
Spain noted in the mid-term evaluation of the Second National Strategic Plan for Children and Adolescents (2013-16) that almost daily monitoring of the implementation of the Plan has been very positive, in terms of effectively mobilising resources.
While financial resources can often be allocated quickly in a crisis situation or when an issue assumes high political importance, ensuring the competencies to capitalise fully on financial resources within individual institutions for the benefit of students is a longer-term objective. Recent OECD research (OECD, 2017b) has strongly highlighted the need to ensure that school funding policies are well designed and combined with effective institutional governance arrangements to maximise equity, quality and efficiency. In a reform context, this entails ensuring that there is fiscal sustainability to support higher-need institutions and sectors, effective budget management skills at the appropriate levels of the system, and a strong regulatory framework to ensure that public funds are well-targeted and achieve maximum impact.
Other possible inequities in resources which act as barriers to policy success may be less obvious. For example, the inequity may be present at the level of individual students rather than at the level of the institution. In Ontario’s Student Success programme, evaluators noted that not all students were able to physically access programmes locally and thus were required to travel, with the cost of transport in some areas acting as a barrier to participating in the programme. The government responded by allocating additional funding for subsidised transport. Therefore, a wide range of potential barriers to policy success often must be considered when implementing reforms, and a proactive approach to mitigate these barriers can help to further policy success.
Looking to the future
Implementing policies and having the knowledge and capacity to improve them are becoming critical in increasingly accountable education systems, where more advanced skill needs and greater social diversity are creating demand for greater policy efficacy.
Two key points for reflection can be drawn from this chapter:
Engage actors, including students, based on a shared understanding and supported by strong capacities and resources: Actors do not necessarily share the same understanding of aspects as basic as the need for a policy, key concepts underpinning it or even the processes or objectives it entails. Policy implementation processes need to aim for transparency (i.e. clarity in language and objectives) and foster a shared ability to relate to the policy. Students and their parents should have a voice in this process, to ensure that the system can address their needs. Developing clear implementation strategies and a transparent vision, and communicating priorities to stakeholders at all levels of governance, are key at every stage of the process. To make this engagement possible, there must also be a fair distribution of resources and equal capacity to use them on the ground.
Ensure that conditions for better evaluation and contextual understanding are present throughout the system: Policy evaluation for the purpose of learning and policy improvement needs to become an inherent part of the education system. This requires developing a mindset of evaluative thinking in everyday processes and a capacity to generate and evaluate evidentiary material from a variety of sources. Building this culture of evidence and creating the conditions to use data strategically can also be supported by incentives to manage information and implement greater accountability and efficiency mechanisms. The evidence base needs to be broad and robust, with care taken before making causal associations and with a variety of tools considered for generating evidence. But as evidence-based policy making becomes more commonplace and policy evaluation culture becomes embedded, it is also important to always make room for innovation to flourish in education systems.
References
Aarons, G.A., M. Hurlburt and S.M. Horwitz (2011), “Advancing a Conceptual Model of Evidence-Based Practice Implementation in Public Service Sectors”, Administration and Policy in Mental Health, Vol. 38/1, pp. 4-23, https://link.springer.com/content/pdf/10.1007%2Fs10488-010-0327-7.pdf.
ACIL Allen Consulting (2014), Evaluation of the Aboriginal and Torres Strait Islander Education Action Plan 2010-2014, www.educationcouncil.edu.au/site/DefaultSite/filesystem/documents/ATSI%20documents/ATSI%202010-2014%20Final%20Evaluation%20Report/0Appendices_ATSIEAP_ACILAllenConsulting.pdf.
Barton, A. C. et al. (2004), “Ecologies of Parental Engagement in Urban Education”, Educational Researcher, Vol. 33/4, pp. 3-12, https://doi.org/10.3102%2F0013189x033004003.
Bryan, J. (2005), “Fostering Educational Resilience and Achievement in Urban Schools through School-Family-Community Partnerships”, Professional School Counseling, Vol. 8/3, pp. 219-227, http://graingered.pbworks.com/f/Resilience-+School+%26+Family+Partnerships.pdf.
Buckley, J. et al. (2015), “Defining and Teaching Evaluative Thinking: Insights from Research on Critical Thinking”, American Journal of Evaluation, Vol. 36/3, pp. 375-388, www.socialresearchmethods.net/research/2015/2015%20-%20Buckley%20et%20al%20-%20Evaluative%20Thinking.pdf.
Burns, T. and F. Köster (eds.) (2016), Governing Education in a Complex World, Educational Research and Innovation, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264255364-en.
Butler, M.J. and P.M. Allen (2008), “Understanding Policy Implementation Processes as Self-Organizing Systems”, Public Management Review, Vol. 10/3, pp. 421-440, www.researchgate.net/publication/40498763_Understanding_Policy_Implementation_Processes_as_Self-Organizing_Systems.
Canadian Council of Learning (2008), Evaluation of the Ontario Ministry of Education’s Student Success / Learning to 18 Strategy, Canadian Council of Learning, www.edu.gov.on.ca/eng/teachers/studentsuccess/CCL_SSE_Report.pdf.
Carpenter, H. et al. (2013), Evaluation of Pupil Premium, Department for Education, London, http://dera.ioe.ac.uk/18010/1/DFE-RR282.pdf.
CLEAR (Centers for Learning on Evaluation and Results) (2013), Embracing Evaluative Thinking for Better Outcomes: Four NGO Case Studies, CLEAR, www.theclearinitiative.org/resources/embracing-evaluative-thinking-for-better-outcomes-four-ngo-case-studies.
Connell, J. and A. Klem (2000), “You Can get There from Here: Using a Theory of Change Approach to Plan Urban Education Reform”, Journal of Educational and Psychological Consultation, Vol. 11/1, pp. 93-120, http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.108.5277&rep=rep1&type=pdf.
Cook, T.D. (2002), “Randomized Experiments in Educational Policy Research: A Critical Examination of the Reasons the Educational Evaluation Community has Offered for not Doing Them”, Educational Evaluation and Policy Analysis, Vol. 24/3, pp. 175-199.
Cook-Sather, A. (2006), “Sound, Presence, and Power: ‘Student Voice’ in Educational Research and Reform”, Curriculum Inquiry, Vol. 36/4, pp. 359-390, www.jstor.org/stable/4124743.
Corrales, J. (1999), “The Politics of Education Reform: Bolstering the Supply and Demand; Overcoming Institutional Blocks”, The Education Reform and Management Series, Vol. 2/1, Education, The World Bank, Washington, DC, http://documents.worldbank.org/curated/en/957741468740407109/pdf/multi0page.pdf.
Cox, D.D. (2005), “Evidence-Based Interventions Using Home-School Collaboration”, School Psychology Quarterly, Vol. 20/4, pp. 473-497, www.researchgate.net/publication/232453141_Evidence-based_interventions_using_home-school_collaboration.
Davies, P. (1999), “What is Evidence-based Education?”, British Journal of Educational Studies, Vol. 47/2, pp. 108-121, http://dx.doi.org/10.1111/1467-8527.00106.
Deloitte Access Economics (2015), Review of the National Partnership Agreement on Universal Access to Early Childhood Education: Contextual comments from the Australian Government, States and Territories, Deloitte Access Economics Australia, http://www.educationcouncil.edu.au/site/DefaultSite/filesystem/documents/Reports%20and%20publications/EC%20Publications/NP_UAECE%20Review-220415.pdf.
Department of Education and Skills (2017), Report on the Review of DEIS, Department of Education and Skills, Dublin, https://www.education.ie/en/Schools-Colleges/Services/DEIS-Delivering-Equality-of-Opportunity-in-Schools-/DEIS-Review-Report.pdf.
Duckworth, A.L., P.D. Quinn and M.E.P. Seligman (2009), “Positive predictors of teacher effectiveness”, The Journal of Positive Psychology, Vol. 4/6, pp. 540-547.
Dumas, M.J. and G. Anderson (2014), “Qualitative Research as Policy Knowledge: Framing Policy Problems and Transforming Education from the Ground Up”, Education Policy Analysis Archives, Vol. 22/11, https://epaa.asu.edu/ojs/article/view/1483/1201.
Earl, L. and H. Timperley (2015), “Evaluative thinking for successful educational innovation”, OECD Education Working Papers, No. 122, OECD Publishing, Paris, http://dx.doi.org/10.1787/5jrxtk1jtdwf-en.
Employment and Social Development Canada (2016), Summative Evaluation of the Budget 2008 Canada Student Loans Program (CSLP) Enhancement, www.canada.ca/content/dam/canada/employment-social-development/migration/documents/assets/portfolio/docs/en/reports/evaluations/2016/summ_eval_budget_2008-en.pdf.
EC (European Commission) (2015), Commission Staff Working Document: Better Regulation Guidelines, EC, Brussels, http://ec.europa.eu/smart-regulation/guidelines/docs/swd_br_guidelines_en.pdf.
EC (2013), Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: Strengthening the foundations of Smart Regulation – improving evaluation, EC, Brussels, http://ec.europa.eu/smart-regulation/docs/com_2013_686_en.pdf.
European Agency for Special Needs and Inclusive Education (2017), Education for all in Iceland: External Audit of the Icelandic System for Inclusive Education, European Agency for Special Needs and Inclusive Education, Odense, www.stjornarradid.is/media/menntamalaraduneyti-media/media/frettatengt2016/Final-report_External-Audit-of-the-Icelandic-System-for-Inclusive-Education.pdf.
Fan, X. and M. Chen (2001), “Parental involvement and students' academic achievement: A meta-analysis”, Educational Psychology Review, Vol. 13/1, pp. 1-22.
Finlay, I. (1998), “Stakeholders, Consensus, Participation and Democracy”, in I. Finlay, S. Niven and S. Young (eds.), Changing Vocational Education and Training: An International Comparative Perspective, Routledge, London.
Fischer, F., G.J. Miller and M.S. Sidney (eds.) (2007), Handbook of Public Policy Analysis: Theory, Methods and Politics, CRC Press, Boca Raton, www.untag-smd.ac.id/files/Perpustakaan_Digital_2/PUBLIC%20POLICY%20(Public%20Administration%20and%20public%20policy%20125)%20Handbook%20of%20Public%20Policy%20Analysis%20Th.pdf.
Golden, G. (forthcoming), “Education Policy Evaluation in OECD countries: Surveying the landscape”, OECD Education Working Papers, OECD Publishing, Paris, www.oecd-ilibrary.org/.
Gradstein, M. and M. Justman (2002), “Education, Social Cohesion, and Economic Growth”, The American Economic Review, Vol. 92/4, pp. 1192-1204, www.researchgate.net/publication/4719166_Education_Social_Cohesion_and_Economic_Growth.
Griggs, R. (2012), Report of the Review of Further Education Governance in Scotland, The Scottish Government, www.gov.scot/resource/0038/00387255.pdf.
Grubb, W.N. (2009), The Money Myth: School Resources, Outcomes, and Equity. Russell Sage Foundation, New York.
Haynes, L., B. Goldacre and D. Torgerson (2012), “Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials”, Cabinet Office, Behavioural Insights Team, United Kingdom, www.gov.uk/government/uploads/system/uploads/attachment_data/file/62529/TLA-1906126.pdf.
Hemerijck, A. (ed.) (2017), The Uses of Social Investment, Oxford University Press, Oxford.
Hooge, E., T. Burns and H. Wilkoszewski (2012), “Looking Beyond the Numbers: Stakeholders and Multiple School Accountability”, OECD Education Working Papers, No. 85, OECD Publishing, Paris, http://dx.doi.org/10.1787/5k91dl7ct6q6-en.
Howlett, M. (2004), “Beyond Good and Evil in Policy Implementation: Instrument Mixes, Implementation Styles, and Second Generation Theories of Policy Instrument Choice”, Policy and Society, Vol. 23/2, pp. 1-17, https://doi.org/10.1016/S1449-4035(04)70030-2.
Huffman, J. (2003), “The Role of Shared Values and Vision in Creating Professional Learning Communities”, NASSP Bulletin, Vol. 87/637, pp. 21-34, www.researchgate.net/publication/249794781_The_Role_of_Shared_Values_and_Vision_in_Creating_Professional_Learning_Communities.
Huffman, J.B. and K.A. Hipp (2001), “Creating Communities of Learners: The Interaction of Shared Leadership, Shared Vision, and Supportive Conditions”, International Journal of Educational Reform, Vol. 10/3, pp. 272-281.
Innes, J. E. and D.E. Booher (2010), Planning with Complexity: An Introduction to Collaborative Rationality for Public Policy, Routledge, New York.
Kania, J. and M. Kramer (2011), “Collective Impact”, Stanford Social Innovation Review, pp. 36-41, https://ssir.org/images/articles/2011_WI_Feature_Kania.pdf.
Manzanares Moya, A. and S. Ulla Díez (2013), La evaluación estatal del Plan de Refuerzo, Orientación y Apoyo (PROA): Análisis tras seis años de evaluación continuada [National Evaluation of the Programme for Reinforcement, Support and Guidance (PROA): Analysis After Six Years of Continued Assessment], Revista de educación nº extraordinario 2012, pp. 89-116, Ministerio de Educación, Cultura y Deporte [Ministry of Education, Culture and Sport], Madrid, www.mecd.gob.es/revista-de-educacion/numeros-revista-educacion/numeros-anteriores/2012/re2012/re2012_04.html.
Marsh, D. and A. McConnell (2010), “Towards a framework for establishing policy success”, Public Administration, Vol. 88/2, pp. 564-583, www.researchgate.net/publication/229613787_Towards_a_Framework_for_Establishing_Policy_Success.
McConnell, A. (2010), “Policy Success, Policy Failure and Grey Areas In-Between”, Journal of Public Policy, Vol. 30/3, pp. 345-362, http://web.pdx.edu/~nwallace/PATF/McConnell.pdf.
Mitra, D. (2007), “Student Voice in School Reform: From Listening to Leadership”, International Handbook of Student Experience in Elementary and Secondary School, pp. 727-744.
Morel, N., B. Palier and J. Palme (eds.) (2012), Towards a Social Investment Welfare State?: Ideas, Policies and Challenges, Policy Press, Bristol.
Morrison, K. (2001), “Randomised Controlled Trials for Evidence-based Education: Some Problems in Judging 'What Works'”, Evaluation and Research in Education, Vol. 15/2, pp. 69-83, www.researchgate.net/publication/237931382_Randomised_Controlled_Trials_for_Evidence-based_Education_Some_Problems_in_Judging_%27What_Works%27.
NIFU (Nordic Institute for Studies in Research, Innovation and Education) (2012), Kunnskapsløftet som styringsreform - et løft eller et løfte? Forvaltningsnivåenes og institusjonenes rolle i implementeringen av reformen [The Knowledge Promotion as a governance reform - a boost or a promise? The role of administrative levels and institutions in implementing the reform], NIFU, Oslo, https://brage.bibsys.no/xmlui/handle/11250/280885.
NIRN (National Implementation Research Network) (n.d.), “Implementation defined”, http://nirn.fpg.unc.edu/learn-implementation/implementation-defined (accessed 2 November 2017).
OECD (2017a), PISA 2015 Results (Volume III): Students' Well-Being, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264273856-en.
OECD (2017b), Fostering Innovation in the Public Sector, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264270879-en.
OECD (2017c), The Funding of School Education: Connecting Resources and Learning, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264276147-en.
OECD (2016), Trends Shaping Education 2016, OECD Publishing, Paris, http://dx.doi.org/10.1787/trends_edu-2016-en.
OECD (2015a), Government at a Glance 2015, OECD Publishing, Paris, http://dx.doi.org/10.1787/gov_glance-2015-en.
OECD (2015b), Education Policy Outlook 2015: Making Reforms Happen, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264225442-en.
OECD (2015c), Improving Schools in Scotland: An OECD Perspective, OECD Publishing, Paris, www.oecd.org/education/school/Improving-Schools-in-Scotland-An-OECD-Perspective.pdf.
OECD (2010), Making Reform Happen: Lessons from OECD Countries, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264086296-en.
OECD (2007), Evidence in Education: Linking Research and Policy, OECD Publishing, Paris, http://dx.doi.org/10.1787/9789264033672-en.
Penuel, W.R. et al. (2011), “Organizing Research and Development at the Intersection of Learning, Implementation, and Design”, Educational Researcher, Vol. 40/7, pp. 331-337.
Roscigno, V.J., D. Tomaskovic-Devey and M. Crowley (2006), “Education and the Inequalities of Place”, Social Forces, Vol. 84/4, pp. 2121-2145, www.researchgate.net/publication/236823381_Education_and_the_Inequalities_of_Place.
Rubenstein, R. et al. (2007), “From districts to schools: The distribution of resources across schools in big city school districts”, Economics of Education Review, Vol. 26/5, pp. 532-545, https://steinhardt.nyu.edu/scmsAdmin/uploads/005/814/From%20Districts%20to%20Schools%20-%20Rubenstein%2C%20Schwartz%2C%20Stiefel%2C%20Bel%20Hadj%20Amor%20%282007%29.pdf.
Simmons, C., A. Graham and N. Thomas (2015), “Imagining an ideal school for wellbeing: Locating student voice”, Journal of Educational Change, Vol. 16/2, pp. 129-144, https://epubs.scu.edu.au/cgi/viewcontent.cgi?referer=https://www.google.ca/&httpsredir=1&article=2133&context=educ_pubs.
Skipp, A. and V. Hopwood (2016), Mapping user experiences of the Education, Health and Care process: a qualitative study, ASK Research, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/518963/Mapping_user_experiences_of_the_education__health_and_care_process_-_a_qualitative_study.pdf.
Slavin, R.E. (2002), “Evidence-Based Education Policies: Transforming Educational Practice and Research”, Educational Researcher, Vol. 31/7, pp. 15-21.
Spillane, J.P., B.J. Reiser and T. Reimer (2002), “Policy Implementation and Cognition: Reframing and Refocusing Implementation Research”, Review of Educational Research, Vol. 72/3, pp. 387-431, www.researchgate.net/publication/258183248_Policy_Implementation_and_Cognition_Reframing_and_Refocusing_Implementation_Research.
Stufflebeam, D.L. (2010), “Meta-evaluation”, Journal of MultiDisciplinary Evaluation, Vol. 7/15, pp. 99-158, http://journals.sfu.ca/jmde/index.php/jmde_1/article/view/300.
Stufflebeam, D.L. (2001), “The meta-evaluation imperative”, American Journal of Evaluation, Vol. 22/2, pp. 183-209, http://journals.sfu.ca/jmde/index.php/jmde_1/article/download/220/215.
Stufflebeam, D.L. and C.L. Coryn (2014), Evaluation Theory, Models, and Applications, 2nd Edition, Wiley, Oxford.
Viennet, R. and B. Pont (2017), “Education policy implementation: A literature review and proposed framework”, OECD Education Working Papers, No. 162, OECD Publishing, Paris, http://dx.doi.org/10.1787/fc467a64-en.
Wiseman, A.W. (2010), “The Uses of Evidence for Educational Policymaking: Global Contexts and International Trends”, Review of Research in Education, Vol. 34/1, pp. 1-24, http://dx.doi.org/10.3102/0091732X09350472.
World Bank (2012), Implementation completion and results report (IBRD-47670), World Bank, Washington, DC, http://documents.worldbank.org/curated/en/874001468110647154/pdf/NonAsciiFileName0.pdf.
Wylie, C. and R. Felgate (2016), “'I enjoy school now': Outcomes from the Check & Connect trials in New Zealand”, New Zealand Council for Educational Research, www.educationcounts.govt.nz/__data/assets/pdf_file/0006/176316/1088-Check-Connect-LF-050716-131216.pdf.