The Irish Government Economic and Evaluation Service
3. Overall effectiveness of IGEES and areas for further investment
Abstract
This chapter discusses the remaining barriers to evidence-informed policy making, including data availability and use, as well as recruitment and retention of staff. It highlights opportunities to strengthen the full range of policy analysis, including options for strengthening capacity for social research and for evidence synthesis. The chapter also offers options for moving from knowledge management to knowledge brokerage, so as to build the use of results into policymaking, as well as for strengthening communications and branding.
Remaining barriers to evidence-informed policy making
While this section highlights a number of remaining challenges, this should not obscure the overall impression that IGEES has made a significant difference and that progress is being achieved in many areas in strengthening capacity for policy making. As a successful small open economy, Ireland does not appear to face some of the structural challenges confronting some of its larger neighbours, namely a lack of trust, doubts about the impartiality of the civil service, and difficulties in preserving the political/administrative interface. A high level of trust, recognised professional skills in the civil service and a functioning political/administrative interface still appear to be structural characteristics of the Irish system. Still, some pressures remain, and the management and rationalisation of choices for major public infrastructure projects still presents significant challenges at times.
Data availability and use
As in many OECD countries, progress towards evidence-informed policy making in Ireland is hampered by the limited availability of adequate data and by departments’ capacity to use it. In some cases, the challenge was to understand what data and data sets currently existed within departments, and how departments could use these data for policy analysis while providing adequate incentives to data producers. Some departments had created a ‘Data Map’, setting out the relevant departmental and agency data to increase awareness of the data held by policy units. Other departments were still in the process of building up their departmental data infrastructure, which needed dedicated analytical resources to be effective. Departments that worked with agencies and arm’s length bodies faced particular challenges in data access, because these departments were typically not the ‘owner’ of much of the data and evidence needed to carry out policy analysis. The sharing of data across departments also appeared to present very significant challenges.
There were also a number of challenges in making full use of the available data. There is some concern about using individual-level data to evaluate policies and programmes, with data protection issues assumed to be an obstacle. Academics spoke of their frustration at the complexity of the processes they were required to go through, even for repeat requests to use the same data sets on multiple occasions. The government’s existing IT systems also appear to face limitations, making it challenging to carry out statistical and other forms of analysis. Given the achievements and progress that the Irish government is making in improving data infrastructures, IGEES should consider embedding its work within the country’s broader data-driven public sector strategies, for instance by collaborating or aligning agendas with digital government teams.
While IGEES was conceived as a scheme to strengthen capacity for rigorous economic evaluation through an investment in skills, it has not been broadly associated with a strategy to strengthen evidence and promote smart data, which would have linked more systematically with the statistical side and fostered systematic access to administrative data. Some other countries, such as the US and Japan, are now implementing more systematic approaches to an evidence-based policy making (EBPM) agenda, with institutional resources, internal champions and efforts to make full use of existing data on a more systematic basis (see Box 3.1 and Box 3.2 on the US and Japanese EBPM initiatives).
Box 3.1. Building the foundations for evidence-based policymaking in the US
To help federal agencies better acquire, access, and use evidence to inform decision-making, the Evidence Act was signed into law on 14 January 2019. The Act mandates evidence-generating activities across agencies, open government data, confidential information protection, as well as skills and capacity building. It also includes provisions for each agency to appoint a Chief Data Officer (CDO) and for establishing a CDO Council (Office of Management and Budget, 2018[1]).
The Evidence Act requires agencies to designate an Evaluation Officer, which strengthens an agency’s capacity to build evidence. These senior evaluation officials are responsible for coordinating the agency’s evaluation activities, its learning agenda and the information on evidence reported to the Office of Management and Budget (OMB). They also have to establish and use multi-year learning agendas, document the resources dedicated to program evaluation, and improve the quality of the information provided to OMB on evidence-building activities (OECD, 2018[2]).
Specifically, the multi-year learning agendas allow agencies to identify and address policy questions, including operational and strategic questions concerning agencies’ human resources, internal processes, etc. The evidence-building plans set out the policy questions that the agency intends to answer, the data needed, and the challenges faced in generating evidence to support policymaking (Office of Management and Budget, 2019[3]).
Additionally, the OMB has a dedicated Evidence Team that works with other OMB offices to set research priorities and ensure the use of appropriate evaluation methodologies in federal evaluations. As of July 2019, the team was also creating and coordinating an interagency council bringing together Evaluation Officers. This council will serve as a forum to exchange information and advise the OMB on issues affecting the evaluation function, such as evaluator competencies, best practices for programme evaluation, and capacity building. The council will also enable coordination and collaboration among evaluators across government and will play a leadership role for the larger federal evaluation community. To ensure evidence is used in policy design, the Evidence Team is also actively involved in offering technical assistance to federal agencies (Clark, 2019[4]).
Box 3.2. Institutionalisation of Policy Evaluation in Japan
To provide the Japanese policy evaluation system with a clear-cut framework and improve its effectiveness, Japan enacted the Government Policy Evaluations Act in 2001. The Act sets out the obligation of each ministry to evaluate adopted policies and specifies detailed steps for evaluation processes conducted by the Ministry of Internal Affairs and Communications (MIC).
Following the Act, the Japanese Cabinet created “Basic Guidelines for Implementing Policy Evaluation”, which offer advice to facilitate planned and steady policy evaluation across government. Each ministry thereby determines a “Basic Plan for Policy Evaluation” covering a three- to five-year period, to incorporate policy evaluation into the policy management cycle. Under the basic plan, each ministry should also determine an “Implementation Plan for Policy Evaluation”, a one-year plan that is revised upon expiry. This plan specifies the policies to be evaluated in the year and how the evaluations should be carried out (Japanese Cabinet, 2017[5]).
Under the Act, the Administrative Evaluation Bureau (AEB) of the MIC formulates guidelines for implementing policy evaluations, aggregates all policy evaluation reports across the government, and conducts reviews to improve the quality of those evaluations (The Ministry of Internal Affairs and Communications, 2012[6]). From FY2012, the AEB introduced a standard format for ex post evaluation reports, which made them more easily understood and shared across ministries. In addition, the AEB set up a portal site for policy evaluation that links to policy evaluation data, including analysis sheets and evaluation reports published by each ministry, to ensure transparency and accountability (The Ministry of Internal Affairs and Communications, 2010[7]).
Moreover, to promote EBPM practices across government, the Statistics Reform Promotion Council was established in the Prime Minister’s office in 2017. This entailed setting up a new Director-General (DG) for Evidence-Based Policymaking in each ministry and ensuring that these EBPM DGs work together across government (The Statistical Reform Promotion Council, 2017[8]).
The Committee on Promoting EBPM, which includes the responsible DGs, developed a “Policy on Recruitment and Capacity Building of Human Resources to Promote EBPM”. The government thereby promotes awareness of EBPM by holding cross-ministerial training and study sessions (The Committee on Promoting EBPM, 2017[9]).
In addition, ministries conduct EBPM trials (in which they identify problems, set objectives, and predict and measure effects) and use the results for changing and making policies. Ministries have to report on their EBPM trials to the Committee. The AEB of the MIC also implements EBPM through joint empirical studies with line ministries (The Cabinet Secretariat, 2019[10]).
Sources: (Japanese Cabinet, 2017[5]), (The Ministry of Internal Affairs and Communications, 2012[6]), (The Ministry of Internal Affairs and Communications, 2010[7]), (The Statistical Reform Promotion Council, 2017[8]), (The Committee on Promoting EBPM, 2017[9]), (The Cabinet Secretariat, 2019[10]) (in Japanese).
All of the abovementioned challenges also point to the need to build solid data governance foundations supporting the use of evidence in decision-making. This would contribute to more coherent implementation and coordination, and strengthen the institutional, regulatory, capacity and technical foundations needed to control and manage the data value cycle, i.e. to collect, generate, store, secure, process, share, and re-use data, as a means to enhance trust and deliver value (OECD, 2019[11]).
In light of the above, data sharing tools, such as data infrastructures or the use of APIs and open data towards greater data sharing, should complement the more tactical and strategic elements of data governance (see Figure 3.1), from data stewardship to public sector capacity and the enabling regulatory frameworks (e.g. for data sharing). It might be useful to consider how to coordinate the need for evidence and economic analysis with the data governance agenda, as is the case for example in the US through the implementation of the Evidence Act (see Box 3.1). It is important to avoid fragmentation and duplication of efforts (e.g. in developing separate data sharing infrastructures), and to promote public sector integration and cohesion.
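As a purely illustrative sketch of the kind of lightweight, API-based data sharing mentioned above, the following Python example exposes a hypothetical departmental ‘Data Map’ through a small read-only web service. All dataset names, fields and endpoints are invented for illustration; a real service would sit on the government’s data infrastructure and enforce the applicable data protection rules.

```python
# Minimal, illustrative read-only API over a departmental "data map".
# All dataset entries and endpoint names are hypothetical.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Toy in-memory catalogue; in practice this would be backed by the
# department's metadata store, not hard-coded.
CATALOGUE = {
    "labour-market-outcomes": {
        "owner": "Hypothetical line department",
        "description": "Administrative data on training and employment",
        "access": "restricted",  # individual-level data: protection rules apply
    },
    "spending-review-outputs": {
        "owner": "Central analytical unit (hypothetical)",
        "description": "Published spending review papers and underlying data",
        "access": "open",
    },
}

@app.route("/datasets")
def list_datasets():
    # Expose metadata only, never the underlying microdata.
    return jsonify(sorted(CATALOGUE))

@app.route("/datasets/<name>")
def dataset_metadata(name):
    entry = CATALOGUE.get(name)
    if entry is None:
        abort(404)  # unknown dataset
    return jsonify(entry)

if __name__ == "__main__":
    app.run(port=5000)
```

Even a metadata-only interface of this kind addresses the discovery problem noted earlier: analysts can find out what data exist and who owns them before negotiating access under the relevant legal framework.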
Recruitment and retention of staff
The robust jobs market in Ireland means that IGEES competes with a diverse range of public and private sector organisations to recruit from a relatively small pool of graduate economists. The variety of job positions offered, the quality of analytical work and the possibility of internal mobility were signalled as strong assets that motivate candidates to apply for IGEES jobs. Departments were unanimously positive about IGEES having broadened recruitment to other graduates with relevant policy analysis skills, such as social scientists, and it appears desirable to continue diversifying the pool of entrants. Many departments still have capacity gaps, with only a small number of analysts to service the policy analysis needs of the entire department. It is also necessary to build a critical mass of analysts in departments for the policy analysis function to truly realise its potential, which is the case in only some of the departments surveyed. While retention of analysts is not generally seen as a major challenge, many analysts are finding career paths elsewhere in the civil service, often in more managerial positions.
While the focus on strengthening analytical capacity at the Administrative Officer and Assistant Principal grades is a core asset of IGEES, an open question remains about whether further developments should extend to the Principal Officer (PO) grade. Currently, there are no formally designated IGEES staff at PO grade, although there are cases where departments had badged roles such as ‘Chief Economist’. This might affect the career prospects of some IGEES recruits, notably those less willing to progress into generalist positions.
However, such issues of leadership in analytical jobs should not only be seen from the perspective of internal IGEES management, but also as opportunities to create “champions” for an evidence and economic evaluation agenda within government, in a way that would be clearly recognisable and identified as such up to the political level. Other countries, such as the UK, Canada, the US and France, have designated positions such as Chief Economist, Chief Statistician, Chief Scientist or Chief Evaluator. This is also a way to signal the importance of such evidence-driven tasks within government, and that the holders of such positions should be seen as a source of authoritative advice by both citizens and politicians.
Overall effectiveness and key areas for further investment
IGEES as a capacity building initiative within the Civil Service
There is no doubt that IGEES has been successful in building capacity for evidence-informed policy making in Ireland. These efforts will need to continue over a number of years. It is also critical that IGEES initiatives connect with government-wide initiatives such as the Public Service Reform Plan (PSRP). A key recommendation of the OECD’s assessment of the PSRP is that meeting its objectives requires a careful look at the skills and capabilities needed to deliver the required changes in Ireland’s civil and public service, and a number of the key questions identified in that report are still pertinent:
What are the current skills and/or capacity gaps that are limiting the successful implementation of reform? What are the priority sectors and levels for addressing these gaps?
Beyond technical and professional skills, how can the Irish public service foster risk taking and learning (through experimentation, evaluation, etc.) to support innovation and build learning organisations?
While the PSRP has developed additional capacity (for example, PMOs and economic analysis), how can it ensure that it leverages organisational change throughout ministries and agencies so that there is broad ownership of the reform process and results?
The OECD’s work on Building Capacity for Evidence Informed Policy Making (OECD, 2018[2]) would provide useful material for Ireland to make a more in-depth assessment of the strengths and weaknesses across the different dimensions of EIPM. It identifies six skills clusters which could be used to pinpoint the skill and capacity gaps that continue to impede EIPM.
Opportunities to strengthen the full range of policy analysis
IGEES has built up strengths in core economic analysis over the medium term, but in the longer term the full range of policy analysis tools needs to be mainstreamed throughout the system. Key areas for focus include developing and mainstreaming skills for impact evaluation, for evidence synthesis and for wider social research.
Strengthening capacity for impact evaluation
The assessment reveals a need to develop capacity for impact evaluation. Sometimes referred to as ‘counterfactual impact evaluation’, such studies typically privilege internal validity, which pertains to inferences about whether the observed correlation between the intervention and outcomes reflects an underlying causal relationship. Determining the efficacy of an intervention is a complex process, involving considerations of evaluation design, sample, measurement, methods of analysis, and findings. A range of different evaluation designs can establish the efficacy of a policy or programme, including both experimental and quasi-experimental approaches (see Box 3.3).
Box 3.3. Standards for establishing the efficacy of an intervention
The What Works Centre for Local Economic Growth (WWG) is an independent UK organisation that focuses mainly on producing systematic reviews of the evidence on a broad range of policies in the area of local economic growth.
The WWG assessment is based on the Maryland Scientific Methods Scale (SMS), which ranks policy evaluations from 1 (least robust: studies based on simple cross-sectional correlations) to 5 (most robust: randomised control trials). The ranking reflects the extent to which the methods deal with the selection biases inherent in policy evaluations (robustness) and the quality of their implementation, as follows:
Level 2: Use of adequate control variables and either (a) a cross-sectional comparison of treated groups with untreated groups, or (b) a before-and-after comparison of the treated group, without an untreated comparison group.
Level 3: Comparison of outcomes in the treated group after an intervention with outcomes in the treated group before the intervention, with a comparison group used to provide a counterfactual (e.g. difference-in-differences; see the illustrative sketch after this box). Techniques such as regression and propensity score matching may be used to adjust for differences between treated and untreated groups, but there are likely to be important unobserved differences remaining.
Level 4: Quasi-randomness in treatment is exploited, so that it can be credibly held that treatment and control groups differ only in their exposure to the random allocation of treatment. This often entails the use of an instrument or discontinuity in treatment, the suitability of which should be adequately demonstrated and defended.
Level 5: Reserved for research designs with randomised control trials (RCTs) providing the definitive example. Extensive evidence is provided on the comparability of treatment and control groups, showing no significant differences in terms of levels or trends. Additionally, attention is paid to problems of selective attrition, and there should be limited or, ideally, no ‘contamination’ of the control group with the treatment.
Source: What Works Centre for Local Economic Growth.
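To make the Level 3 design concrete, the following minimal Python sketch estimates a difference-in-differences effect on simulated data. The data, the true effect size of 1.5 and all variable names are invented for illustration; this is a sketch of the technique, not of any evaluation cited in this chapter.

```python
# Illustrative difference-in-differences (SMS Level 3) on simulated data.
# All numbers and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000

df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),  # 1 = exposed to the intervention
    "post": rng.integers(0, 2, n),     # 1 = period after the intervention
})
# Simulated outcome: separate group and time effects, plus a true
# treatment effect of 1.5 for the treated group after the intervention.
df["outcome"] = (
    0.5 * df["treated"]
    + 0.3 * df["post"]
    + 1.5 * df["treated"] * df["post"]
    + rng.normal(0, 1, n)
)

# "treated * post" expands to treated + post + treated:post; the
# coefficient on the interaction term is the diff-in-diff estimate.
model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.params["treated:post"])  # should be close to 1.5
```

Under the parallel-trends assumption, the interaction coefficient recovers the average effect of the intervention on the treated group; the caveat stated at Level 3 is precisely that unobserved differences between groups may violate this assumption.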
Although there were some examples of high quality impact evaluation from a number of Departments, overall impact evaluation is still a rarity in the Irish government. Strengthening Ireland’s data infrastructure will increase the opportunities to carry out impact evaluations using quasi-experimental methods based on existing administrative data (see above). Although experimental approaches, such as randomised control trials, can sometimes take advantage of existing administrative data, it is often necessary to collect new data using social research methods. Developing capacity for social research is another area where IGEES can focus over the longer term, including through leveraging its research funds (see next section).
A focus on developing theories of change and logic models can improve the quality of both policy design and impact evaluation. A theory of change can be defined as a set of interrelated assumptions explaining how and why an intervention is likely to produce outcomes in the target population (European Monitoring Centre for Drugs and Drug Addiction, 2011[12]). Engaging in the process of developing a theory of change leads to better policy planning, implementation, and monitoring, because the policy or programme activities are linked to a detailed and plausible understanding of how change actually happens. A logic model sets out the conceptual connections between concepts in the theory of change, showing what intervention, at what intensity, delivered to whom and at what intervals would likely produce specified short-term, intermediate and long-term outcomes (Axford et al., 2005[13]; Epstein and Klerman, 2012[14]). A logic model is a critical tool for detailed, coherent and realistic policy planning (a purely illustrative sketch of a logic model as a structured record follows Box 3.4). A full list of the benefits of developing both a theory of change and a logic model is reproduced in Box 3.4.
Box 3.4. The benefits of developing an intervention theory of change and logic model
1. The evaluability of the programme, for both implementation and outcomes, is facilitated by signposting appropriate metrics.
2. The original intentions of the programme developers are clearly set out, and are explicit and open to critique.
3. The underlying logic of the assumptions made in the theory, for example, that undertaking a certain activity will lead to a particular outcome, can be scrutinised.
4. The realism of the assumptions made by the programme developers can be checked against wider evidence of ‘what works’, to assess the likelihood of the programme being successful.
5. Commissioners can check the programme meets their needs; and providers and practitioners delivering the programme can check their own assumptions and the alignment of their expectations against the original intentions of the programme developers.
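As a purely illustrative sketch, a logic model can be captured as a simple structured record that links an intervention and its intensity to intended outcomes and explicit assumptions, which in turn signposts metrics for evaluation (benefit 1 above). The fields and the example programme below are hypothetical.

```python
# Toy representation of a logic model; all content is hypothetical.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    intervention: str
    target_population: str
    intensity: str                      # dose and frequency of delivery
    short_term_outcomes: list = field(default_factory=list)
    intermediate_outcomes: list = field(default_factory=list)
    long_term_outcomes: list = field(default_factory=list)
    assumptions: list = field(default_factory=list)  # explicit, open to critique

example = LogicModel(
    intervention="Jobseeker training programme (hypothetical)",
    target_population="Long-term unemployed adults",
    intensity="Two sessions per week for twelve weeks",
    short_term_outcomes=["Improved job-search skills"],
    intermediate_outcomes=["Higher interview and placement rates"],
    long_term_outcomes=["Sustained employment at twelve months"],
    assumptions=[
        "Participants attend regularly",
        "Skills gaps are the binding constraint on employment",
    ],
)
print(example.intervention, "->", example.long_term_outcomes)
```

Writing the model down in this explicit form is what makes the assumptions scrutinisable and the programme evaluable, as the box above describes.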
In addition, scope might exist to expand the role of IGEES in developing professional approaches to equality-related analysis. For instance, operational tools for equality budgeting could be developed, expanding beyond the performance-budgeting foundation in which Ireland already has significant strengths. The forthcoming OECD Equality Budget Scan of Ireland suggests that ex ante assessments of policy areas such as poverty and its intersection with various equality dimensions should be complemented by ex post equality impact assessment, to track whether policies are meeting equality objectives and whether they have equality-related impacts. In particular, equality-related analysis could be integrated in a systematic and structured manner into the Spending Review process.
This may require developing tools additional to those already available, such as the Public Spending Code and other frameworks, as well as actively promoting some of the existing cases of impact evaluation in order to stimulate change and provide incentives for greater adoption of such approaches. The Public Spending Code is a set of procedures and rules for ensuring that value for money is obtained in public spending, and a key portion of it is devoted to developing theories of change and logic models (Department of Public Expenditure and Reform, 2019[15]). To assess expenditure programmes, the two main methodologies used across the Irish civil service are Value for Money Reviews (VFMRs) and Focused Policy Assessments (FPAs). The former consists in the evaluation of major spending programmes or specific policies, while the latter is an evaluation methodology specifically related to policy configuration and delivery (IGEES, 2018[16]). Still within the scope of government spending and impact evaluation, the Department of the Taoiseach provided guidelines on conducting regulatory impact analysis (Department of the Taoiseach, 2009[17]), which now sit with DPER as part of the Public Spending Code. The Department of Finance provides guidelines specifically targeted at the evaluation of tax expenditure, setting out best practices for ex ante and ex post evaluation (Department of Finance, 2014[18]).
As such, the existing tools focus largely on value for money and optimising government spending. Although of incontestable value, this approach lacks a broader perspective on well-being and inclusive growth, which could be supported by the development of appropriate analytical tools. Such instruments have been developed in countries such as Scotland and New Zealand, whose policy frameworks address well-being and inequalities alongside economic development (see Box 3.6 on approaches to well-being for policy making in Scotland).
Strengthening capacity for social research
IGEES was already making progress in augmenting capacity for social research. As with impact evaluation, whilst there were many high quality examples of social research across the Irish government, overall the government has not institutionalised social research methods well. This reflects the origins of IGEES, which developed in the aftermath of the global financial crisis to strengthen capacity for sound expenditure management. The comparison with a country with a Government Social Research (GSR) profession, such as the UK, is informative (see Box 3.5).
Box 3.5. The UK Government Social Research Profession
The GSR profession is one of the Civil Service Professions that works alongside other analysts (economists, statisticians and operational researchers). GSR professionals use the core methods of social scientific enquiry, such as surveys, controlled trials, qualitative research, case studies and analysis of administrative and statistical data in order to explain and predict social and economic phenomena for policymaking.
Members of the GSR profession come from a wide variety of social science backgrounds, including candidates with degrees in psychology, geography, sociology and criminology. The GSR profession has its own competency framework, running from entry-level graduates on the Fast Stream to members of the Senior Civil Service, and most UK government departments have a Chief Social Researcher who leads and supports the activity of social researchers within the department.
Source: UK Government, “Government Social Research Profession”. Accessed 2 September 2019. https://www.gov.uk/government/organisations/civil-service-government-social-research-profession/about.
Such broadening of the capacity of social research might also be facilitated by some slight adjustment of focus, for example with increased attention to well-being outcomes, in line with international trends (see Box 3.6 and example of Scotland).
Box 3.6. Approaches to Well-Being for policymaking in Scotland
Scotland’s National Performance Framework (NPF) represents an instructive experience in promoting government accountability for inclusive growth from a well-being perspective. The Framework involves a co-ordination mechanism that ensures alignment of strategies and programmes across sectors, in support of broader national outcomes. It sets out a wide range of indicators (81 as of September 2019) against which the progress of the Scottish government is measured and reported on a publicly accessible website. The indicators measure national and societal well-being, incorporating economic, social and environmental targets that are updated as data become available (Scottish Government, 2019[19]).
Specifically, these targets relate to business, employment, education and skills, child well-being, health, inequalities, social exclusion, safety, sustainable consumption, etc., going beyond traditional measures of GDP. The indicators were built on public consultations, through extensive surveys and workshops across the country involving diverse social groups. Moreover, Scottish Ministers have a duty to consult on, develop and publish a new set of National Outcomes for Scotland, and to review them at least every five years (Acquah, Lisek and Jacobzone, 2019[20]).
Such an outcomes-based framework helps translate inclusive growth and well-being goals into reality, by selecting specific policy interventions based on evidence and aligning high-level goals with budgetary allocations and other policy interventions (OECD, 2016[21]). Some Departments within the Scottish Government, such as the Department of Justice, have shown great success in linking the targets and outcomes of the NPF to their strategic actions and decision-making. The full achievement of well-being outcomes requires overcoming the remaining challenge of incorporating the NPF into the government’s actions and spending programmes.
Strengthening capacity for evidence synthesis
Evidence synthesis is another area that would benefit from development in Ireland, particularly through increased knowledge management at the level of the various analytical units. Evidence syntheses, through secondary processing of existing studies, provide a vital tool for policy makers and practitioners to find what works, how it works, and what might do harm, often at a cost lower than that of conducting one new evaluation. Evidence syntheses are also critical in establishing what is not known from previous research. As the number of studies increases, it becomes more difficult for policy makers and practitioners to keep abreast of the literature. Furthermore, policies should ideally be based on an assessment of the body of evidence, not on single studies, which may not provide a full picture of the effectiveness of a policy or programme. This need to draw on bodies of evidence has led to an increase in the use of evidence synthesis.
Whilst a small number of Departments were already engaged in evidence synthesis, overall knowledge of the different forms of evidence synthesis was at an early stage in Ireland, and most evidence synthesis was restricted to literature reviews. Although useful for providing information on a topic in a short period, literature reviews have a number of serious weaknesses: they are prone to selection and publication bias, and because they are often unclear about their methodology, assessing the strength of their conclusions is challenging. To overcome these and other challenges, more formal and transparent methodologies for evidence synthesis have been developed, including quick scoping reviews, rapid evidence assessments and systematic reviews (see Box 3.7).
Box 3.7. Different methodologies for reviewing the evidence base
Effective policymaking requires using the best available evidence, which itself requires reviewing and choosing from the evidence already available on the policy question. Different reviewing methods enable the results of this large evidence base to be managed and interpreted:
Quick Scoping Review: this non-systematic method can take from one week to two months. It consists of a quick overview of the available research on a specific topic to determine the range of existing studies. It allows the literature on a delimited question to be mapped using only easily accessible, key electronic resources, following bibliographical references up to two levels.
Rapid Evidence Assessment (REA): this systematic and more time-consuming method (two to six months) consists of a rapid overview of the existing research on a specific policy issue and a synthesis of the evidence that research provides. It aims to search for and critically appraise this evidence rigorously and explicitly. To save time, it may limit certain aspects of the systematic review process, for instance by narrowing the REA question or the type and breadth of data considered. Shortening the traditional systematic review process provides a rapid synthesis of the existing relevant evidence, but carries the risk of introducing bias.
Systematic Review: this is the most robust method for reviewing, synthesising and mapping existing evidence on a particular policy topic. It is more resource-intensive, typically taking at least 8 to 12 months and requiring a team of researchers. It has explicit objectives and a thorough search strategy that considers a broad range of data. Studies are chosen and screened according to explicit and uniform criteria, and reasons for excluding studies have to be stated (see the illustrative screening sketch after this box). This transparent and comprehensive method minimises bias in the search, selection and synthesis of existing research. Moreover, it allows the creation of a cumulative and sound evidence base on a specific policy subject. Lastly, systematic reviews can address quantitative studies as well as other types of questions.
Source: The UK Civil Service, What is a Rapid Evidence Assessment? https://webarchive.nationalarchives.gov.uk/20140402163359/http://www.civilservice.gov.uk/networks/gsr/resources-and-guidance/rapid-evidence-assessment/what-is (Accessed 12 August 2019).
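As a purely illustrative sketch of the explicit screening step that distinguishes systematic methods from ad hoc literature reviews, the following Python example filters candidate studies against stated inclusion criteria and records a reason for every exclusion. The criteria, thresholds and study records are all invented.

```python
# Toy illustration of transparent study screening for an evidence review.
# Criteria and study records are hypothetical.
from dataclasses import dataclass

@dataclass
class Study:
    title: str
    year: int
    design: str          # e.g. "RCT", "difference-in-differences", "case study"
    peer_reviewed: bool

# Explicit, uniform inclusion criteria, stated up front.
MIN_YEAR = 2000
ACCEPTED_DESIGNS = {"RCT", "difference-in-differences"}

def screen(studies):
    """Split studies into included and excluded, with stated reasons."""
    included, excluded = [], []
    for s in studies:
        if s.year < MIN_YEAR:
            excluded.append((s, "published before cut-off year"))
        elif s.design not in ACCEPTED_DESIGNS:
            excluded.append((s, "design below required robustness level"))
        elif not s.peer_reviewed:
            excluded.append((s, "not peer reviewed"))
        else:
            included.append(s)
    return included, excluded

candidates = [
    Study("Training and earnings", 2015, "RCT", True),
    Study("A narrative policy history", 1998, "case study", True),
]
included, excluded = screen(candidates)
for s, reason in excluded:
    print(f"Excluded: {s.title} ({reason})")  # every exclusion is documented
```

The point of the sketch is not the criteria themselves, which a real review would justify substantively, but that they are applied uniformly and that every exclusion is documented; this is what makes the method transparent and reproducible.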
Developing the capacity to carry out full systematic reviews, which can take up to two years, is beyond the scope of all but the most analytically well-resourced Departments. Fortunately, there are a number of high-quality databases of existing systematic reviews, such as Cochrane, Campbell and the UK What Works Centres. These could become resources that Departments routinely consult during the policy design phase of any new proposal. There is scope for integrating such approaches more broadly into the Irish system, in a way that could be facilitated by IGEES tools and resources, including dissemination seminars.
Building the use of results into decision making
The OECD’s review of Ireland’s Public Service Reform Plan concluded that, whilst building evaluation and experimentation processes into policymaking is an important first step, a further challenge is to ensure that the results are used in decision-making. This is a common challenge for governments: despite an increase in policy analysis and the potential for policies to be based on evidence, in reality an effective connection between many types of evidence and policymaking can be elusive.
Moving from knowledge management to knowledge brokerage
In terms of knowledge management, most Departments attempted to publish the majority of their reports on their websites. Ireland could consider moving beyond these strategies, which are aimed at pushing research at policy makers, and instead develop strategies to build the demand for evidence. Some departments had already developed strategies to encourage interaction between analysts and policy makers: strategic policy discussions, policy forums and presentations are some examples of activities Departments had engaged in to stimulate demand for evidence. Across OECD countries, governments have also invested in structured, long-term approaches to building demand. In Australia, the Policy Liaison Initiative was an attempt to improve the use of evidence synthesis. It involved creating an ‘Evidence-Based Policy Network’ to facilitate knowledge sharing between policy makers and researchers, alongside seminars by national and international researchers in the field of evidence synthesis and implementation (see Box 3.8). The current framework of IGEES-related events provides an opportunity to develop such approaches.
Box 3.8. The Policy Liaison Initiative for improving the use of Cochrane systematic reviews
The Policy Liaison Initiative (PLI) is a long-term knowledge translation initiative designed to support the use of Cochrane systematic reviews in health policy. A joint initiative between the Australasian Cochrane Centre and the Australian Government Department of Health and Ageing, the PLI includes three core elements.
1. A community of practice for evidence-informed policy. This comprises an Evidence-Based Policy Network to facilitate knowledge sharing between policy makers and the Cochrane Collaboration. Members of the network receive bulletins alerting them to new and updated reviews. Seminars by national and international researchers in the field of evidence synthesis and implementation were also provided.
2. Skills building workshops. These covered a range of topics including types of evidence, research study design and matching, searching for empirical and review evidence, critical appraisal and applying evidence to the local context. The training material and resources from the workshops were made available on the website.
3. A website and summaries of policy relevant reviews. A web portal for indexing and accessing policy relevant Cochrane reviews and summaries was created. A tailored summary format was also created to present the findings of reviews.
Source: Adapted from (Brennan et al., 2016[22]).
Other interventions adopt more structured approaches to bring policy makers into contact with individual scientists, through collaboration in the development of research projects as well as ad hoc or formalised systems of parliamentary advice in which researchers are called on to provide advice. In 2015, the UK Cabinet Office set up the ‘Cross-Government Trial Advice Panel’ in partnership with the Economic and Social Research Council. The Trial Advice Panel brings together a team of experts from academia and the civil service to support the use of experiments in public policy (What Works Network, 2018[23]). It also offers a means of combining expertise, allowing departments with limited expertise in evaluation to work with departments that have more, as well as with top academic experts. In so doing, the Trial Advice Panel aims to reduce the barriers that departments face in commissioning and conducting evaluations and in using the resulting evidence to improve public policies.
Communications and branding
IGEES has certainly made significant progress in establishing a brand for quality economic evaluation within the Irish civil service. Existing stocktakes of papers highlight the quantity of work being conducted. However, efforts in this area have to strike a careful balance between ensuring visibility for IGEES and making sure that the work is integrated and owned by departments. At present, the production of such documents seems to be driven more by the spending review process and the need for accountability for the resources that have been invested in IGEES.1
Issues of branding may also require establishing a logo, as well as possibly some form of social media activity, to socialise the results with the broader public. This may require help from specialised staff, such as communication specialists, data journalists and community managers, to rewrite material for a general audience. It might also need to be coordinated with government departments’ communications units or even the Department of the Taoiseach; on that matter, developments to the IGEES website are envisaged, and a communication strategy is led by the Department of the Taoiseach. Another audience could be the academic community, for example through reaching out to national, European or North American research networks (e.g. the European Economic Association, the National Bureau of Economic Research).
Of course, establishing a communications strategy may require identifying the areas where an increased focus on communication and outreach would be desirable, and achieving a shared understanding across the IGEES policy community within government so that such moves are well understood.
References
[20] Acquah, D., K. Lisek and S. Jacobzone (2019), “The Role of Evidence Informed Policy Making in Delivering on Performance: Social Investment in New Zealand”, OECD Journal on Budgeting, Vol. 19/1, https://dx.doi.org/10.1787/74fa8447-en.
[13] Axford, N. et al. (2005), “Evaluating Children’s Services: Recent Conceptual and Methodological Developments”, British Journal of Social Work, Vol. 35/1, pp. 73-88, http://dx.doi.org/10.1093/bjsw/bch163.
[22] Brennan, S. et al. (2016), “Design and formative evaluation of the Policy Liaison Initiative: a long-term knowledge translation strategy to encourage and support the use of Cochrane systematic reviews for informing health policy”, Evidence & Policy: A Journal of Research, Debate and Practice, Vol. 12/1, pp. 25-52, http://dx.doi.org/10.1332/174426415X14291899424526.
[4] Clark, C. (2019), OMB Moving Ahead to Steer Agencies on Evidence-Based Policymaking - Government Executive, https://www.govexec.com/management/2019/07/omb-moving-ahead-steer-agencies-evidence-based-policymaking/158381/ (accessed on 16 September 2019).
[18] Department of Finance (2014), “Incorporating Department of Finance Guidelines for Tax expenditure evaluation”, http://www.budget.gov.ie/budgets/2015/documents/tax_expenditures_oct14.pdf (accessed on 11 September 2019).
[15] Department of Public Expenditure and Reform (2019), The Public Spending Code, https://publicspendingcode.per.gov.ie/ (accessed on 11 September 2019).
[17] Department of the Taoiseach (2009), “RIA Guidelines”, https://govacc.per.gov.ie/wp-content/uploads/Revised_RIA_Guidelines_June_2009.pdf (accessed on 11 September 2019).
[14] Epstein, D. and J. Klerman (2012), “When is a Program Ready for Rigorous Impact Evaluation? The Role of a Falsifiable Logic Model”, Evaluation Review, Vol. 36/5, pp. 375-401, http://dx.doi.org/10.1177/0193841X12474275.
[12] European Monitoring Centre for Drugs and Drug Addiction (2011), “European drug prevention quality standards”, http://dx.doi.org/10.2810/48879.
[16] IGEES (2018), “Value for Money Review (VFMR) and Focused Policy Assessments”, https://publicspendingcode.per.gov.ie/wp-content/uploads/2018/06/VFMR-and-FPA-Guidelines-Jan2018.pdf (accessed on 11 September 2019).
[5] Japanese Cabinet (2017), “Basic Guidelines for Implementing Policy Evaluation (Revised)”, http://www.soumu.go.jp/main_content/000556221.pdf (accessed on 16 September 2019).
[11] OECD (2019), The Path to Becoming a Data-Driven Public Sector, OECD Publishing, https://doi.org/10.1787/059814a7-en.
[2] OECD (2018), Building Capacity for Evidence Informed Policy Making: Towards a Baseline Skill Set, http://www.oecd.org/gov/building-capacity-for-evidence-informed-policymaking.pdf (accessed on 3 September 2019).
[21] OECD (2016), “The Governance of Inclusive Growth”, https://www.oecd-ilibrary.org/docserver/9789264257993-en.pdf (accessed on 16 September 2019).
[3] Office of Management and Budget (2019), FY2020 President’s Budget (Evidence Chapter), https://www.whitehouse.gov/wp-content/uploads/2018/06/Gov-.
[1] Office of Management and Budget (2018), Executive Office of the President Office of Management and Budget Memorandum for Heads of Executive Departments and Agencies, https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/memoranda/20.
[19] Scottish Government (2019), Scotland Performs, National Performance Framework, National Outcomes, https://nationalperformance.gov.scot/ (accessed on 16 September 2019).
[10] The Cabinet Secretariat (2019), The status-quo about the promotion of statistics reform, http://www.kantei.go.jp/jp/singi/toukeikaikaku/dai5/siryou1.pdf (accessed on 2 September 2019).
[9] The Committee on Promoting EBPM (2017), Guidelines on securing and developing human resources for the promotion of EBPM, https://www.gyoukaku.go.jp/ebpm/img/guideline1.pdf (accessed on 2 September 2019).
[6] The Ministry of Internal Affairs and Communications (2012), “Policy Evaluation Implementation Guidelines”, http://www.soumu.go.jp/main_content/000354069.pdf (accessed on 16 September 2019).
[7] The Ministry of Internal Affairs and Communications (2010), “Guidelines for Publication of Information on Policy Evaluation”, http://www.soumu.go.jp/main_content/000067741.pdf (accessed on 16 September 2019).
[8] The Statistical Reform Promotion Council (2017), The final report of the Statistical Reform Promotion Council. (In Japanese), http://www.kantei.go.jp/jp/singi/toukeikaikaku/pdf/saishu_honbun.pdf (accessed on 2 September 2019).
[23] What Works Network (2018), The Rise of Experimental Government: Cross-Government Trial Advice Panel Update Report, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/753468/RiseExperimentalGovernment_Cross-GovTrialAdvicePanelUpdateReport.pdf (accessed on 24 January 2019).
Note
1. See Hayes and Behan (2017) for a selection of IGEES output.