Assessing Canada’s System of Impact Evaluation of Active Labour Market Policies

5. Communication and evidence‑based policy making

Abstract

Employment and Social Development Canada (ESDC) has committed to an open and transparent appraisal of its policies. It routinely publishes its impact assessments on Government of Canada websites and makes efforts to present technical and non-technical summaries of its impact assessments. Evaluation is conducted separately for the Provinces and Territories (PTs), enabling evaluation results to be shared that speak to the local effects of active labour market policy. Working relationships with the PTs, developed over years of collaboration, facilitate the smooth transfer of knowledge between the federal and provincial governments. Analytical teams make efforts to share research with external technical working groups. Minor changes to how ESDC presents information to the public, in order to better frame the key messages emerging from the analysis, could help to communicate the analysis even more effectively and to a broader audience.

5.1. Introduction
Clarity of communication is essential so that audiences can understand the analysis and clearly see how it bears on the key questions at hand. Without a clear and articulate narrative on what the evidence shows, it is hard to effect change. Effective communication and transparency are crucial throughout the whole analytical lifecycle (OECD, 2021[1]). Communication needs to be pitched at the appropriate audience to deliver the key messages effectively. For the public, this may mean simple language and presentation, whereas an academic seminar audience may require a careful and detailed examination of the specifics. However it is delivered, this communication is essential so that policy development and implementation make effective use of what is known about a policy’s likely effects. This chapter briefly reviews how Employment and Social Development Canada (ESDC) disseminates and communicates the analysis it conducts.
5.2. Analytical dissemination and communication
The combination of a federal evaluation framework and work within ESDC to foster transparency means that evaluation work programmes are well defined, clear and accountable. The federal Policy on Results, introduced in 2016, sets out a clear set of instructions to which departments must adhere in order to evaluate programmes continuously. This includes obligations on evaluation frameworks, reports on programme impacts, and monitoring and reporting requirements. There is also a high-level commitment within ESDC senior management to open and transparent policy evaluation. Following the Government of Canada’s commitment to join the Open Government Partnership in 2011, officials in ESDC’s Evaluation Directorate were keen for ALMPs to be an exemplar in this respect. A paper was published setting out the intended approach to evaluating ALMPs, the reasons this approach was chosen and the lessons learned from previous work (Gingras et al., 2017[2]). Having a public commitment of this kind is important in establishing trust with citizens that policies will be appraised fairly and efficiently (Grimmelikhuijsen, 2012[3]; Güemes, 2019[4]). A culture of transparent policy evaluation is vital to build and embed meaningful and effective communication of policy analysis within an organisation (Cairney and Kwiatkowski, 2017[5]; HM Treasury, 2020[6]).
The focus on transparent and open communication from leadership is also visible in the work that evaluation staff deliver and how they communicate externally. The evaluation team in ESDC has showcased its work at national and international conferences and meetings over the years. These presentations have covered the entirety of the work, from data collation to impact analysis, and look ahead to the next stage of the team’s analytical development, machine learning. The presentations continue ESDC’s commitment to open government and allow the team to share the knowledge it has gained while gathering feedback and insights that may help its own analytical development.
Results from the evaluations are shared with the public online, via the Government of Canada website. A publication portal documents research reports and allows users to access the analysis. The results are often published in two separate formats: a more comprehensive research report, which contains the details of the underlying analysis, the methodology and the results; and a shorter, press-release-style evaluation summary, which is mainly displayed on the website but is available as a document as well. The summary appeals to a fairly widespread audience and is written in a non-technical manner, such that external parties can easily assimilate the information.
Greater efforts to proactively communicate information may help to spread it further, both to the media and directly to the public. There do not appear to be widespread press releases to accompany, advertise and share evaluation results as they are published, either through the news service of the Government of Canada or on social media. ESDC press and media teams should consider how best to communicate results more widely. MDRC in the United States provides a good example of an institution that uses newsletters and its social media accounts to proactively share information on its social policy research (MDRC, 2021[7]). Doing so has allowed it to expand the reach of its work. ESDC could make better use of the news service on the Government of Canada website and reach out to its 120 000 Twitter followers with summaries of its key evaluation results.
The current location of the evaluation reports means they are more likely to be viewed by a member of the press or another external researcher who is specifically researching the topic. This is not a problem per se, but it does suggest that further thought could be given to how the general public could access information on how effective the Labour Market Development Agreements (LMDAs) are, with a view to encouraging participation (especially among former employment insurance claimants who might not have any other interaction with counselling services). A potentially helpful place for further information on the positive effects of the LMDAs might be the “Employment Benefits” section of the Government of Canada website. This is much more likely to be the place an unemployed jobseeker goes to for information, and so could present an opportunity for information to be shared passively, without the individual actively searching for it.
The evaluation summaries (for an example see ESDC (2017[8])) may also benefit from some re‑organisation to make the key messages even clearer. Like the main evaluation reports, separate summaries are available for Canada as a whole and for each of the Provinces and Territories (PTs). The key results are found halfway down the page and are rather passive, talking in general terms rather than about the impact on individuals. Terminology should be reviewed to reduce jargon. Further efforts could also be made to improve the quality of the visual information provided on the website, particularly the infographic displayed. It contains information that is largely extraneous to the casual observer, such as a basic description of how the estimates are made and the spending and volume of the different programmes. The infographic tries simultaneously to describe the programmes, how the funding delivers interventions, how the evaluation was made, and the results of that evaluation. A clearer exposition of what the communication is for and whom it is aimed at would allow a better focus on establishing the key messages. Efforts to communicate the range of possible evaluation results to more technical audiences, building on the sensitivity analysis conducted, would support a deeper discussion of the results (Manski, 2011[9]).
The Danish Agency for Labour Market and Recruitment (STAR) offers a good example of how to package often complex and detailed analysis into content that can be easily digested by audiences with differing technical abilities. Its Jobeffekter website, https://www.jobeffekter.dk/, is a knowledge bank created jointly with independent researchers. It categorises research into easily definable groups (“Unemployed”, “Vulnerable social benefits recipients”, “People on sick leave”). Once a group has been selected, specific interventions can be chosen (for example, “Interviews” or “Training and Education”). It then produces summary information on the number of study results in the area, assesses the strength of evidence and summarises the job effect (for example, “Positive”, “Contradictory evidence” or “Few studies”) across a range of outcome variables. This allows the reader to see at a glance the strength of evidence in an accessible and non-technical manner. Further navigation within the specific results allows the more inquisitive or expert viewer to scrutinise the studies contained within this aggregate assessment and to see how each contributes to the overall score. The presentation of information and its formatting offers some insights on how to organise information in a systematic and accessible way. For ESDC, this means clearly separating out communications for different audiences, packaging information for jobseekers in a clear and compelling manner and making information accessible. Working with the PTs, ESDC could jointly discuss the most appropriate way to advertise this information across Canada, via ESDC and individual PT communication channels.
5.2.1. Communication of analytical results to PTs
The channel of communication between ESDC and the PTs is vital to the functioning of ALMPs. This communication reportedly functions very well between the evaluation directorate and the officials within the PTs responsible for ALMPs. This is a result of the structures that have been put in place and the years of relationship building as the evaluation of the LMDAs has been jointly delivered. The first stage is the ongoing communication with PTs to plan and organise the analysis. The evaluation steering group that organises this work is seen as an inclusive and nimble working-level body that is successful at facilitating this communication. The second iteration of LMDA evaluation has also allowed ESDC to become more responsive to policy needs at the regional level, via improvements in the speed of delivering analysis. There is a collegiate relationship between ESDC and the PTs, and both state that they work well together towards common goals.
Officials with responsibility for ALMPs in the PTs are cognisant of the evidence that has been built for their area and are actively involved in the planning and delivery of the LMDA evaluation. The joint responsibility for conducting the evaluations, alongside the process for planning and agreeing the objectives for each cycle of analysis, contributes to a shared sense of purpose and ownership of the work. Whilst the quantitative analytical work is conducted centrally by ESDC, the fact that each PT gets a separate, personalised report for its area means that each is able to take and communicate evidence that is unique to its locality and that assesses the issues affecting its population. Those PTs that lack the analytical capacity to conduct rigorous evaluation of their own policies have been able to ask ESDC for extra federal analytical support on specific research questions. ESDC has been forthcoming with this help where resources allow.
Despite the efforts made to provide individual evaluation reports to the PTs, there are constraints to operating federal evaluations of regional programmes: there is less resource for detailed individual qualitative work, and there are data limitations on how much the quantitative analysis can say about how the specific delivery methods in a PT affect participant outcomes. Replacing surveys with administrative data has significantly reduced delivery costs, but at the expense of detailed qualitative insight into regional delivery. The recommendations stemming from each of the regional reports are largely generic rather than specific to the individual region. Interviews with local officials engaged in the delivery of ALMPs are conducted and provide some important contextual information, but they are limited in scope (sometimes comprising only five to ten individuals) and do not include the views of programme participants. Because of this, it is difficult to communicate an in-depth and intuitive feel for how the policies are delivered at a local level. For local officials, though, this does not hinder their interpretation of the findings, because they have the information they need to make this assessment separately.
Communication functions well between the federal government and the officials in provincial governments who plan and deliver the LMDAs. Opportunities to share best practice and learning and to build relationships face to face have been disrupted by COVID‑19; their reintroduction will help to ensure evidence is shared and used widely among the individual PTs. Evaluations may not, on their own, give external stakeholders a deep insight into how the programmes work. Here, further qualitative work, and testing using smaller-scale trials, would permit better assessment and communication of delivery and implementation methods.
5.3. Analysis and policy making cycle
Over time, the delivery of high-quality and robust policy evaluations has allowed evaluation officials to forge a deeper relationship with ministers and with policy and programme counterparts. When the LMDA evaluations were first produced, results were simply provided to ESDC policy and programme staff to use in their day-to-day policy and implementation work. Now there is greater co-working between analysts in the evaluation directorate and these policy and programme staff, which allows analysis to feed into policy development and delivery on a more consistent and systematic basis. This helps to ensure that the narrative built around policy development and implementation is evidence‑based. Because the evaluation directorate has been able to deliver high-quality estimates of the impacts of the LMDAs and their underlying programmes, it has built trust and forged stronger working relationships. As part of this closer working relationship, evidence on programme effectiveness is now brought into annual budget discussions and used by the deputy minister to defend and advocate for the policy. This shift has occurred gradually since 2011, following the first round of LMDA evaluation results. Evaluation findings now have a much greater role in informing policy recommendations. This has been built upon the successful delivery of evaluation work, which could confidently demonstrate that programmes offer value for money, and is leveraged further by the closer working relationship between analysts and policy and programme colleagues, ensuring that analysis is now a bedrock of the policy narrative. The annual monitoring and assessment reports published by ESDC document the latest evaluation results, demonstrating how the programmes provide value for money (ESDC, 2021[10]) and allowing the department to provide evidence of the use and value of its work to the broader public.
The Forum of Labour Market Ministers also offers an opportunity to share information with senior federal and provincial policy makers, so they can have an informed discussion on policy and delivery planning. Federal and provincial officials use this forum, and the working groups associated with it, to socialise results and reach consensus. Evaluation results have informed various debates at these meetings since their production.
There are several examples where the results from the impact evaluations have been directly helpful in policy discussions. Evidence on the effectiveness of the LMDAs was cited as being critical to the expansion of employment insurance eligibility in 2017. The Treasury Board approves the programme funding agreements each year, and the impact evaluations undertaken by ESDC are used to support securing this funding. On the softer side, evidence is reported to be instrumental in the induction of new ministers and senior officials, as it helps to set the scene and allows officials to confidently articulate the policy and its benefits. Evidence on the timing of interventions (Handouyahia et al., 2014[11]) helped ESDC make the case to PTs for timely and robust monitoring data, so that individuals could be helped swiftly back into work, and for ESDC to receive these data from the PTs so it could further develop evidence on what works. Evidence has also allowed ESDC to counter negative publicity, particularly when external organisations question the effectiveness of ALMPs.
But there are also some limitations on the extent to which analysis has changed policy delivery. The type of impact assessment conducted means that programme results are compared against “no programme”. In reality this counterfactual is unlikely to occur, as the PTs receive a set level of funding from the federal government and spend this money regardless. More nuanced results are not available, such as whether more intensive programmes produce better outcomes, what sort of contracting framework best incentivises private providers, or whether government delivers services better than private providers. This second level of questions may be of greater use in altering how ALMPs are actually delivered by the PTs. A graphical inspection of Figure 2.3 does not suggest any great shift in the proportional mix of programme types after the wave-two evaluation results were delivered in 2017, suggesting that the PTs largely continued as usual, even when it appeared that some policies offered vastly better value for money than others.
To summarise, ESDC makes concerted efforts to share the results of its evaluation work with external stakeholders. The legislative framework that mandates cyclical evaluation of the LMDAs helps to provide a bedrock for openness, and senior management within ESDC has built on it to commit to a transparent analytical work programme. This helps across several domains: it sets the tone for collaborative working with the PTs, it encourages evaluation analysts to disseminate and share their work with external experts and researchers, and it broadens the reach of analysis within ESDC by facilitating the development of working relationships with policy and programme colleagues. This work could be further built upon by reviewing the content and delivery of key messages to different external audiences to improve the understanding of the evaluation results, particularly among the general public.
References
[5] Cairney, P. and R. Kwiatkowski (2017), “How to communicate effectively with policymakers: combine insights from psychology and policy studies”, Palgrave Commun 3, Vol. 3/37, https://doi.org/10.1057/s41599-017-0046-8.
[10] ESDC (2021), 2019/2020 Employment Insurance Monitoring and Assessment Report, Employment and Social Development Canada, http://www12.esdc.gc.ca/sgpe-pmps/p.5bd.2t.1.3ls@-eng.jsp?pid=72896.
[8] ESDC (2017), 2012-2017 Evaluation of the Labour Market Development Agreements Evaluation Summary, Employment and Social Development Canada, https://www.canada.ca/en/employment-social-development/corporate/reports/evaluations/labour-market-development-agreements/summary.html.
[2] Gingras, Y. et al. (2017), “Making Evaluation More Responsive to Policy Needs: The Case of the Labour Market Development Agreements”, Canadian Journal of Program Evaluation, Vol. 32/2, https://doi.org/10.3138/cjpe.31119.
[3] Grimmelikhuijsen, S. (2012), “Linking transparency, knowledge and citizen trust in government: An experiment”, International Review of Administrative Sciences, Vol. 78/1, pp. 50-73, https://doi.org/10.1177/0020852311429667.
[4] Güemes, C. (2019), ““Wish you were here” confianza en la administración pública en Latinoamérica”, Revista de Administração Pública, Vol. 53/6, pp. 1067-1090, https://doi.org/10.1590/0034-761220180230.
[11] Handouyahia et al. (2014), Effects of the Timing of Participation in Employment Assistance Services: Technical Study Prepared under the Second Cycle for the Evaluation of the Labour Market Development Agreements, Employment and Social Development Canada, https://publications.gc.ca/site/eng/9.834560/publication.html.
[6] HM Treasury (2020), Magenta Book: Central Government guidance on evaluation, https://www.gov.uk/government/publications/the-magenta-book.
[9] Manski, C. (2011), “Policy analysis with incredible certitude”, Economic Journal, Vol. 121/554, https://doi.org/10.1111/J.1468-0297.2011.02457.X.
[7] MDRC (2021), News & Media, https://www.mdrc.org/news/news-media.
[1] OECD (2021), OECD Report on Public Communication: The Global Context and the Way Forward, OECD Publishing, Paris, https://doi.org/10.1787/22f8031c-en.