Adolescent Education and Pre-Employment Interventions in Australia
5. Embedding interventions into a strengthened monitoring and evaluation system
Abstract
This chapter discusses the importance of monitoring and evaluation in guiding policy making. It reviews four aspects that are crucial for the implementation of a rigorous Monitoring and Evaluation (M&E) system: data infrastructure; ethical oversight and data privacy; the use of high-quality data; and the dissemination of findings. For each of these topics, the chapter provides examples of good practices in OECD countries and ends with some reflections on how Australia could strengthen its M&E system in the youth domain.
The implementation of rigorous Monitoring and Evaluation (M&E) systems is key to improving policy making (Lonean, 2020[1]). Decisions and implementation processes throughout the policy making cycle can be informed by policies implemented earlier or elsewhere that careful monitoring and evaluation has shown to achieve the desired results and objectives (or whose reasons for failure are understood). Besides supporting learning and the development of better policies, the M&E of public policies also contributes to accountability, as it provides detailed information on how policies are planned and implemented and can help communicate results to all relevant stakeholders.
Strengthening accountability through M&E is particularly important for the youth policy arena, given the cross-sectoral nature of youth policy and its numerous interactions with other sectoral policies (i.e. education, employment, social inclusion, health, etc.). However, the interconnectedness of these sectors and the intersectionality of risk factors and needs make youth policies more difficult to monitor. In addition, the monitoring and evaluation of interventions that are meant to have a preventive impact in the medium and long term requires longitudinal data collection across a variety of areas, including labour market and education outcomes. While this is true for all preventive interventions, the youth policy area faces additional constraints related to: (i) obtaining consent for data collection from minors and their parents; (ii) the fact that young people tend to be more geographically mobile than the general population and so may be more difficult to track and convince to participate; and (iii) the practical and potential ethical difficulties in varying access to interventions that are thought to be beneficial.
This section reviews four axes to strengthen M&E systems and their impact on youth policy making, focusing on: (1) strengthening the data infrastructure, including to capture intersectionality; (2) putting in place a robust ethical oversight and data privacy considerations; (3) promoting the use of high-quality data for M&E; and (4) disseminating M&E results.
5.1. Generating and maintaining a strong data infrastructure
The quality and availability of data (survey data, administrative data, programme implementation data, etc.) is a key factor in how easily a policy can be monitored and evaluated and how rigorous the resulting evaluation can be (OECD, 2019[2]). Indeed, evidence‑informed policy making can be hindered by a lack of adequate information and by capacity gaps among government departments and agencies to generate such information in a format suitable for evaluation purposes. It is key for policy analysts and evaluators to understand what evidence currently exists across institutions, what information such resources cover, and how it can be accessed.
Australia benefits from a strong commitment to evidence‑based policy making and investments in strong data infrastructure. The country scores highly on the OECD OURdata Index, which rates countries according to data availability, data accessibility, and government support for data re‑use. Australia ranks sixth in the 2019 index, with above average scores on all three indicators. Australia is one of the leaders in terms of promoting government data re‑use both within and outside the public sector and the country made one of the most noticeable improvements in data availability among OECD countries since 2017 (OECD, 2020[3]).
The availability of timely and comprehensive longitudinal data can support the monitoring and evaluation of “what works” in preventing young people from ending up not in education, employment or training (NEET). The relevance of longitudinal data for this sort of research is shown by a review commissioned by England’s Department for Education to collate and synthesise available evidence on how interventions targeting young people at key transition points (mainly aged between 14 and 16) can lead to future improvements in education and employment attainment (Learning and Work Institute, 2020[4]). Out of the 58 studies analysed in the review, some of which are cited in the previous chapters of this report, almost two‑thirds made use of longitudinal data to evaluate the effects of the intervention on the outcomes of interest.
The fact that Australia is further improving the generation of high-quality longitudinal datasets is demonstrated by the Multi-Agency Data Integration Project (MADIP), which through the integration of data from multiple sources makes it possible to identify outcomes for young people across a variety of dimensions (allowing for the identification of intersectionality) and follow them over time (Box 5.1).
Box 5.1. Australia’s Multi-Agency Data Integration Project (MADIP)
MADIP was first established in 2015, and further developed between 2017 and 2020. MADIP is a longitudinal dataset that combines individual-level information on demographics (including the Census), health, education, government payments, income and taxation, and employment over time.
The Australian Bureau of Statistics (ABS) is the accredited Integrating Authority for MADIP, and is responsible for collecting and preparing the data, as well as providing access to authorised researchers. To undertake this role, the ABS collaborates with a wide range of agencies, including the Australian Taxation Office, the Department of Education, Skills and Employment, the Department of Health, and the Department of Social Services, among others.
Source: Australian Bureau of Statistics (2022[5]), Multi-Agency Data Integration Project (MADIP), https://www.abs.gov.au/about/data-services/data-integration/integrated-data/multi-agency-data-integration-project-madip
As of today, a considerable number of projects make use of the MADIP data for evaluation purposes. A key example is the VET National Data Asset project “Measuring the outcomes of VET students”, which integrates data from MADIP and the Business Longitudinal Analysis Data Environment (BLADE) to enhance the evidence base around employment and social outcomes of VET students in Australia. Similarly, the Post-School Destinations Project leverages the data from MADIP and combines it with assessments from the National Assessment Program – Literacy and Numeracy (NAPLAN) to investigate the post-school destinations and outcomes for students at the national and state/territory level, especially those from disadvantaged backgrounds.
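The kind of person-level linkage that underpins these projects can be illustrated with a minimal sketch. Everything below is hypothetical — the dataset names, fields and values are invented for illustration and do not reflect the actual MADIP or NAPLAN schemas — but it shows how joining school assessments to post-school outcomes on a de-identified person key enables the longitudinal questions described above:

```python
# Illustrative sketch only: toy person-level data linkage of the kind that
# underpins projects such as MADIP. All names, fields and values are
# hypothetical, not the real MADIP or NAPLAN schemas.

# De-identified linkage keys stand in for a spine of person identifiers.
naplan = {1: {"year9_numeracy_band": 5},
          2: {"year9_numeracy_band": 8},
          3: {"year9_numeracy_band": 6},
          4: {"year9_numeracy_band": 9}}

destinations = {1: "NEET", 2: "university", 3: "VET", 4: "employed"}

# Join the two sources on the linkage key, then ask a simple longitudinal
# question: among students in lower numeracy bands, what share is NEET
# at age 20?
linked = [
    {"person_key": k, **naplan[k], "age_20_status": destinations[k]}
    for k in naplan if k in destinations
]
low_band = [r for r in linked if r["year9_numeracy_band"] <= 6]
neet_rate_low_band = (
    sum(r["age_20_status"] == "NEET" for r in low_band) / len(low_band)
)
print(len(linked), neet_rate_low_band)  # 4 0.5
```

In practice, such linkage is performed by the accredited Integrating Authority under strict access controls, and researchers only ever see de-identified, curated extracts.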
In addition to administrative data, cohort studies covering children and young people, including the multiple cohorts of the Longitudinal Surveys of Australian Youth (since 2003 recruited from schools taking part in PISA), are also available and provide complementary information that cannot be found in administrative records.
Despite the advances made in this arena, however, cohort-specific research is still lacking to assess the influence of interventions focused on the “middle years” (i.e. ages 8 to 15) on NEET prevention.
A leading example of integrated national administrative data is New Zealand’s Integrated Data Infrastructure (IDI), established and maintained by Stats NZ. Data in the IDI form an “ever-resident” Aotearoa New Zealand population of around 9 million people and their households (Jones et al., 2022[6]). The aim of the IDI, which was established in 2011, is to provide a research tool to understand complex social and economic issues in more depth, to inform policy, to help with the targeting of resources, and to undertake impact evaluations. The IDI links data from different government agencies, statistical surveys, and non-government organisations, and enables researchers to compare population outcomes across a wide range of variables, including education, income, benefits, migration, justice and health, and allows for the adoption of an intersectionality lens in the design of policies.
As set out in Jones et al. (2022[6]), New Zealand has unique characteristics that may not be easily transferable to other contexts, but its data integration process can provide useful lessons for other countries wanting to implement a similar approach.1 The split accountability between federal and state governments in Australia may make such integration more difficult and may require additional policy or legislative structures to give data suppliers confidence that information will not be used to the detriment of the people they serve. Key elements of the New Zealand data integration process have been solid infrastructure design, political support, a strong regulatory environment, good data quality, close collaboration with analysts and, last but not least, trust.
5.2. Putting in place robust ethical oversight and data privacy considerations
Generating and maintaining a high-quality data infrastructure relies on strong ethical oversight that guarantees privacy and delivers value. Individuals’ trust in the collection and use of their data is intrinsically dependent on the level of data security and the extent to which data are used for their benefit and the development of effective policy making.
The success of New Zealand’s IDI is entirely dependent on the safe and ethical use of the data it contains. Two frameworks guide decision making about access to the IDI: the Five Safes and Ngā Tikanga Paihere (a collection of Māori customary behaviours). Both frameworks are intended to ensure data are treated in responsible and culturally appropriate ways (Jones et al., 2022[6]). All applicants and their proposed research must meet the Five Safes conditions (safe people, projects, settings, data and output) and demonstrate they have appropriate cultural safeguards in place to conduct research in a way that will be beneficial to Māori and other priority populations (e.g. people with disabilities).
More generally, any use of data or information about people, families and communities (whether it can identify people or not) must be done in a safe, transparent and trustworthy way. Not only will this approach help to increase public trust and confidence in governments’ legitimate role in the collection, processing and use of data, it will also contribute to the design and delivery of more effective, user-centred policies and services. Data use, including the decisions and actions that derive from it, should prevent, avoid, or at the very least limit intentional harm. It should not lead to or perpetuate discrimination. It should instead promote inclusion, respect diversity, and ensure that individuals and communities are treated equally and benefit equally from the outcomes a data-driven public sector aims to deliver (OECD, 2020[7]).
For instance, the New Zealand Government undertook an extensive public engagement process in 2019 to create the Data Protection and Use Policy (DPUP) (Social Wellbeing Agency, 2021[8]). The engagement process revealed a complex landscape of privacy legislation, regulation and rules that people struggled to navigate. The DPUP provides clear and practical guidance (principles, guidelines and a toolkit) on how personal information can and cannot be used in the social sector to provide confidence to those collecting and using the data, and to those to whom the data belongs.
There are many examples of good practice guidance for governing and managing data ethics and privacy, including the OECD’s The Path to Becoming a Data‑Driven Public Sector (OECD, 2019[2]). Australia also has its own best-practice examples to draw on, such as the 2019 public consultation on the proposed data sharing and release legislation (Australian Government Department of the Prime Minister and Cabinet, 2019[9]) and the way the results of that engagement shaped the final legislation, the Data Availability and Transparency Act 2022. This Act establishes a new, best-practice scheme for sharing Australian Government data, underpinned by strong safeguards and consistent, efficient practices.
The approach chosen to identify and address risks associated with the use of personal information for M&E purposes – whether guidelines, a framework and/or processes – must comply with relevant legislation, policies and guidance, and be able to accommodate a range of data uses, including new and emerging ones such as using data to identify and target the young people in greatest need of intervention (i.e. predictive risk modelling). A framework (and/or guidelines) that steps a decision maker through the relevant technical, ethical, privacy, public interest (and other) considerations at different stages of a project would support an assessment of whether the risks outweigh the benefits or can be sufficiently and safely mitigated. One possible decision is to refer the project to a relevant ethics committee.
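As a concrete (and deliberately simplified) illustration of predictive risk modelling, the sketch below computes a probability-like NEET risk score from a handful of features. All feature names, coefficients and the referral threshold are invented for illustration; a real model would be estimated from linked administrative data and would itself need to pass exactly the kind of ethical, privacy and public interest assessment described in this section:

```python
# Illustrative sketch only: a stylised predictive risk model. Feature
# names, weights and the threshold are hypothetical, not a real model.
import math

# Hypothetical coefficients (e.g. from a previously fitted logistic model).
WEIGHTS = {"school_absence_rate": 2.0, "prior_suspension": 0.8,
           "family_income_support": 0.6}
INTERCEPT = -2.5

def neet_risk(features: dict) -> float:
    """Return a probability-like risk score in (0, 1) via a logistic link."""
    z = INTERCEPT + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-z))

young_person = {"school_absence_rate": 0.4,  # 40% of school days absent
                "prior_suspension": 1,       # has been suspended before
                "family_income_support": 1}  # family receives income support
score = neet_risk(young_person)
flagged = score >= 0.3  # the referral threshold is itself a policy choice
print(round(score, 3), flagged)
```

The point of the sketch is that both the model inputs and the flagging threshold are decisions with ethical weight: which features may legitimately be used, and what false-positive rate is acceptable, are precisely the questions an ethics framework should step a decision maker through.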
5.3. Promoting the use of high-quality data for M&E purposes
The existence of a strong data system can make the creation of a sustainable M&E system much easier, but it alone is not a sufficient condition. Among other factors, government departments, agencies, and programme providers need to either have the capabilities themselves to use the existing data to undertake M&E tasks, or to outsource this task – but they would still need to understand the results and ensure that they are reflected in evidence‑based policy making.
In principle, Australia benefits from a long-embedded M&E culture. The country’s evaluation strategy, driven by the Department of Finance, in place from 1987 to 1997 and complemented by greater attention to monitoring from 1995 onwards, is a well-known example of evidence‑based decision making (Mackay, 2011[10]). More recently, the Public Governance, Performance and Accountability Act 2013 emphasised the importance of performance reporting.
Despite this legal framework, several institutions have criticised current M&E efforts as neither frequent enough nor of sufficient quality (Bray and Gray, 2019[11]). In particular, for policies that concern young people, the scope of M&E activities in Australia appears mixed. On the one hand, evaluations exist for a number of youth-related programmes, including three evaluations of the 2009 National Partnership on Youth Attainment and Transitions and of the National Support for Child and Youth Mental Health Programme. On the other hand, the Australian Government’s new Youth Engagement Model, which will establish an Office for Youth and ensure young people from diverse and at-risk cohorts are represented in consultations and engagement with government, makes no explicit reference to the development of an M&E strategy.
Different strategies are being used to encourage the uptake of M&E practices to inform policy making across OECD countries. For example, in the United States, the federal government sought to increase the use of evidence in policy making across all federal agencies, acknowledging that some agencies were already excellent at using evidence while others lacked the necessary skills or capacity. As a result, the Foundations for Evidence‑Based Policymaking Act was approved in 2019. The law pushes agencies to adopt stronger evaluation practices in order to generate more evidence about what works and what needs improvement, and establishes that any data collected should be made accessible across agencies and to external groups for research purposes.
Other countries have opted to create a dedicated team or agency to evaluate policies across the board. A key example is the UK Cabinet Office’s “What Works Network”, which aims to improve the generation, sharing and use of high-quality evidence within government (Box 5.2). In the same vein, the US Office of Management and Budget’s Evidence Team acts as a central hub of expertise on setting research priorities and selecting appropriate evaluation methodologies in federal evaluations, and in Korea, the Government Performance Evaluation Committee is responsible for evaluating the policies of central government agencies on an annual basis (OECD, 2022[12]). Another relevant example is Mexico’s National Council for the Evaluation of Social Development Policy, a decentralised public body responsible for generating objective information and undertaking evaluations across a wide range of social policies.
Box 5.2. UK Cabinet Office’s “What Works Network”
The “What Works Network” was created in 2013 to ensure that high-quality and updated evidence on “what works” across different policy areas is available and used by key decision makers (UK Government, 2022[13]). The main objective of the initiative was to embed robust evidence at the heart of policy making in the United Kingdom, supporting more effective and efficient services across the public sector at both national and local levels.
The network currently consists of 10 independent “What Works” centres, three affiliate members and one associate member. Each centre is committed to increasing both the supply of and demand for evidence in its policy area, and their responsibilities include, among others: i) collating and synthesising existing evidence on the effectiveness of programmes and practices; ii) identifying areas where evidence is lacking and commissioning new evaluations to fill information gaps; and iii) supporting and encouraging policy makers to use the existing evidence to inform their decisions. A report summarising the network’s achievements during its first five years shows that between 2013 and 2018 it produced more than 280 evidence reviews and commissioned or supported over 160 new evaluations (What Works Network, 2018[14]).
5.4. Facilitating the dissemination of M&E findings
A robust M&E function cannot be complete without the results of M&E activities being made available to their intended users. M&E results need to be communicated and disseminated to key stakeholders, as making evaluation results public is an important element in ensuring their impact and increasing the use of findings for evidence‑based policy making (OECD, 2020[15]).
An increasing number of OECD countries make evaluation results public, encouraging openness and transparency in the public sector. For example, all evaluations commissioned in Poland, including those concerning the implementation of EU funds, must now be made accessible to the public. For this purpose, a national database of evaluations has been created, and findings are published on a dedicated website. This repository currently shares the results of more than a thousand studies conducted since 2004, as well as a series of methodological tools and other resources aimed at evaluators.
A similar approach has been taken by Norway, which has created a dedicated public website gathering all the findings of evaluations carried out by the central government. The portal, operated by the Directorate for Financial Management and the National Library of Norway, contains evaluations carried out on behalf of government agencies since 2005, as well as a selection of central evaluations conducted between 1994 and 2004. The website also provides knowledge‑sharing resources, such as evaluation guidelines, a calendar of key activities in the evaluation area, and external links to professional papers and other resources of interest.
In the same manner, the Institute of Education Sciences (IES), the research arm of the United States Department of Education, has set up a web portal called the What Works Clearinghouse (WWC), whose main objective is to help policy makers, researchers and other education practitioners learn about policies and interventions with a proven impact on students’ outcomes. The WWC collects evidence through a rigorous systematic review methodology, and results are presented through an interactive portal where users can sort by type of intervention, desired outcomes and effectiveness of the intervention.
Concerning child rather than youth outcomes, the European Platform for Investing in Children (EPIC) is an evidence‑based online platform that consolidates information on policies for children and their families in Europe. The platform’s main objective is to serve as a tool to monitor activities implemented across member states in response to the Recommendation for Investing in Children, which encourages member states to implement multidimensional policies to tackle child poverty and social exclusion in Europe. It also serves as a repository of good practice in policy making for children and families and helps foster co‑operation and mutual learning in the field (Box 5.3).
Box 5.3. The European Platform for Investing in Children (EPIC)
The European Platform for Investing in Children (EPIC) is an evidence‑based online platform which was launched in 2013 by the European Commission. The platform’s main objective is to disseminate evidence‑based information about policies, programmes and practices on children and their families in Europe.
The platform is a key tool for member states to monitor activities designed and implemented under the framework of the European Commission Recommendation “Investing in Children: breaking the cycle of disadvantage”. Furthermore, EPIC also collects and disseminates innovative and evidence‑based practices that have been demonstrated to have an impact on children and their families. This cross-regional learning capacity is enabled through three separate platform features: i) a collection of evidence‑based practices which are being implemented across the EU; ii) the Social Innovation Repository, which features innovative practices sustained by a clear theory of change but that may not have sufficient evidence of their effectiveness given their recent implementation; and iii) the User Registry, which provides an overview of practices being implemented across Europe in the spirit of collaboration.
Source: European Commission (2022[16]), European Platform for Investing in Children (EPIC), https://ec.europa.eu/social/main.jsp?catId=1246&langId=en
Key policy lessons for a strengthened M&E system
Rigorous Monitoring and Evaluation (M&E) systems can improve policy making by informing decisions and implementation processes throughout the policy making cycle. M&E may also contribute to the accountability and transparency of public policies by providing detailed information on how policies are planned and implemented.
Strengthening accountability through M&E is particularly important for the youth domain, given the cross-sectoral nature of youth policy and its numerous interactions across sectors. However, the monitoring and evaluation of preventive interventions requires longitudinal data collection across a variety of areas, including labour market, education, social and health outcomes. Such data collection can be particularly challenging for young people due to a range of factors: (i) consent for data collection is required from both minors and their parents; (ii) young people tend to be more geographically mobile than the general population and thus more difficult to track; and (iii) there are both practical and potential ethical difficulties in varying access to interventions that are thought to be beneficial.
Continue investing in data infrastructure. Overall, Australia benefits from a strong data infrastructure, and the country has made relevant advances in the generation of high-quality longitudinal datasets that allow researchers to track individual-level outcomes over time. Nevertheless, cohort-specific research is still lacking to assess the influence of interventions focused on the “middle years” (i.e. ages 8 to 15) on NEET prevention. Continued investment in existing efforts to create comprehensive longitudinal datasets would allow further monitoring and evaluation of the outcomes of interest for younger target populations.
Develop a fit-for-purpose approach to ensuring ethical oversight and data privacy. M&E should always include an approach to identifying and addressing any risks associated with the use of personal information for M&E purposes that complies with relevant legislation, policies and guidance, and accommodates a range of uses including new and emerging uses. A framework (and/or guidelines) that steps a decision-maker through relevant technical, ethical, privacy, public interest (and other) considerations at different stages of a project would support an assessment of whether any risks outweigh the benefits or can be sufficiently and safely mitigated.
Promote the use of high-quality data for M&E purposes. Despite a legal framework that encourages an M&E culture, the scope of these sorts of activities on youth policy interventions in Australia is mixed. Other OECD countries use a range of strategies to encourage the uptake of M&E practices to inform policy making, including the creation of a dedicated team or agency responsible for the monitoring and evaluation of policies across different sectors, as in the United Kingdom, Korea or Mexico; or a legal framework, as in the United States, that pushes agencies to adopt stronger evaluation practices in order to generate more evidence about what works and what needs improvement.
Facilitate the dissemination of M&E findings. An increasing number of OECD countries make evaluation results public, encouraging openness and transparency in the public sector, for instance through the creation of a single platform or repository for the dissemination of M&E findings. Publicly available M&E results can contribute to the accountability of public policies, as they provide detailed information on how policies are planned and implemented and can help communicate specific results to all relevant stakeholders.
References
[5] Australian Bureau of Statistics (2022), Multi-Agency Data Integration Project (MADIP), https://www.abs.gov.au/about/data-services/data-integration/integrated-data/multi-agency-data-integration-project-madip (accessed on 16 November 2022).
[9] Australian Government Department of the Prime Minister and Cabinet (2019), Data Sharing and Release: Legislative Reform, https://www.datacommissioner.gov.au/sites/default/files/2022-08/Data%20Sharing%20and%20Release%20Legislative%20Reforms%20Discussion%20Paper%20-%20Accessibility.pdf.
[11] Bray, B. and M. Gray (2019), Evaluation and learning from failure and success: An ANZSOG research paper for the Australian Public Service Review Panel, https://www.apsreview.gov.au/sites/default/files/resources/evaluation-learning-failure-success.pdf.
[16] European Commission (2022), European Platform for Investing in Children (EPIC), https://ec.europa.eu/social/main.jsp?catId=1246&langId=en (accessed on 16 November 2022).
[6] Jones, C. et al. (2022), “Building on Aotearoa New Zealand’s Integrated Data Infrastructure”, Harvard Data Science Review, https://doi.org/10.1162/99608f92.d203ae45.
[4] Learning and Work Institute (2020), Evidence review: What works to support 15 to 24-year olds at risk of becoming NEET?, Learning and Work Institute, https://learningandwork.org.uk/wp-content/uploads/2020/04/Evidence-Review-What-works-to-support-15-to-24-year-olds-at-risk-of-becoming-NEET.pdf (accessed on 16 November 2022).
[1] Lonean, I. (2020), Insights into youth policy evaluation, Council of Europe and European Commission, https://pjp-eu.coe.int/documents/42128013/47261953/122018-Insights_web.pdf/99400a12-31e8-76e2-f062-95abec820808 (accessed on 16 November 2022).
[10] Mackay, K. (2011), “The Australian Government’s M&E System”, PREM Notes and Special Series on the Nuts and Bolts of Government M&E Systems, No. 8, World Bank, Washington, D.C., https://documents1.worldbank.org/curated/en/577181468220168095/pdf/643850BRI0Aust00Box0361535B0PUBLIC0.pdf.
[12] OECD (2022), Evolving Family Models in Spain: A New National Framework for Improved Support and Protection for Families, OECD Publishing, Paris, https://doi.org/10.1787/c27e63ab-en.
[15] OECD (2020), “Building Capacity for Evidence-Informed Policy-Making: Lessons from Country Experiences”, OECD Public Governance Reviews, OECD Publishing, https://doi.org/10.1787/86331250-en.
[7] OECD (2020), Good Practice Principles for Data Ethics in the Public Sector, OECD, Paris, https://www.oecd.org/gov/digital-government/good-practice-principles-for-data-ethics-in-the-public-sector.pdf (accessed on 3 April 2023).
[3] OECD (2020), OECD Open, Useful and Re-usable data (OURdata) Index: 2019, OECD, Paris, https://www.oecd.org/governance/digital-government/ourdata-index-policy-paper-2020.pdf (accessed on 16 November 2022).
[2] OECD (2019), The Path to Becoming a Data-Driven Public Sector, OECD Digital Government Studies, OECD Publishing, Paris, https://doi.org/10.1787/059814a7-en.
[8] Social Wellbeing Agency (2021), Data Protection and Use Policy (DPUP), https://www.digital.govt.nz/standards-and-guidance/privacy-security-and-risk/privacy/data-protection-and-use-policy-dpup/.
[13] UK Government (2022), What Works Network, https://www.gov.uk/guidance/what-works-network (accessed on 16 November 2022).
[14] What Works Network (2018), The What Works Network - Five Years On, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/677478/6.4154_What_works_report_Final.pdf (accessed on 16 November 2022).