OECD Public Governance Reviews: Honduras
4. Strengthening Monitoring and Evaluation in Honduras
Abstract
This chapter provides an analysis of the monitoring and evaluation system in Honduras in relation to OECD good practices. It analyses the legal framework for monitoring and evaluation; the mandates and competencies of different institutional actors; the availability and quality of data; the communication and dissemination strategy; and the impact of these on decision and policy making within and beyond the executive. The chapter concludes with policy recommendations for improving the quality and impact of monitoring and evaluation of policies in Honduras.
Introduction
Monitoring and evaluation are key functions of the state in all countries. These functions are normally carried out by centre-of-government (CoG) institutions, notably the institution serving the head of government (the presidency, prime minister’s office or cabinet office) and the ministry of finance. Monitoring and evaluation help governments make better decisions, improve policy making, inform citizens about the government’s actions, and ensure accountability for the development and implementation of public policies and programmes (OECD, 2016[1]).
Monitoring and evaluation are two complementary but distinct practices, with different dynamics and goals. Monitoring consists in following up on progress in implementing public policies and programmes through the systematic collection of data on specific indicators. It provides the government, parliament and citizens with information regarding the progress and achievements of ongoing initiatives and/or the use of allocated public resources. Evaluation refers to the structured and objective assessment of the design, implementation and/or results of a planned, ongoing or completed initiative (OECD, 2021[2]). Its aim is to analyse the final effects and causes of public interventions, determine the relevance and fulfilment of their objectives, and assess dimensions such as interventions’ efficiency, effectiveness, impact and sustainability.
The monitoring set-up in Honduras is well developed, especially when compared to the country’s evaluation framework. Honduras has been carrying out monitoring activities on the implementation of its national plans (Country Vision 2010-38, Nation Plan 2010-22 and Strategic Government Plan 2018-22) and institutional plans. These activities are mainly led by the Directorate for Results-Based Management (Dirección de Gestión por Resultados, DIGER), and previously by the Secretariat of General Co-ordination of the Government (Secretaría General de Coordinación de Gobierno, SCGG), and supported by the secretariats of state through special monitoring units. As part of this set-up, Honduras developed the Presidential System for Results-Based Management (Sistema Presidencial de Gestión por Resultados, SGPR), an IT tool developed and implemented by the SCGG to collect and store information on the follow-up and monitoring of national and institutional plans. Additionally, the SCGG developed guidelines and training courses to further develop the competencies of those carrying out monitoring activities, especially on the definition of key performance indicators. However, as also analysed in Chapter 2, the monitoring set-up in Honduras mainly served reporting and accountability goals rather than supporting the high-level decision-making process. To overcome these challenges, in 2022 the incoming government replaced the Presidential System for Results-Based Management with the Public Management System for Results and Transparency (Sistema de Gerencia Pública por Resultados y Transparencia, SIGPRET), administered by DIGER.
In terms of evaluation, despite efforts from the SCGG to develop a culture of evaluation across government, Honduras lacks a sound and robust evaluation system, both from a whole-of-government perspective and for its national and institutional plans. First, there is little awareness of the importance of evaluation and its double objective of promoting public accountability and supporting learning processes for improving policy outcomes. Second, few evaluations have been conducted by the SCGG or by other government institutions, although SIGPRET aims to change this and produced three evaluations in 2022. Third, Honduras also faces challenges in using evaluation results in policy making, as there is no coherent whole-of-government approach in this area and there is a lack of appropriate skills and capacities to carry out evaluation. Nevertheless, the SCGG developed guidelines and training courses to raise awareness of the importance of evaluation and its different approaches, and took steps towards developing a legal framework for policy evaluation.
This chapter provides an overview of the monitoring and evaluation practices in Honduras, including comparisons with OECD countries’ practices. It provides a description of the institutional framework for monitoring and evaluation, as well as the tools in place for promoting the quality and use of monitoring and evaluation results. It closes with a series of recommendations aimed at helping the Honduran Government to strengthen its monitoring and evaluation culture and promoting the use of evidence and results in decision and policy making.
Building a sound institutional framework for monitoring and evaluation in Honduras
Having a robust monitoring and evaluation system requires first and foremost the existence of a sound institutional framework for monitoring and evaluation. Such a framework can help countries to co-ordinate isolated and unplanned monitoring and evaluation efforts into more formal and systematic approaches, as well as provide incentives to ensure that these activities are effectively conducted (OECD, 2020[3]).
Although there is no one-size-fits-all approach, a solid institutional framework usually includes the following four components (OECD, 2020[4]):
clear and comprehensive definitions of monitoring and evaluation
clearly mandated institutional actors with allocated resources to oversee or carry out monitoring and evaluation activities
a legal or policy basis to guide and undertake monitoring and evaluation activities
macro-level guidance on when and how to carry out monitoring and evaluation activities.
Honduras could improve its definitions of monitoring and evaluation
The first component of a sound institutional framework for monitoring and evaluation consists in having clear and comprehensive definitions of those activities, which could be included in legal or policy documents. Such definitions should allow identification of the characteristics of each type of practice and clearly state the objective of carrying out monitoring and evaluation activities. According to OECD data, most OECD countries (23 out of 35) have one or several definition(s) of evaluation (OECD, 2021[2]). In some countries, this definition is embedded in a legal document, while other countries define evaluation in guidelines or manuals (Box 4.1).
Box 4.1. Country examples of definitions of evaluation
Definitions embedded in legal documents
Argentina defines evaluation in the Decree 292/2018, which designates the body responsible for preparing and executing the annual monitoring and evaluation plan for social policies and programmes.
Japan defines evaluation in the Government Policy Evaluations Act (Act No. 86 of 2001).
Definitions embedded in guidelines or manuals
Colombia defines evaluation in the guide for monitoring and evaluation of public policies of the National Department of Planning.
Costa Rica defines evaluation in the manual for evaluating public interventions.
Mexico defines evaluation in the general guidelines for the evaluation of general public administration programmes.
Source: (OECD, 2021[2]).
In the case of Honduras, there are clear and distinct definitions for monitoring and evaluation, both embedded in government guidelines:
The Guide for the Formulation of Indicators defines monitoring as an "independent verification of the progress of a policy, programme or project" (Secretaría Técnica de Planificación y Cooperación Externa, 2012[5]).
The Methodological Guide for Design Evaluation defines evaluation as "a systemic process of observation, measurement, analysis and interpretation aimed at understanding an action, in order to reach an evaluative judgment based on evidence in relation to its design, implementation, effects, results and impacts” (Secretaría de Coordinación General de Gobierno, 2017[6]).
In addition to this general definition for evaluation, Honduras has several definitions of specific types of evaluation. Having a general definition for evaluation creates a shared understanding within the public sector of both the objective and features of evaluation, while having specific definitions corresponding to the different types of evaluation carried out throughout the policy cycle allows clarification of the different goals and methods of evaluation (Table 4.1).
Table 4.1. Definitions of evaluation in Honduras
| Concept | Definition | Source |
|---|---|---|
| Design evaluation | “Systematic study of the conception and planning of an intervention in all the dimensions of its design: rationality (relevance, objectives, target population), internal coherence, assumptions and risks, results-based management, institutional coherence and external coherence, amongst others.” | Methodological guide for design evaluation (2017) |
| Results evaluation | “Systematic study of the effects (changes in behaviour or quality on people or goods) of an intervention in all its dimensions: efficacy, sustainability, coverage, contribution.” | Methodological guide for results evaluation (2020) |
| Impact evaluation | “It aims to measure the net effect of an intervention on a series of target variables. It seeks to establish a causal relationship between the programme and its results; this is to explain whether a programme is responsible for the changes observed in a target population.” | Methodological guide for impact evaluation (2020) |
| Institutional performance evaluation | “It analyses the management of the organisation's programmes and processes, and the organisation’s performance, both in terms of the results obtained and the efficiency in the use of resources.” | Methodological guide for institutional performance evaluation (2020) |
Source: Author’s own elaboration based on responses to the OECD questionnaire of the OECD-EU project “Public Governance Review of Honduras” (2021)
Definitions of monitoring and evaluation in Honduras lack clarity on the main objectives of these practices. The definition of monitoring, in particular, provides no information on objectives. The OECD identifies several objectives for monitoring: it is expected to identify delays and bottlenecks in ongoing programmes and policies by providing descriptive information regarding their implementation, and it facilitates planning and operational decision making by providing evidence to measure performance (OECD, 2019[7]). Monitoring can also strengthen accountability and transparency, as it encourages the continuous measurement and publication of information regarding the use of resources, the efficiency of internal processes, and the delivery of outputs and outcomes of a policy or programme (OECD, 2019[7]).
In the case of evaluation, it is important to recognise the full objectives and potential value of this activity. According to the OECD, evaluation has the potential to improve public accountability and transparency by providing citizens and stakeholders with information on the results of governments’ efforts. Moreover, it can facilitate learning by informing policy makers on the policies and programmes that were, or have the potential to be, successful and the main reasons for their success or failure (OECD, 2018[8]). In this sense, conducting evaluations could allow Honduras to pursue both accountability and learning.
Making the various objectives of monitoring and evaluation clear and communicating these objectives in a legal framework would help create a shared understanding among the main government actors and citizens of the importance and purpose of these activities. Having a clear and comprehensive definition of monitoring and evaluation in Honduras that includes information on the several objectives and advantages of these activities would also facilitate co-operation among the main government actors, both by eliminating any confusion regarding the roles of or differences between monitoring and evaluation, and by making stakeholders aware of the benefits of carrying out these exercises, in particular as they support decision-making processes.
The SCGG played a central role in the monitoring and evaluation framework
In Honduras, several actors located at the centre of government play an important role in co-ordinating and promoting monitoring and evaluation. Different decrees establish the mandates and main responsibilities of these institutional actors in terms of monitoring and evaluation; the following institutions and their mandates correspond to the institutional set-up as of November 2021:
The SCGG had the mandate to define mechanisms and procedures for monitoring and evaluating the government's management results, and to provide recommendations to the President of the Republic to improve the effectiveness and impact of government’s policies and programmes (article 1 of Decree 266-2013). Most of the functions of the SCGG were assumed by DIGER, created in April 2022 by Decree PCM 05-2022.
The Presidential Directorate for Monitoring and Evaluation, one of the three directorates of the SCGG, had the mandate to monitor and evaluate the results of national and institutional plans, sustainable development objectives, public policies, programmes, and projects. It was also responsible for proposing and co-ordinating the annual agenda of evaluations of policies, programmes and projects and their corresponding processes (article 8 of Executive Decree PCM-025-2018).
The Office of Presidential Priorities, established in 2020 to enhance the delivery of high-level government priorities, was responsible for monitoring strategies, goals, objectives, and action plans of the Presidency of the Republic as well as providing recommendations for the development of and compliance with the strategic priorities and goals of the Presidency of the Republic (article 1 of Executive Decree PCM-044-2020).
The sectoral cabinets were responsible for monitoring and evaluating compliance with the objectives and goals of strategic, sectoral, and institutional plans. They were dissolved in 2022. As explained in Chapter 2, sectoral cabinets, which took the form of inter-ministerial committees, were created by Decree PCM-001-2014 with the aim of enhancing government co-ordination under the guidance of the SCGG. Sectoral cabinets also had the mandate to propose and follow up on the impact evaluation of priority sectoral policies and their contribution to the government's long-term objectives (article 9 of Executive Decree PCM-009-2018).
The Planning and Evaluation Management Units (UPEGs) within each secretariat of state complemented these actors at the centre of government (Box 4.2). The UPEGs are responsible for monitoring and evaluating the secretariat’s policies, programmes, and projects (article 31 of Decree 146-1986, General Law of Public Administration), and served as the main point of contact between the SCGG and the secretariat of state, particularly for issues related to planning, monitoring and evaluation.
Finally, the Secretariat of Finance is responsible for the formulation, co-ordination, execution and evaluation of the General Budget of Revenues and Expenditure (article 45, Decree 83-2004, General Budget Law). It evaluates the execution of the budget both during and at the end of the fiscal year, using information contained in the Integrated System for Financial Administration (SIAFI). In many OECD countries, ministries of finance play an important role in promoting the monitoring and evaluation of public policies and programmes by including performance and evaluation evidence in the budget cycle. This is not the case in Honduras, as there is no strong or clear link among monitoring results, performance management, and budgeting.
Box 4.2. Planning and Evaluation Management Units in Honduras
The Planning and Evaluation Management Units (UPEGs) are located within each secretariat of state. Their mandates are defined in article 31 of Decree 146-1986 (General Law of Public Administration). The UPEGs have the mandate to carry out the analysis, design and evaluation of policies, programmes, and projects, and periodically evaluate the efficiency and effectiveness of the programmes of the secretariat and the corresponding sector’s decentralised institutions.
In practice, the UPEGs served as the main point of contact for the SCGG within each secretariat of state and were in charge of collecting and reporting information on the implementation progress of national and institutional plans in the SGPR.
Source: Author’s own elaboration based on responses to the OECD questionnaire of the OECD-EU project “Public Governance Review of Honduras” (2021)
Monitoring and evaluation activities are not sufficiently embedded in a whole-of-government legal framework
Another key component of a sound institutional framework for monitoring and evaluation is the existence of a legal or policy framework to guide and undertake monitoring and evaluation activities. According to OECD data, two-thirds of OECD countries (23 out of 35) have developed a legal framework that guides evaluation, and half of OECD countries (17 out of 35) have developed a policy framework for organising evaluation across government (OECD, 2020[3]). This shows that having a legal basis for carrying out evaluation activities is a key element for the systematisation of these practices across government.
There are several paths for the institutionalisation of monitoring and evaluation practices. The need for evaluation, for instance, can be recognised at the highest level in the country’s constitution, in primary and/or secondary legislation, or it can be developed in a policy framework1 (Box 4.3).
Box 4.3. Country examples of legal and policy frameworks for evaluation
The legal framework for the evaluation of public policies in France
France developed a legal framework for policy evaluation embedded at three distinct levels:
Constitution: Article 47-2 of the Constitution mandates the French Supreme Audit Institution (Cour des Comptes) to assist the Parliament and the Government in evaluating public policies. The evaluations are published, making results available to the government and citizens. Evaluative activities are also included in articles 39 and 48 of the Constitution.
Primary legislation: Articles 8, 11 and 12 of the Organic Law of 2009 require legislative proposals to be subject to ex ante impact assessment. Assessment results are annexed to the legislative proposal as soon as it is sent to the Supreme Administrative Court (Conseil d’État).
Secondary legislation: Article 8 of Decree No. 2015-510 states that all legal draft proposals affecting the missions and organisation of decentralised state services should be subject to an impact assessment. The main objective is to check the alignment between the objectives pursued by the proposal and the resources allocated to decentralised services.
Additionally, France has several circulars from the Prime Minister’s Office that relate to evaluation, including the circular on the evaluation of norms (October 2015) and the circular on the impact evaluation of new law projects and regulatory texts (May 2016).
The policy framework for the evaluation of public policies in Canada
In July 2016, Canada launched the Policy on Results, which seeks to improve the achievement of results across government and the understanding of the desired results and the resources used to achieve them. Responsibility for the implementation of this policy falls mainly on the Treasury Board of Canada, which is tasked with promoting the use of evaluation findings in policy making, as well as defining and updating the evaluation policy.
For the implementation of the policy, the Treasury Board:
can require departments to undertake specific evaluations and participate in centrally led evaluations
can initiate or undertake resource alignment reviews
approves line ministries’ departmental result frameworks and any changes to their organisations’ core responsibilities.
Additionally, all government departments are expected to have an evaluation unit, while line ministries are responsible for establishing a departmental results framework.
Source: (OECD, 2020[3]).
In the case of Honduras, there are general references to monitoring and evaluation in primary and secondary legislation (Box 4.4). However, there are a number of issues preventing the country from promoting the use of results from these activities for decision making and building a culture of monitoring and evaluation across government in the long term, including i) the lack of provisions to ensure the use of performance-monitoring results of priority public policies and national plans in the decision-making process, and ii) the lack of a general long-term framework for monitoring and evaluation.
Box 4.4. References to monitoring and evaluation in primary and secondary legislation in Honduras
The main references to monitoring and evaluation in primary and secondary legislation in Honduras are:
Article 45 of Decree 83-2004 (General Budget Law) mandates the Secretariat of Finances, through the Budget Directorate, to evaluate the execution of the General Budget of Income and Expenses during and at the end of each fiscal year.
Decree 286 of 2009 includes different references to the monitoring process of the national plans:
Article 10 establishes that the Council of the Nation Plan, which depends on the President of the Republic, is responsible for, among others, monitoring execution of the Nation Plan and formulating the appropriate recommendations and indications to improve it.
Article 31 empowers the National Convergence Forum (FONAC) to establish the system for monitoring and reporting progress in the execution of the National Plans.
Article 33 empowers the National Congress to permanently collaborate, monitor and participate in the execution of the Country Vision 2010-38 and the Nation Plan 2010-22.
Article 1 of Decree 266-2013 mandated the SCGG to define mechanisms and procedures for monitoring and evaluating the government's management results.
Executive Decree PCM-025-2018 stated that the Presidential Directorate for Monitoring and Evaluation of the SCGG was responsible for monitoring and evaluating the results of national and institutional plans, sustainable development objectives, public policies, programmes, and projects.
Regarding the lack of provisions to ensure the use of performance monitoring results of priority public policies and national plans for the decision-making process, it is necessary to start by analysing the different objectives that monitoring activities may have. Monitoring should strengthen reporting, accountability, and transparency, as information regarding the use of resources, internal management processes and outputs of initiatives is routinely measured and systematically publicised. It should also facilitate planning and operational decision making as it provides evidence to measure performance, allows identification of implementation delays, and facilitates drawing lessons from the execution of initiatives.
In the case of Honduras, monitoring is developed around the reporting, accountability and transparency objective. Indeed, there is a legal framework for monitoring the planning system, established in Decree 286 of 2009 (Box 4.4), which until November 2021 was co-ordinated from the centre of government by the SCGG and the sectoral cabinets. However, the results derived from monitoring activities – including the monitoring report of the Nation Plan – were not systematically discussed at meetings attended by those responsible for leading the decision-making process, limiting the impact that monitoring activities could have in Honduras. In this sense, as assessed in the previous chapters, Honduras lacks a framework ensuring that performance monitoring results of priority public policies and national plans are discussed at the decision-making level and analysed in terms of lessons learned, bottlenecks and implementation delays, so that they inform the decision-making process.
Regarding the lack of a general long-term framework for monitoring and evaluation, it is necessary to consider recent reforms implemented in Honduras. Indeed, in 2020 Honduras made tangible efforts to institutionalise monitoring and evaluation practices by including, in Legislative Decree 182-2020 (the General Revenue and Expenditure Budget of the fiscal year 2021), specific mandates that make clear to institutional actors when and how to conduct monitoring and evaluation (Table 4.2). As a result of that legal framework, during the first semester of 2021, 11 public institutions prepared and delivered design evaluations of one of their strategic programmes.
Table 4.2. Mandates on monitoring and evaluation contained in Legislative Decree 182-2020 (General Revenue and Expenditure Budget, fiscal year 2021)
| Article | Mandate |
|---|---|
| Article 6 | Mandated the SCGG, through the Presidential Directorate for Monitoring and Evaluation, to monitor the national plans and prepare an annual report on the progress of the global results and the corresponding indicators established in the Strategic Government Plan. This report was expected to be published annually on the SCGG website and presented to the President of the Republic, the Superior Court of Accounts, the Institute for Access to Public Information and the National Congress through the Ordinary Budget Commission. |
| Article 7 | Mandated the SCGG to monitor institutional planning on a monthly, quarterly and annual basis, based on the information reported by public institutions in the SGPR. The monthly reports were expected to be sent to the head of each public institution, while the quarterly reports were expected to be sent to the Deputy Co-ordinators of the sectoral cabinets and the Secretariat of Finance. |
| Article 8 | Mandated the SCGG, through the Presidential Directorate for Monitoring and Evaluation, to prepare a quarterly report containing the synthesised results of the ex ante evaluations conducted by 10 public institutions on one of their strategic programmes. The report was expected to be sent quarterly to the head of each public institution and the Deputy Co-ordinators of the sectoral cabinets. |
| Article 9 | Mandated all public institutions to prepare a quarterly report on the physical and financial execution of their annual operating plan and budget. The report was expected to be sent quarterly to the Secretariat of Finance. |
Source: Author’s own elaboration based on (Congreso Nacional de Honduras, 2020[9]).
Although these efforts are steps in the right direction towards implementing a legal framework for monitoring and evaluation and promoting a culture of evaluation across government, they have important limitations. First, the mandates included in the Legislative Decree ordered the SCGG to send a copy of the reports to a number of representatives across and outside government (e.g., the president, the Superior Court of Accounts, the National Congress). However, rather than sending copies of long and generic monitoring reports to such authorities, it is important to create communication channels and instances with decision makers around priority public policies and key government areas to ensure that performance evidence is used to inform decision making. It is also important to prepare fit-for-purpose monitoring analyses that give users quick and easy access to clear monitoring results, which can translate into better uptake of the outcomes in decision making. Second, the mandates are valid only for the short term, as they apply to a single fiscal year (2021). Third, and relatedly, the mandates on monitoring and evaluation included in the General Revenue and Expenditure Budget Law are subject to changing political willingness, as their renewal depends on the political environment and a political consensus in each fiscal year.
The government of Honduras could benefit from integrating the monitoring and evaluation legal framework into the planning system/performance framework, rather than providing monitoring and evaluation mandates through the annual General Revenues and Expenditure Budget law. By setting the monitoring and evaluation legal framework and integrating it into long-term policies such as the planning system and performance framework, Honduras would ensure that there exist clear mandates for institutional actors on when and how to carry out these practices beyond the fiscal year, as well as political consensus on the importance of monitoring and evaluation activities for the country beyond electoral mandates (OECD, 2020[3]). Additionally, this integration would ensure that evidence and results from monitoring and evaluation activities are used not only for reporting and accountability purposes but also as inputs for the decision-making process.
Macro-level guidance for monitoring and evaluation could be developed further
The existence of a legal framework for monitoring and evaluation is not sufficient to sustain a robust monitoring and evaluation system. It is also important to have macro-level guidelines to support the implementation of monitoring and evaluation across government. Such guidelines generally intend to assist all those participating in the implementation of a policy in better planning, commissioning and managing its monitoring and evaluation activities. For instance, guidelines for evaluation mostly refer to the reporting of evaluation results, followed by the identification and design of evaluation approaches, quality standards for evaluations, and the use of evaluation evidence (OECD, 2020[3]). Evidence shows that the majority of OECD countries (26 out of 35) have guidelines to support the implementation of evaluation across government (OECD, 2020[3]).
Honduras has guidelines to assist public institutions in planning, implementing, and managing monitoring and evaluation, including the guidelines for the formulation and approval of public policies, and the methodological guides on design evaluation, impact evaluation and results evaluation. These guidelines are published in the evaluation repository of the SGPR and were communicated to public institutions in specific training courses designed by the SCGG working in tandem with the School for Senior Management in the Public Administration (Escuela de Alta Gerencia Pública). The existence of guidelines and manuals in Honduras shows that there is a general understanding of their importance to assist policy makers in conducting monitoring and evaluation successfully.
However, these guidelines do not cover some essential exercises, such as the monitoring of the Country Vision 2010-38, the Nation Plan 2010-22 and the Strategic Government Plan 2018-22. A robust monitoring and evaluation system in Honduras may therefore need additional guidelines on monitoring that clearly state the actors involved, their mandates, and the timeline, tools and methodology for monitoring. Guidelines on monitoring should also clarify how monitoring activities for the different national plans and the institutional plans articulate with one another.
Additionally, more detailed manuals on evaluation practices could be developed with the involvement of sectoral stakeholders. Conscious of the limited competencies to carry out evaluation within the secretariats of state, the Presidential Directorate for Monitoring and Evaluation started developing more detailed manuals on evaluation practices. This process could have benefited from the comments and suggestions of the secretariats of state, which could have provided insights into the main challenges and weaknesses of implementing the guidelines in their specific sectors. Encouraging the co-production of these more detailed manuals between the CoG institution responsible for monitoring and evaluation and representatives from the secretariats of state may also be an opportunity to raise awareness of the importance of evaluation and create a sense of ownership across government.
Moreover, article 1 of Executive Decree PCM-025-2018 specified that the Presidential Directorate for Monitoring and Evaluation of the SCGG was expected to evaluate the results of the national and institutional plans, sustainable development objectives, public policies, programmes and projects. However, these are large activities that cannot all be carried out within a single year. Conducting a proper evaluation requires time and significant resources, and – most importantly – needs to be supported by a clear methodology (OECD, 2021[2]).
To that end, Honduras has already begun to implement a more focused approach to evaluation by selecting one programme to be evaluated each year. Defining a limited number of evaluations to be carried out in a given year is considered good practice, given that proper evaluation demands time and significant resources. For instance, since 2018 the Presidential Directorate for Monitoring and Evaluation has commissioned four external evaluations of specific strategic programmes and projects, including the programmes CONVIVIENDA (With House), Con Chamba Vivís Mejor (With Work You Live Better) and Vida Mejor (Better Life).
However, the CoG institution responsible for monitoring and evaluation could further develop this focused approach by clearly defining and communicating an annual evaluation agenda and developing a specific timeline for evaluations.
Promoting the quality of monitoring and evaluation processes
Performance indicators need to be improved as a first step towards producing robust monitoring evidence
Monitoring a policy, programme or project implies identifying indicators that are methodologically robust. For indicators to provide decision makers with information that can be used to define what course of action to take to achieve the intended policy objectives, they should be accompanied by information that allows for their appropriate interpretation (OECD, 2021[2]). Regardless of their type, all indicators should be presented in a way that provides the following information (see the illustrative sketch after this list):
description of the indicator: name, unit of measurement, data source and formula
responsibility for the indicator: institution, department, or authority responsible for gathering and reporting the data
frequency of data collection and update of the indicator
baseline that serves as a starting point to measure progress
target or expected result.
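As a purely illustrative sketch – not an existing Honduran system – the elements above could be captured in a simple metadata record such as the one below, where all field names and values are hypothetical. The `progress()` method also illustrates the standard use of a baseline and a target: progress is the share of the baseline-to-target distance achieved so far.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    """Metadata record for a performance indicator (illustrative only)."""
    name: str                      # description: name of the indicator
    unit: str                      # unit of measurement
    data_source: str               # where the underlying data come from
    formula: str                   # how the value is calculated
    responsible: str               # institution/department gathering and reporting the data
    frequency: str                 # e.g. "monthly", "quarterly", "annual"
    baseline: float                # starting point to measure progress
    target: float                  # expected result
    latest_value: Optional[float] = None

    def progress(self) -> Optional[float]:
        """Share of the baseline-to-target distance achieved so far (0.0 to 1.0)."""
        if self.latest_value is None or self.target == self.baseline:
            return None
        return (self.latest_value - self.baseline) / (self.target - self.baseline)

# Hypothetical example: a school-completion indicator
completion = Indicator(
    name="Secondary school completion rate",
    unit="% of cohort",
    data_source="Secretariat of Education administrative records",
    formula="graduates / cohort size * 100",
    responsible="UPEG, Secretariat of Education",
    frequency="annual",
    baseline=45.0,
    target=60.0,
    latest_value=51.0,
)
print(f"{completion.progress():.0%} of the way from baseline to target")  # prints "40% ..."
```

A record of this kind makes the indicator self-documenting: anyone consulting it can interpret the value, reproduce the calculation and know whom to contact about the data.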
In the case of Honduras, the country could still improve the indicators of its national and institutional plans by developing a mix of sound indicators that includes both process and outcome/impact indicators, allowing it both to monitor the implementation of policies/programmes and to measure the real effects of the government’s initiatives. Indeed, process indicators and output/outcome indicators are complementary, in the sense that they allow monitoring of different objectives. Process indicators are useful and recommended for tracking programme implementation and for accountability purposes, since they provide regular flows of information on the implementation of a programme/plan. Output/outcome indicators, meanwhile, are useful to improve high-level decision-making processes by providing information on whether the programme is achieving its intended effects.
At the national level, indicators of Country Vision 2010-38, Nation Plan 2010-22 and the Strategic Government Plan 2018-22 fulfil practically all criteria of a sound indicator, but specific improvements could be considered:
The indicators in the Country Vision 2010-38 and Nation Plan 2010-22 are explicitly stated and include information on the data source, the baseline, and the target values for 2013, 2017, 2022 and 2038, as well as on the institution responsible for collecting and updating the indicator. However, indicators could be improved by clearly stating the unit of measurement and formula for their calculation.
The indicators in the Strategic Government Plan 2018-22 are explicitly stated and include information on the baseline and the target values for 2018, 2019, 2020, 2021 and 2022. However, indicators could be improved by explicitly stating the institution or person responsible for collecting the data and updating the information, and by including the unit of measurement and the formula for their calculation.
At the institutional level, secretariats of state struggle to set key performance indicators and mainly use process indicators (Box 4.5). However, Honduras could benefit from having a mix of process indicators, calculated based on the information collected monthly in the SGPR, and outcome/impact indicators, calculated on the basis of administrative data or even ad hoc perception survey data. Additionally, institutional plans with key performance indicators should be public and communicated to key stakeholders, promoting both transparency and accountability.
Box 4.5. Typology of governance indicators
Policy makers must continuously decide what elements of a policy should be monitored and how these can be tracked through various indicators. A typology of governance indicators distinguishes between:
Input indicators measure the quantity and type of resources, such as staff, money, time, equipment, etc., that the government invests to attain a specific public policy.
Process indicators refer to the actual processes employed, often drawing on assessments of their effectiveness by the individuals involved in the policy.
Output indicators refer to the quantity, type and quality of goods or services produced by the government’s policy. They can include operational goals such as the number of meetings held.
Outcome/Impact indicators measure the strategic effect and change produced by the policy implemented. Outcome indicators commonly refer to short-term or immediate effect, while impact indicators refer to long-term effect.
Source: (OECD, 2020[4]).
Additionally, Honduras could benefit from developing a systematic framework to link institutional indicators with national priority goals and the strategic lines of national plans. Developing performance indicators, their baselines and targets is an important stage in institutional planning and the identification of policy priorities (OECD, 2021[2]). Article 1 of Decree 266-2013 established that the SCGG was responsible for defining mechanisms and procedures to monitor and evaluate the government's management results. However, there is still no explicit or systematic framework for the design of monitoring and evaluation indicators.
Indeed, as analysed in Chapter 3, there was a lack of systematic linkage between the national plans (Country Vision 2010-38, Nation Plan 2010-22, and Strategic Government Plan 2018-22) and the institutional plans. This makes it hard for stakeholders to monitor progress in terms of national priority goals and strategic lines, or to understand how institutional plans contribute to strategic plans. While secretariats of state have identified a set of indicators in their own plans, these are not presented in a way that clearly indicates their connection with elements of the Country Vision 2010-38 (national priority goals) and the Nation Plan 2010-22 (strategic lines).
Therefore, explicitly linking each indicator to national priority goals would be essential to clarify the monitoring structure of the national plans. This link could be made visually in the institutional planning documents. The exercise should be undertaken by the CoG institution responsible for monitoring and evaluation, together with the different UPEGs, to inform secretariats of state of the national priority goals and strategic lines they contribute to. Such an analysis would also benefit from linking regularly updated output indicators, which show how the administration is performing, to the outcome-level objectives included in the Country Vision.
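For illustration only, the sketch below shows one way such an explicit link could be recorded and used to roll institutional indicators up to national priority goals; the goals, strategic lines and indicators named here are hypothetical, not those of the actual plans:

```python
from collections import defaultdict

# Illustrative only: each institutional indicator carries an explicit reference
# to the national priority goal and strategic line it contributes to.
institutional_indicators = [
    {"indicator": "Rural households with electricity", "secretariat": "Energy",
     "national_goal": "Goal 3: A productive Honduras", "strategic_line": "Infrastructure"},
    {"indicator": "Secondary school completion rate", "secretariat": "Education",
     "national_goal": "Goal 1: Poverty eradication", "strategic_line": "Human development"},
]

# Roll indicators up by national priority goal, so progress can be monitored
# at the level of the Country Vision rather than institution by institution.
by_goal = defaultdict(list)
for ind in institutional_indicators:
    by_goal[ind["national_goal"]].append(ind["indicator"])

for goal, indicators in sorted(by_goal.items()):
    print(goal, "->", indicators)
```

Recording the link as a structured field, rather than only in prose, is what makes it possible to aggregate and filter indicators by goal or strategic line across all secretariats of state.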
Honduras could consider implementing initiatives to overcome the lack of sufficient data and the difficulties in accessing information
A good monitoring and evaluation system relies on comprehensive, multi-source and high-quality data (Box 4.6) that are readily available and in a format that is easy to use as part of the evaluation process. Indeed, implementing an evidence-informed agenda implies leveraging the data available for analytical purposes as part of the monitoring and evaluation process (Mathot and Giannini, 2022[10]). Policy evaluation, for instance, can be hindered by the lack of adequate, easy-to-use data. In this sense, a high-quality national statistics system and up-to-date databases and registers that communicate with one another and disaggregate data at the desired level are an integral part of a robust monitoring and evaluation system.
Box 4.6. Potential sources of quality data for policy evaluation
Important quality data sources for policy evaluation are:
Administrative data – This type of data is generally collected through administrative systems managed by government departments or ministries, and usually involves whole sets of individuals, communities and businesses that are concerned with a particular policy. Examples of administrative data include housing data and tax records.
Statistical data – This type of data is commonly used in research and corresponds to census data or more generally to information on a given population collected through national or international surveys.
Big data – This type of data is broadly defined as “a collection of large volumes of data” (UN Global Pulse, 2016[11]). Mainly drawn from a variety of sources such as citizen inputs and the private sector, big data are most often digital and continuously generated. They have the advantage of coming in greater volume and variety, and thus represent a cost-effective method to ensure a large sample size and the collection of information on hard-to-reach groups.
Evaluation data – This type of data is collected for the purpose of evaluation. It can take the form of qualitative questionnaires, on-site observations, focus groups, or experimental data.
Combining different data sources has the potential to unlock relevant insights for policy evaluation.
Source: (OECD, 2020[3]; UN Global Pulse, 2016[11]).
In Honduras, there is no integrated data infrastructure to facilitate access to, or the sharing of, administrative data horizontally among secretariats of state, a situation that creates data silos and prevents evaluators from accessing relevant data or information for their own analytical purposes (a challenge highlighted in Chapter 3). As a result, secretariats of state that operate in similar and complementary sectors cannot easily share data and information, and are not necessarily aware of all the data that exist and could be used in evaluation.
Aware of these limitations, Honduras may consider implementing initiatives to avoid fragmentation and duplication of efforts across secretariats of state (e.g., each developing its own separate data-sharing infrastructure) and to promote public sector integration and cohesion. To do so, Honduras may consider starting by carrying out a comprehensive data inventory that accounts for all data assets created and collected by secretariats of state, and developing a strategy to encourage systematic access to, and use of, administrative data. The United States, for example, has institutionalised and implemented a more systematic structural approach to facilitate evidence-informed policy making (Box 4.7).
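As a minimal sketch of what such a data inventory might contain – assuming one simple record per data asset, with entirely hypothetical entries and field names:

```python
# Illustrative sketch of a whole-of-government data inventory: one record per
# data asset, so secretariats can discover what already exists before building
# new collections. All entries and field names are hypothetical.
data_inventory = [
    {"asset": "Household registry",
     "custodian": "Secretariat of Social Development",
     "update_frequency": "monthly",
     "access_level": "restricted",      # e.g. open / restricted / confidential
     "data_dictionary": None},          # link to documentation, if any
    {"asset": "Public investment projects",
     "custodian": "SNIPH",
     "update_frequency": "quarterly",
     "access_level": "open",
     "data_dictionary": None},
]

# A simple discovery query: which assets are refreshed at least quarterly?
frequent = [a["asset"] for a in data_inventory
            if a["update_frequency"] in ("monthly", "quarterly")]
print(frequent)  # ['Household registry', 'Public investment projects']
```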
Box 4.7. The Foundations for Evidence-Based Policymaking Act in the United States
The Foundations for Evidence-Based Policymaking Act of 2018 in the United States was signed and enacted into law on 14 January 2019. The Evidence Act aims to have federal agencies better acquire, access and use evidence to inform decision making, and to ensure that the necessary data quality and review structures are in place to support the use of administrative data in evaluations.
The Evidence Act incorporates the Open Government Data Act, which requires agencies to publish information online as open data, using standardised and machine-readable data formats. The Evidence Act also emphasises co-ordination to advance agencies’ data management and data access functions by mandating an open government approach to data. The website Data.gov, launched in 2009 and managed by the US General Services Administration, provides access to government datasets on a wide range of topics. The US General Services Administration must maintain a “Federal Data Catalogue” as an online point of entry dedicated to sharing agency data assets with the public. The Office of Management and Budget is preparing additional guidelines for open data access and management and for data access for statistical purposes.
Agencies are requested to develop and maintain a comprehensive data inventory that accounts for all data assets created and collected by them. The Office of Management and Budget has established an Advisory Committee on Data for Evidence Building at the federal level to review, analyse and make recommendations on how to promote the use of federal data for evidence building and how to facilitate data sharing and data linkage.
Source: (Mathot and Giannini, 2022[10]).
Moreover, Honduras has several information systems, including the Integrated Financial Management System (Sistema de Administración Financiera Integrada, SIAFI), the Presidential System for Results-Based Management (SGPR), and the National Public Investment System (Sistema Nacional de Inversión Pública, SNIPH), among others. To ensure that relevant data can be compared and combined across sources to support better-informed decision making and public policies, Honduras could promote interoperability across existing and new information systems within the public sector. Interoperability refers to the ability of different information systems to connect, work together and communicate with one another in a co-ordinated way. By enabling system interoperability, Honduras would ensure that information systems can communicate and share data more effectively, strengthening decision and policy making and improving monitoring and evaluation activities. Such a recommendation is aligned with the 2021 OECD Recommendation of the Council on Enhancing Access to and Sharing of Data, which recommends that Adherents “foster where appropriate the findability, accessibility, interoperability and reusability of data across organisations, including within and across the public and private sectors” (OECD, 2021[12]).
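A minimal sketch of the idea, under hypothetical record layouts (these are not the actual SIAFI or SIGPRET schemas): records from two systems are normalised into one shared exchange format so budget execution and plan progress can be analysed together.

```python
# Illustrative records from two systems; field names are hypothetical.
siafi_record = {"cod_programa": "PRG-014", "ejecutado": 1_250_000, "anio": 2021}
sigpret_record = {"programme_id": "PRG-014", "goal_progress_pct": 72, "year": 2021}

def to_common(programme_id: str, year: int, **fields) -> dict:
    """Normalise a system-specific record into a shared exchange schema."""
    return {"programme_id": programme_id, "year": year, **fields}

common = [
    to_common(siafi_record["cod_programa"], siafi_record["anio"],
              budget_executed=siafi_record["ejecutado"]),
    to_common(sigpret_record["programme_id"], sigpret_record["year"],
              goal_progress_pct=sigpret_record["goal_progress_pct"]),
]

# Join the two views on (programme_id, year) for a combined picture of
# financial execution and physical progress.
merged: dict = {}
for rec in common:
    key = (rec["programme_id"], rec["year"])
    merged.setdefault(key, {}).update(rec)
print(merged)
```

The essential design choice is agreeing on shared keys (here, a common programme identifier and year) across systems; once those exist, records can be joined regardless of how each system stores them internally.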
Additionally, although the country has a National Statistics System led by the National Institute of Statistics of Honduras, the system has been unable to generate the statistical data that secretariats of state need to carry out programme and policy evaluation. A strong national statistics system is fundamental to improve how the government collects, manages, shares and stores data to make them more useful for evidence-based policy making (Mathot and Giannini, 2022[10]).
The creation of an office for statistical analysis and evaluation of indicators within DIGER, which works with the UPEGs in establishing indicators, represents a step in the right direction. In this sense, Honduras may also consider strengthening its national statistics system by placing statistical officers within the UPEGs of the secretariats of state. The role of such statistical officers would be to advise on statistical policy, techniques and procedures throughout the policy cycle and to help guarantee that the data needed in the evaluation process are systematically collected from the beginning of an intervention. Statistical officers would also regularly collaborate and consult with the National Institute of Statistics of Honduras to make sure that national statistical data meet the purposes and needs of secretariats of state in conducting evaluation. Considering that statistical officers require high-level, specialised skills, Honduras could also consider developing close co-operation between the National Statistics System and academia to guarantee that the right competencies in the statistical and economic domains are being developed in future graduates.
Quality assurance mechanisms and quality control mechanisms could be further developed
Quality assurance mechanisms seek to ensure that the findings of an evaluation are based on an objective and defensible interpretation of the results, and relate to the original objectives of the evaluation (HM Treasury, 2020[13]). Quality control mechanisms seek to ensure that the evaluation design, planning and delivery have been properly conducted to meet predetermined quality criteria (OECD, 2021[2]). Most OECD countries (24 out of 35) have one or several mechanisms in place to promote the quality of evaluations (OECD, 2020[3]).
In Honduras, the Presidential Directorate for Monitoring and Evaluation implemented actions to promote the quality of monitoring reports and evaluations, including by developing methodological guidelines aimed at addressing both the technical quality and the good governance of monitoring and evaluation. These guidelines assist with design, impact and results evaluation, and include advice on the formulation and approval of public policies.
Considering that quality assurance mechanisms are focused on evaluation practices rather than performance monitoring, Honduras could issue additional guidelines to clarify the working methods and tools that will support monitoring practices across government. These guidelines could also specify quality assurance processes in the context of the monitoring exercise that should be applied by every secretariat of state.
Additionally, Honduras does not have quality control mechanisms for its evaluations, such as peer reviews of the evaluation product, meta-evaluations, self-evaluation tools and checklists, or audits of the evaluation function. OECD data show that quality control mechanisms are much less common than quality assurance mechanisms, with only approximately one-third of countries surveyed using them (OECD, 2020[3]). However, these mechanisms are fundamental to ensure that evaluation reports and evaluative evidence meet a high-quality standard. In this sense, Honduras could develop one or several control mechanisms among the ones presented below.
The most common quality control mechanism used by countries is the peer review process. Peer reviews consist of a panel or reference group, composed of external or internal experts, that subjects an evaluation to an analysis of its technical quality and substantive content (OECD, 2020[3]). The peer review process helps determine whether the evaluation meets adequate quality standards and therefore can be published. In Honduras, the CoG institution responsible for monitoring and evaluation could consider submitting its evaluations for peer review by experts (for instance, academics and international experts) before they are published.
Countries have also developed tools aimed either at the evaluators themselves (i.e., self-evaluation) or at the managing and/or commissioning team (e.g., quality control checklists) to help them verify whether their work meets the appropriate quality criteria (OECD, 2020[3]). Self-evaluation is a critical review of project/programme performance by the operations team in charge of the intervention. Quality control checklists, meanwhile, standardise quality control practices for evaluation deliverables and as such can be useful to evaluation managers, commissioners, decision makers or other stakeholders when reviewing evaluations against a set of predetermined criteria (Stufflebeam, 2001[14]). The CoG institution responsible for monitoring and evaluation could consider designing such a checklist to help the secretariats of state and itself control the quality of their work. Examples such as the Polish Ministry of Infrastructure and Development’s self-assessment checklist (Box 4.8) show how self-evaluation checklist initiatives can be implemented to foster the technical quality of evaluations; a minimal illustrative sketch follows the box.
Box 4.8. The Polish Ministry of Infrastructure and Development’s self-assessment checklist
This self-assessment checklist, presented in the national guidelines on evaluation of cohesion policy for 2014-20, aims to prevent recommendations from poor-quality evaluations from being implemented. The checklist, which is also used by evaluation units at the regional level, is one of the components of meta-evaluations, focusing on the skills and practices of the evaluators rather than the evaluation more broadly. It includes criteria such as the extent to which the objectives were achieved, the methodology used, and data reliability. Each criterion is given a numerical rating that can be supplemented with qualitative comments.
Source: (OECD, 2021[15]).
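As an illustration of how such a checklist could operate in practice – the criteria, ratings and threshold below are hypothetical, not those of the Polish checklist:

```python
# Illustrative sketch of a quality-control checklist for an evaluation
# deliverable, loosely inspired by the self-assessment approach in Box 4.8.
# Criteria, ratings and the pass threshold are all hypothetical.
checklist = {
    "objectives_achieved": 4,        # each criterion rated 1 (poor) to 5 (excellent)
    "methodology_sound": 3,
    "data_reliable": 5,
    "recommendations_actionable": 2,
}

average = sum(checklist.values()) / len(checklist)
THRESHOLD = 3.0  # minimum average score before recommendations are implemented

print(f"Average quality score: {average:.1f}")
if average >= THRESHOLD:
    print("Evaluation meets the quality bar; recommendations may proceed.")
else:
    print("Evaluation needs revision before its recommendations are used.")
```

Numerical ratings can be supplemented with qualitative comments per criterion, as in the Polish example, so that a low score always comes with an explanation the evaluator can act on.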
Finally, OECD data show that Supreme Audit Institutions (SAIs) may also take an active part in promoting evaluation quality (OECD, 2020[16]). SAIs may become key players in the national discourse concerning evaluation quality. Thanks to their particular expertise in performance auditing, they may give governments external insights on how to better manage performance evidence and improve the quality of their evaluation systems. Additionally, SAIs may sometimes perform evaluations themselves, including on systems for managing information and on policy evaluation systems, employing their own standards for quality. In Honduras, although the Supreme Audit Court reported that it has started to carry out performance audits to measure the impact of specific programmes – for instance, related to the Sustainable Development Goals – and audits during budget execution, its focus continues to be ex post compliance, procurement and financial audits.
Considering that the Supreme Audit Court is just starting to conduct performance audits itself, Honduras could consider encouraging the implementation of such audits, which allow analysis of the efficiency and effectiveness of key public policies and programmes. To that end, the Supreme Audit Court could consider including in its annual audit plan a minimum number of performance audits of specific policies or programmes that the government considers strategic.
Competencies within the whole of government to strengthen the monitoring and evaluation processes in Honduras need to be further developed
To put in place a monitoring and evaluation system capable of producing credible and relevant data and analyses, the individuals carrying out these activities must have the appropriate skills, knowledge, experience, and abilities. OECD countries are aware of the crucial role of competencies in promoting quality evaluations: survey data show that 13 out of 35 OECD countries use mechanisms to support the development of competencies of evaluators (OECD, 2020[3]).
In Honduras, the Presidential Directorate for Monitoring and Evaluation implemented actions to promote the competencies of individuals carrying out monitoring and evaluation. The SCGG together with the School for Senior Management in the Public Administration developed online and face-to-face training courses aimed at helping develop the competencies to carry out monitoring and policy evaluation across secretariats of state.
However, Honduras may consider further developing appropriate competencies for monitoring. Monitoring requires sufficient resources and capacities to collect data on a regular basis, calculate indicators, analyse data, etc., all of which in turn require a critical mass of trained individuals. In Honduras, these resources were located in the UPEGs and in the Monitoring Division of the Presidential Directorate for Monitoring and Evaluation. Having units dedicated to this function in the secretariats of state is an important step in the mobilisation of resources towards monitoring activities. A further step consists in continuing to strengthen the appropriate competencies of the monitoring units within both the secretariats of state and the CoG institution responsible for monitoring and evaluation; key among these competencies is the ability to define performance indicators that allow the collection of relevant data for different policy priorities and targets, and that link performance information across related single- and multi-sector policies.
Evaluation, for its part, requires the relevant technical skills to conduct it. In Honduras, evaluations were mainly carried out by external evaluators from the private sector or academia, owing to the lack of technical skills and the limited personnel within the secretariats of state and the Evaluation Division of the Presidential Directorate for Monitoring and Evaluation. Additionally, there are recurring difficulties in hiring personnel with the appropriate competencies for evaluation, mainly because civil service human resource rules make it difficult and time-consuming to hire specialised staff from outside the civil service.
To strengthen the competencies for monitoring and remedy the lack of skills and personnel to conduct high-quality evaluations, Honduras needs to continue implementing mechanisms that support the development of appropriate competencies for both practices. Indeed, several interviewees stressed that Honduras was facing challenges in attracting and developing, within the SCGG and secretariats of state, the competencies needed to conduct in-house monitoring and evaluation activities. In order to ensure the technical quality of the results of these activities, the CoG institution responsible for monitoring and evaluation may wish to implement different complementary mechanisms, including the following:
First, Honduras could continue developing and implementing the online and face-to-face training courses that the School for Senior Management in the Public Administration began offering in 2018. These courses have made it possible to train hundreds of individuals across the different secretariats of state; create the basis for conducting in-house monitoring and evaluation; and start building a coherent understanding of the monitoring and evaluation system in Honduras.
Second, Honduras may consider developing specific training courses that complement the existing general ones. Indeed, secretariats of state also require specific training courses that allow them to address the particular challenges that arise from the specificity of their sectors. In this sense, the CoG institution responsible for monitoring and evaluation and the School for Senior Management in the Public Administration could create evaluator-training curricula at the level of individual secretariats of state, allowing evaluators to deepen their knowledge of the evaluation of a specific policy topic relevant to their particular sector.
Finally, another way to develop the competencies of evaluators is to foster knowledge-sharing networks. According to OECD data, a frequently used quality assurance mechanism is the establishment of a network of evaluators for exchanging practical and technical experiences related to evaluation (OECD, 2021[2]), such as the Cross-Government Evaluation Group in the United Kingdom (Box 4.9). The CoG institution responsible for monitoring and evaluation could consider strengthening its role as an evaluation champion in Honduras by creating a network of evaluators that connects those responsible for monitoring and evaluation within the UPEGs across the different secretariats of state with academics, the private sector and the international community.
Box 4.9. The Cross-Government Evaluation Group in the United Kingdom
In the United Kingdom, evaluators within government departments have created informal mechanisms to exchange information across government on monitoring and evaluation practices.
The Cross-Government Evaluation Group is a cross-departmental and cross-disciplinary group consisting of people responsible for evaluation within the different government departments. The objective of this Group is to improve the supply of, stimulate the demand for, and encourage the use of good-quality evaluation evidence in government decision making by sharing good practices and solutions for common problems, and working together on joint projects. For instance, the Group steered the rewriting of the Magenta Book in 2020, one of the main guidelines on what to consider when designing an evaluation. The Cross-Government Evaluation Group meets around five times per year and is currently chaired by the Head of Evaluation of the Department for Transport.
Source: Author’s own elaboration.
In addition, the CoG institution responsible for monitoring and evaluation could consider developing quality standards for outsourcing and commissioning policy evaluations. Currently, Honduras relies mainly on external evaluators' competencies to conduct evaluations, owing to the lack of internal competencies. Given this, and given that developing the appropriate competencies to conduct in-house evaluations requires time, the CoG institution responsible for monitoring and evaluation could define quality standards to be included in the terms of reference (ToRs) for outsourcing and commissioning policy evaluations to external stakeholders. The ToRs provide the guidelines for the work to be carried out during the evaluation process and therefore constitute an essential tool for quality assurance (OECD, 2020[3]). The CoG institution responsible for monitoring and evaluation could also develop additional guidelines specifying that ToRs should be drafted by the evaluation manager (OECD, 2020[3]), and ensuring that ToRs clearly outline the purpose, objectives, criteria and key questions for the evaluation.
Promoting the use of monitoring and evaluation results and evidence
One of the main goals of monitoring and evaluation is to support decision making with useful insights on public issues and evidence on the impact of policies and their underlying change mechanisms (OECD, 2020[3]). Despite the many potential users, the use of monitoring and evaluation results remains a constant challenge and often falls short of expectations.
As of November 2021, monitoring of national and institutional plans in Honduras took place through the following processes:
Monthly, the Planning and Evaluation Management Units of the different secretariats of state reported progress in implementing their institutional plans and the corresponding indicators of the National Plan in the SGPR. This process was supported by the sectoral cabinets, which directly co-ordinated the reporting process of the institutions in their corresponding sector. However, these exchanges did not include discussions of policy performance.
Twice a year, the Presidential Directorate for Monitoring and Evaluation, with the support of the sectoral cabinets, prepared a monitoring report on the National Plan, based on the information sent by the different institutions. As part of the report, the Presidential Directorate for Monitoring and Evaluation identified the main challenges in implementing the Plan, as well as issues to consider to improve management in the immediate future.
However, as highlighted in Chapter 3, there was a disconnect between the planning system (monitoring for reporting and accountability) and the decision-making process carried out at the centre of government by the president, secretaries of state, heads of public institutions, etc. Although a system for monitoring presidential goals (Monitoreo de Metas Presidenciales) existed, it was disconnected from the regular planning system (Country Vision 2010-38 and Nation Plan 2010-22), making it difficult to use monitoring results in decision making beyond reporting and accountability purposes. These challenges need to be considered in the establishment of new planning, monitoring and evaluation structures.
Honduras could improve the publicity and communication of monitoring and evaluation results
The SCGG communicated the different results of monitoring and evaluation activities both internally and externally by sharing its monitoring and evaluation reports with internal and external stakeholders. Communication of monitoring and evaluation results included the following:
According to Legislative Decree 182-2020, the SCGG was expected to publish the report on the progress of the Country Vision 2010-38 and the Nation Plan 2010-22 on its website (since 2012, this report has been prepared and published every two years, rather than annually as initially established in the Legislative Decree). The SCGG was also expected to submit these reports to the President of the Republic, the Supreme Audit Court, the Institute for Access to Public Information and the National Congress (through the Ordinary Budget Commission), to directly inform them of the plans' implementation progress.
The SCGG prepared and published annual reports on the implementation of the corresponding Government Plan.
The SCGG was expected to send quarterly reports containing synthesised results of ex ante evaluations carried out by government departments to the Deputy Coordinators of the sectoral cabinets and the head of each government department.
All the reports concerning the monitoring and evaluation of public policies were expected to be published by the SCGG in the Presidential System for Results-Based Management, accessible to the public and policy makers.
However, in order to promote the use of these results, evidence should not only be accessible to the public and policy makers, but also presented in a strategic way, driven by the purposes of monitoring and evaluation and the needs of intended users (OECD, 2021[2]). Evidence shows that tailored and contextualised syntheses, seminars and advice from knowledge brokers and researchers seem to be the most promising means of improving access to evidence (OECD, 2020[17]).
To tailor evidence and results to different publics, the CoG institution responsible for monitoring and evaluation could consider developing a communication and dissemination strategy that adapts the way monitoring reports and evaluation findings are presented to their potential users (policy makers, civil servants, high-level decision makers, the National Congress, citizens, academia, etc.). Such a strategy could include the use of infographics; tailored syntheses of evidence (e.g. in the form of policy briefs or executive summaries); seminars to present the main findings of evaluations; and “information nuggets” and fragments of storytelling that can be disseminated through social media accounts to spread the main messages of key policy and evaluation reports (OECD, 2020[17]). This strategy could also cover recommendations arising from the strategic evaluations conducted by the CoG institution responsible for monitoring and evaluation (presented in the section “Macro-level guidance for monitoring and evaluation could be developed further”). Indeed, setting a specific methodology to communicate the results of these evaluations would make it possible to inform public institutions of the type of formal responses expected from them, improve the implementation of recommendations, and allow follow-up. Countries that have developed such tailored communication and dissemination strategies, which increase access to clearly presented research findings, include Mexico, New Zealand and the United Kingdom (Box 4.10).
Box 4.10. Country examples of tailored communication and dissemination strategies
Mexico – The National Council for Evaluation of Social Policy (CONEVAL) regularly shares on its website infographics that summarise, with brief texts and visuals, evaluation initiatives undertaken by CONEVAL and their results for citizens, as well as executive summaries of the main results of its evaluations.
New Zealand – The Ministry of Foreign Affairs and Trade has invested efforts in condensing evaluation findings and varying the formats of presentations in order to make the information available for a number of purposes. Knowledge cafes and evaluation workshops are helping not only to share information but also to solicit support from colleagues in problem solving on specific projects or evaluations.
The United Kingdom – The What Works Network, which includes the Education Endowment Foundation, the Early Intervention Foundation and the What Works Centre for Local Economic Growth, produces a range of policy briefs to disseminate key messages to its target audience. The What Works Network aims to support the government and other organisations in creating, sharing and using high-quality evidence to make better decisions for the improvement of public services.
Source: (OECD, 2021[2]); (OECD, 2020[17]); (OECD, 2016[18]).
Embedding monitoring and evaluation results into the policy-making cycle could help Honduras improve the use of evaluation results
The use of evaluations is linked to the existence of organisational structures and systems that enable and encourage the production (supply) and use (demand) of evidence. These structures and systems can be found at the level of specific institutions, such as management response mechanisms, or within the wider policy cycle, such as through the incorporation of policy evaluation findings into the budget cycle or discussions of findings at the highest political level (OECD, 2021[2]). Incorporation of evaluation findings in the budgetary cycle is one of the most commonly used mechanisms for promoting the use of evaluations. According to OECD data, 21 OECD countries incorporate findings from evaluations into the budgetary cycle (OECD, 2020[3]).
In Honduras, the National Congress is responsible for approving the General Budget of the Republic, which is prepared every year by the Secretariat of Finance together with government institutions. Every year before 15 September, the Secretariat of Finance sends the budget proposal to the National Congress, where a series of technical discussions between the Budget Office and the Congress Budget Commission takes place. Secretariats of state may also participate in these discussions, justifying exceptional changes to their specific budgets. One way of improving the use of evaluation results in Honduras would be to draw on policy evaluations conducted by the centre of government and the different secretariats of state during these budgetary discussions in Congress to inform budget decisions. For instance, specific policy and programme evaluations could be included as an annex to the main budget document, when relevant.
Additionally, the centre of government could consider systematically holding discussions on evaluation results at the highest political level. The Council of Ministers was already holding six-monthly discussions on the progress reports of the Country Vision 2010-38 and the Nation Plan 2010-22. The Council of Ministers' function could be strengthened if the main findings of the evaluations of strategic programmes and public policies (those prioritised in the annually defined evaluation agenda) were also discussed at this stage, together with the budget proposal or progress reports on the national plans.
Honduras could leverage the evaluation ecosystem that has developed beyond the executive to generate stronger demand for evidence-based decision making
In Honduras, there are institutions beyond the centre of government that can help convey a strong message related to the importance of evidence-based decision making. Firstly, parliaments have a particular role to play in promoting the use of evaluations. They rely on verifiable and sound data on which they can base their policy initiatives and can thus push for the establishment of a structured approach to gather this information (OECD, 2020[3]). Most parliaments have research and information services that help members of parliament order, understand or request evaluation reports.
Honduras could therefore benefit from supporting and empowering members of Congress in their role as users of evidence as part of budget and general discussions. To do so, Honduras may wish to create a specialised unit within the National Congress aimed at providing technical support to members of the Congress as they analyse and use the results of evaluations carried out by the Executive on their main programmes and policies. Countries with specialised offices within congress/parliament that support the appropriate use of evidence by their members include Canada and the United Kingdom (Box 4.11).
Box 4.11. Country examples of specialised offices within parliament that promote the use of evaluation results
In Canada, the Parliament is the recipient of all programme evaluations and results reports produced by government departments. To facilitate access by individual Parliamentarians to these reports, government departments and agencies must list all evaluations undertaken within their Departmental Performance Reports and include a list of all planned evaluations in their Reports on Plans and Priorities. Additionally, the Parliamentary Budget Office provides independent, non-partisan financial and economic analysis of the government's actions, raising the quality of parliamentary debates and promoting budget transparency and accountability.
In the United Kingdom, the Scrutiny Unit within Parliament provides Members of Parliament and Select Committees with support and advice to enable them to better interpret, analyse and scrutinise financial information delivered and/or published by the government. The Scrutiny Unit also assists the House of Commons with the scrutiny of draft bills and co-ordination of the evidence collection of public bill committees.
Source: (Mathot and Giannini, 2022[10]).
Secondly, the Supreme Audit Court of Honduras could contribute to the use of monitoring and evaluation information and results by assessing government entities' use of evidence in decision making, as part of its mandate to audit the effective and efficient use of public assets and resources. For example, the US Government Accountability Office produces reports and recommendations, targeted to both the executive and Congress, on the implementation of the US Government Performance and Results Act, which gives the Office of Management and Budget an important role in disseminating and integrating a results- and performance-based approach to public administration (OECD, 2020[3]).
Moreover, independent institutions responsible for monitoring and evaluating different aspects of the implementation of national plans in Honduras could contribute to the use of monitoring and evaluation results by assessing how evidence informs decisions on the definition of new plans. These institutions include the National Anticorruption Council, responsible for monitoring the transparent use of public resources allocated to implementation of the Nation Plan, and the National Forum of Convergence, a civil society body responsible for independently verifying and monitoring the execution of the Country Vision 2010-38 and the Nation Plan 2010-22 (Decree 286 of 2009). They could play a key role in encouraging the government to formulate a new strategic plan based on evidence, lessons learned and the results of previous evaluations.
Recommendations
This section lists the policy recommendations presented throughout the chapter, aimed at helping the Honduran Government strengthen its monitoring and evaluation culture and promote the use of evidence and results in decision and policy making.
Building a sound institutional framework for monitoring and evaluation in Honduras
Develop and adopt a sound and robust legal framework for the whole of government to guide and undertake monitoring and evaluation activities across government. Such a legal framework could be developed within the broader planning system/performance framework and should include:
Clear and comprehensive definitions of monitoring and evaluation, with information on the objectives and advantages of these activities.
Clear mandates for specific institutional actors on when and how to conduct monitoring and evaluation activities.
Clear mandates for the Secretariat of Finance in the promotion of monitoring and evaluation results as part of budget decision making.
Define an annual evaluation agenda, communicate its findings widely and monitor its implementation. In particular:
Further develop a focused approach to evaluation by clearly defining and communicating an annual evaluation agenda, which specifies how many and which programmes and policies are to be evaluated; the evaluators (what competencies they must have, and whether the evaluations will be carried out by internal or external stakeholders); and when and how the evaluations should be conducted.
Define a specific methodology to communicate the recommendations arising from the evaluations conducted by the centre-of-government institution responsible for monitoring and evaluation, to inform the type of formal responses that are expected from public institutions and to allow follow-up on the implementation of such recommendations.
Develop detailed and tailor-made guidelines and manuals on evaluation practices. In particular:
Develop guidance on monitoring that clearly articulates monitoring activities for the different national plans (Country Vision 2010-38, Nation Plan 2010-22 and Strategic Government Plan 2018-22) and the institutional plans, and that clearly states the actors involved and their mandates, the timeline, and the tools and methodology for monitoring.
Promoting the quality of monitoring and evaluation processes
Secretariats of state should improve the quality of indicators used and data produced for monitoring and evaluation. In particular, secretariats of state should:
Explicitly link each institutional indicator (included in the institutional plan) to at least one national priority goal and strategy (included in the Nation Plan 2010-22 and the Strategic Government Plan 2018-22, respectively), and clarify the coherence between the institutional plan and the national plans.
Strengthen the robustness of the indicators of national and institutional plans by including key background information to facilitate their monitoring and evaluation. Background information should include a description of the indicator (with the formula for its calculations, the unit of measurement and the data source), the body responsible for the collection and reporting of the indicator, the frequency of data collection and update of the indicator, and the baseline and targets.
Carry out a comprehensive data inventory that accounts for all data assets that secretariats of state create and collect, as a first step towards developing a strategy to encourage systematic access to, and use of, administrative data.
Hire statistical officers within the UPEGs, to advise on statistical policy, techniques and procedures throughout the policy cycle and to help guarantee that data needed in the evaluation process are systematically collected from the beginning of an intervention.
Further strengthen methodologies and quality control for monitoring and evaluation across government. In particular:
Issue additional guidelines to clarify the working methods and tools that will support monitoring practices across government.
Develop explicit and systematic quality control mechanisms to ensure that the evaluation design, planning, delivery and reporting are properly conducted and meet predetermined quality criteria. Such mechanisms could include:
Submitting evaluations produced or commissioned by the CoG institution responsible for monitoring and evaluation to peer review by academic or international experts before they are published.
Designing self-evaluation checklists to help evaluators from the secretariats of state and the CoG institution responsible for monitoring and evaluation control the quality of their work.
Strengthen the role of the Supreme Audit Court of Honduras in the promotion of evaluation quality by including in its annual audit plan a minimum number of performance audits of specific policies or programmes that the government considers strategic, and by conducting evaluations of the country's policy evaluation systems.
Further build capacity and strengthen competencies for monitoring and evaluation across government agencies. In particular:
Strengthen competencies in the monitoring units to develop key performance indicators.
Define quality standards to be included in the terms of reference for outsourcing and commissioning policy evaluations to external stakeholders, which allow secretariats of state and the CoG institution responsible for monitoring and evaluation to identify external evaluators with the right competencies for undertaking such evaluations.
Develop competencies to conduct in-house evaluations by continuing to offer online and face-to-face training courses together with the School for Senior Management in the Public Administration, and by developing specific training courses at the level of individual secretariats of state.
Foster knowledge sharing through a network of evaluators that connects those responsible for monitoring and evaluation within the UPEGs across the different secretariats of state with academics, the private sector, and the international community.
Promoting the use of monitoring and evaluation results and evidence
Promote the use of monitoring and evaluation results and evidence in decision making, particularly in the budget negotiation process, for instance by encouraging the use of policy evaluations as part of budgetary discussions in the National Congress. In particular:
Develop a communication and dissemination strategy to adapt the way evaluation findings are presented to their potential users (policy makers, civil servants, National Congress, citizens, academia, etc.). Such a strategy may include the use of infographics, tailored syntheses of evidence (e.g. in the form of policy briefs or executive summaries), seminars to present the main findings of evaluations, “information nuggets” and fragments of storytelling that can be disseminated through social media accounts, to spread the main messages of key policy and evaluation reports (OECD, 2020[17]).
Other actors could also contribute to increasing the use of monitoring and evaluation results. In particular:
Incorporate evaluation results into the budgetary cycle by informing budget decisions with evidence arising from impact and performance evaluations carried out by the CoG institution responsible for monitoring and evaluation and the secretariats of state.
The National Congress could create a specialised unit aiming to provide technical support to members of the Congress to analyse and use the results of evaluations carried out by the CoG institution responsible for monitoring and evaluation and secretariats of state on their main programmes and policies.
The Council of Ministers could discuss evaluation results at the highest political level by systematically holding consultations on the main findings of evaluations conducted by the CoG institution responsible for monitoring and evaluation.
References
[9] Congreso Nacional de Honduras (2020), Presupuesto General de Ingresos y Egresos de la República, Ejercicio Fiscal 2021, https://www.tsc.gob.hn/biblioteca/index.php/varios/973-presupuesto-general-de-ingresos-y-egresos-de-la-republica-ejercicio-fiscal-2021 (accessed on 7 October 2021).
[13] HM Treasury (2020), Magenta Book Central Government guidance on evaluation, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/879438/HMT_Magenta_Book.pdf (accessed on 13 October 2021).
[10] Mathot, A. and F. Giannini (2022), “Evaluation Framework and Practices: A comparative analysis of five OECD countries”, OECD Journal on Budgeting, https://doi.org/10.1787/911cc792-en.
[15] OECD (2021), Better Governance, Planning and Services in Local Self-Governments in Poland, OECD Publishing, Paris, https://doi.org/10.1787/550c3ff5-en.
[2] OECD (2021), Monitoring and Evaluating the Strategic Plan of Nuevo León 2015-2030: Using Evidence to Achieve Sustainable Development, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/8ba79961-en.
[12] OECD (2021), OECD Recommendation of the Council on Enhancing Access to and Sharing of Data, https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0463.
[17] OECD (2020), Building Capacity for Evidence-Informed Policy-Making: Lessons from Country Experiences, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/86331250-en.
[16] OECD (2020), How Can Governments Leverage Policy Evaluation to Improve Evidence-Informed Policy Making? Highlights from an OECD Comparative Study, OECD Publishing, Paris, https://www.oecd.org/gov/policy-evaluation-comparative-study-highlights.pdf (accessed on 7 October 2021).
[3] OECD (2020), Improving Governance with Policy Evaluation: Lessons From Country Experiences, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/89b1577d-en.
[4] OECD (2020), Policy Framework on Sound Public Governance: Baseline Features of Governments that Work Well, OECD Publishing, Paris, https://doi.org/10.1787/c03e01b3-en.
[7] OECD (2019), Open Government in Biscay, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/e4e1a40c-en.
[8] OECD (2018), Draft Policy Framework on Sound Public Governance, http://www.oecd.org/gov/draft-policy-framework-on-sound-public-governance.pdf (accessed on 7 October 2021).
[18] OECD (2016), Evaluation Systems in Development Co-operation: 2016 Review, OECD Publishing, Paris, https://doi.org/10.1787/9789264262065-en.
[1] OECD (2016), OECD Public Governance Reviews: Peru: Integrated Governance for Inclusive Growth, OECD Public Governance Reviews, OECD Publishing, Paris, https://doi.org/10.1787/9789264265172-en.
[6] Secretaría de Coordinación General de Gobierno (2017), Guía Metodológica: Evaluación de Diseño, https://sgpr.gob.hn/SGPR.Admin2019/Repositorio/Details2/73 (accessed on 5 October 2021).
[5] Secretaría Técnica de Planificación y Cooperación Externa (2012), Guía para la Formulación de Indicadores, https://www.yumpu.com/es/document/read/14342669/guia-para-la-formulacion-de-indicadores-seplan (accessed on 5 October 2021).
[14] Stufflebeam, D. (2001), “Evaluation Checklists: Practical Tools for Guiding and Judging Evaluations”, http://www.wmich.edu/evalctr/checklists/.
[11] UN Global Pulse (2016), Integrating Big Data into the Monitoring and Evaluation of Development Programmes, https://www.unglobalpulse.org/wp-content/uploads/2016/12/integratingbigdataintomedpwebungp-161213223139.pdf (accessed on 7 October 2021).
Note
← 1. A policy framework is a document or set of documents that provide strategic direction and guiding principles to the government on a specific sector or thematic area.