Enhancing Public Accountability in Spain Through Continuous Supervision
1. Refining risk assessments for continuous supervision in Spain
Abstract
This chapter provides an overview of continuous supervision in Spain, led by the General Comptroller of the State Administration (Intervención General de la Administración del Estado, IGAE) and the National Audit Office (Oficina Nacional de Auditoría, ONA). The chapter describes the objectives and the risk factors that are the focus of the continuous supervision system (sistema de supervisión continua, SSC). It also offers recommendations for the IGAE and the ONA to strengthen their risk assessment processes in relation to the SSC.
Introduction
Various control, inspection and public audit bodies share responsibility for oversight and accountability at the central government level in Spain. This includes the General Comptroller of the State Administration (Intervención General de la Administración del Estado, IGAE), which provides three levels of control—review of internal controls, continuous monitoring of financial controls and public internal audit. Within the IGAE, the National Audit Office (Oficina Nacional de Auditoría, ONA), plays a critical role as the financial control and internal audit body responsible for the continuous supervision system (sistema de supervisión continua, SSC). Other entities with oversight responsibilities include the General Inspection of Services (Inspección General de Servicios) within line ministries, which is tasked with reviewing controls related to the effectiveness of an entity’s internal processes and procedures (OECD, 2020[1]). In addition, the Court of Audit (Tribunal de Cuentas) is responsible for external audits of the economic and financial activity of public entities (Tribunal de Cuentas, n.d.[2]).
As required by the Public Administration Legal Regulation Act of 2015,1 the IGAE must provide independent oversight when a new public body is created. This involves analysing the rationale for the establishment of the body and assessing possible overlap with existing bodies. In addition, the IGAE must conduct periodic evaluations to determine whether the circumstances justifying the public bodies’ existence are still applicable (OECD, 2014[3]).2 The Act also established the legal framework for the SSC.
This chapter provides an overview of continuous supervision in Spain and the responsibilities of the ONA in relation to the SSC. The ONA first implemented the SSC in 2020, so the processes and methodology are still very much a work in progress. This chapter discusses the ONA’s approach as well as the concept of risk in the context of the SSC, including its focus on “rationality risk” and the risk factors established in regulations that shape the ONA’s assessment. The chapter also offers recommendations for how the ONA could strengthen its risk assessment for continuous supervision, with emphasis on the following issues:
formalising criteria for automated reviews and clarifying the linkage between risk indicators and the strategy for continuous supervision
considering a broader assessment of sustainability, going beyond the focus on financial indicators to the extent its mandate allows
standardising and documenting selection processes for control reviews and how they are used
enhancing the approach to continuous supervision by assessing fragmentation and overlap of entities.
The recommendations in the chapter focus on the risk assessment process, and Chapter 2 covers other aspects of the SSC. The chapter highlights the experiences from different countries, such as the United Kingdom, the Netherlands, Brazil and the United States, to support the IGAE in further developing its model for continuous supervision.
Continuous supervision in Spain’s public sector
The IGAE’s mandate to improve efficiency in the public sector through continuous supervision
The jurisdiction of control for the IGAE’s expanded mandate applies to public sector entities linked to or dependent on the General State Administration (Administración General del Estado) that are classified as autonomous bodies (organismos autónomos) or public business entities (entidades públicas empresariales) (Government of Spain, 2015[4]), among others.3 These entities run the gamut from state-owned corporations to state-run foundations, and vary widely in size, budget, objectives and responsible line ministry. Table 1.1 shows all public entities subject to continuous supervision.
Table 1.1. Public sector entities in Spain subject to continuous supervision in 2017-19
| Legal form of entity | Characteristics | Number of entities |
|---|---|---|
| State trading companies | Public entities owned by the state that engage in commercial activity and operate under commercial law | 140 |
| Consortia affiliated to the General State Administration | Provide public services on a partnership basis | 71 |
| Autonomous bodies | Created to deliver public services with more flexibility and governed similarly to line ministries | 59 |
| Public sector foundations affiliated with the General State Administration | Entities designed to use private sector mechanisms | 36 |
| Other public bodies affiliated with the General State Administration | Public bodies under the General State Administration that do not come under one of the other categories | 31 |
| Unincorporated funds | Financed by the General State Budget | 27 |
| Other public sector entities | Public bodies that do not come under one of the other categories | 22 |
| Public business entities | Business entities providing goods and services that are dependent on central ministries | 13 |
| State agencies | Tend to have greater management autonomy and are often subject to performance-based management contracts with output indicators | 9 |
| Independent administrative bodies | First created in the Public Administration Legal Regulation Act of 2015 (Ley 40/2015 de 1 de octubre de Régimen Jurídico del Sector Público) and have supervisory functions over a particular sector or economic activity | 6 |
| Managing entities and shared service agencies of Social Security | Public bodies affiliated with Social Security | 6 |
| Public university | Provider of higher education services | 1 |
| Total | | 421 |
Source: IGAE, Inventory of Public Sector Entities (Inventario de Entes del Sector Público, INVESPE).
Continuous supervision activities became the central means for the IGAE to fulfil these responsibilities, which included the design and implementation of the SSC. In Spain, continuous supervision activities are defined as follows:
“the set of verifications and analyses, preferably automated, carried out with the purpose of evaluating compliance with the objectives of the continuous supervision system, as well as the specific control actions that, with the same purpose, are carried out in the field of permanent financial control or public audit provided for in Law 47/2003, of 26 November, the General Budgetary Law.” (Government of Spain, 2018[5]).4
The linkage between continuous supervision and the IGAE’s existing mandate is therefore explicitly acknowledged and embedded in regulation, which provides specific guidance on co‑ordination, planning and execution of these complementary activities. In addition, to further define the role of the IGAE in relation to the SSC, the Ministry of Finance and Civil Service (Ministerio de Hacienda y Función Pública) issued a Directive (Orden HFP/371/2018) stipulating the methodology for performing continuous supervision (Government of Spain, 2018[5]). Specifically, the IGAE was tasked with the following responsibilities under the Directive:5
Developing continuous supervision activities as required by legislation.
Planning, performing and evaluating activities in relation to continuous supervision.
Designing and managing an information system accessible by public sector entities subject to continuous supervision and the corresponding line ministries.
Issuing instructions specifying the relevant information requirements, criteria and guidelines to ensure the continuous supervision system functions well.
The National Audit Office and its responsibilities for continuous supervision
The IGAE follows a decentralised operating model, with three central service functions to deliver on its core areas of responsibility at the central government level:
the National Audit Office (Oficina Nacional de Auditoría, ONA), the financial control and internal audit body, which is also responsible for the SSC.
the Public Accounts Office (Oficina Nacional de Contabilidad, ONC), responsible for the planning and management of public accounting.
the Office of Finance and Information Technology (Oficina de Informática Presupuestaria, OIP) which designs and implements the IGAE’s policies on information technology.
The IGAE also has internal control “delegates” (intervenciones delegadas) embedded within line ministries and public sector entities. Delegates act as financial controllers for these government institutions and are responsible for ongoing monitoring of financial controls and public internal audits (IGAE, 2020[6]). Within the IGAE, responsibility for planning, conducting and reporting on the SSC is assigned to the ONA (IGAE, 2020[6]).
While the ONA is the body responsible for co-ordinating and conducting internal audits of public sector entities at central government level, its mandate also includes oversight of control reviews executed by control functions within the IGAE, such as the “delegates” embedded in line ministries or of financial controls over EU funds. It is therefore well-positioned to take a leading role in designing and implementing the SSC. The ONA comprises six divisions to deliver its comprehensive mandate. Currently six members of the ONA staff within the Director’s office are leading the planning, design and implementation of the SSC process. This includes the Director, three auditors, a technician and an IT specialist.
Overview of the SSC and defining the concept of rationality risk
Following the addition of continuous supervision to its mandate, the ONA prepared a strategy, approved in 2018 (ONA, 2018[7]), and a methodology (ONA, 2020[8]) for delivering the SSC. The strategy is based on principles of effective internal control (OECD, 2020[9]) and aligns with the objectives for the SSC in the Public Administration Legal Regulation Act. Figure 1.1 illustrates the key elements of the SSC in Spain.
Risk factors
Directive HFP/371/2018 provides the basis for how the ONA ultimately defines and interprets risk in the context of the SSC (Government of Spain, 2018[5]). The Directive calls for three levels of verification for the ONA to assess public entities with respect to compliance, financial sustainability and relevance. Taken together, these risk factors form the basis of the concept of “rationality of the structures” (racionalidad de las estructuras) of public entities, as defined in the Directive. Through this lens of “rationality,” the ONA interprets risk and shapes its automated reviews and continuous supervision methodology. As defined in law, the SSC is not explicitly intended to identify a broader set of strategic, operational or reputational risks, including fraud or corruption risks, if they fall outside the scope of the rationality concept described in Table 1.2.
Table 1.2. Risk factors underpinning rationality risk
| Risk factor | Description |
|---|---|
| Compliance | The entity complies with laws and regulations |
| Sustainability | The entity demonstrates financial sustainability |
| Relevance | The entity does not duplicate efforts and is the appropriate lead |
Source: OECD interpretation of Official Gazette (Boletín Oficial del Estado), (2018[5]).
For public bodies and state-run foundations, financial sustainability includes an assessment of whether to dissolve the entity, taking into consideration its sources of financing, levels of expenditure and investment, as well as the impact, if any, on the General State Budget (Presupuestos Generales del Estado). For other categories of public sector entities, financial sustainability is understood, according to the Directive, as the entity’s ability to finance current and future expenditure commitments within the limits of applicable rules on public and commercial debt. To establish relevance, the ONA must verify that the reasons for establishing a public sector entity remain, and that the public entity continues to be the most appropriate means for fulfilling the goals entrusted to it. This verification relies, in part, on a review of the entity’s strategic action plan. The ONA also verifies whether the entity continues to deliver the services for which it was created, and assesses possible duplication with other entities that could be better placed to deliver the same services.
Automated reviews
“Automated” reviews are risk assessments that the ONA conducts based on indicators derived from financial and economic data reported by public sector entities to the IGAE, as well as other qualitative information. The reviews are “automatic” in that data is collected and indicators are generated in an Excel spreadsheet using formulas. Automated reviews apply to all the aforementioned entities that fall under the scope of continuous supervision. The ONA collaborated with the Office of Finance and Information Technology (OIP) within the IGAE on the design of the tool that automatically generates financial and economic indicators and ratios in Excel spreadsheets, drawing data from the public financial reporting systems CICEP.red and RED.coa. Entities currently submit data to the IGAE monthly, quarterly or annually in the form of Excel files.
The ONA’s methodology for automated reviews produces a risk score based on the following inputs: 1) self-assessment questionnaires; 2) the ONA’s consideration of other qualitative risk factors that vary by entity; and 3) financial indicators (OECD, 2020[9]). Figure 1.2 shows the weighting the ONA applies to each input in the total risk score for rationality, on a scale from 0 (low) to 3 (high). Entities with overall scores between 0 and 1 are considered low risk, those between 1 and 2 medium risk, and those between 2 and 3 higher risk. The score determines which entities warrant a control review. The final output of the automated reviews is the Automated Actions Report, which communicates the results of the risk analysis.
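To illustrate the mechanics of this scoring, the sketch below combines the three inputs into a single score on the 0-3 scale, using the component weightings cited elsewhere in this chapter (40% for questionnaire responses, 40% for other qualitative risk factors and 20% for financial indicators). The component scores and band thresholds shown here are illustrative assumptions; the ONA’s actual parameters are set out in its own methodology and in Figure 1.2.

```python
# Illustrative weighting of the three automated review inputs into a single
# rationality risk score on a 0-3 scale. The 40/40/20 split reflects the
# weightings cited in this chapter; the ONA's actual parameters are defined
# in its own methodology (and Figure 1.2), so treat these as assumptions.

WEIGHTS = {
    "self_assessment": 0.40,       # self-assessment questionnaire responses
    "other_risk_factors": 0.40,    # qualitative factors (age, expenditure, audit opinion, ...)
    "financial_indicators": 0.20,  # ratios drawn from CICEP.red / RED.coa data
}


def rationality_risk_score(component_scores):
    """Combine component scores (each on a 0-3 scale) into a weighted total."""
    return sum(WEIGHTS[name] * score for name, score in component_scores.items())


def risk_band(score):
    """Map the total score to the low/medium/high bands described above."""
    if score <= 1:
        return "low"
    if score <= 2:
        return "medium"
    return "high"


# Hypothetical entity scores for each input
entity = {"self_assessment": 1.5, "other_risk_factors": 2.0, "financial_indicators": 1.0}
total = rationality_risk_score(entity)
print(f"{total:.2f} -> {risk_band(total)}")  # 1.60 -> medium
```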
Self-assessment questionnaires
Public sector entities are required to submit self-assessment questionnaires to the ONA as part of the SSC. The questionnaire allows the ONA to collect data on the entity’s activities and services delivered, its sources of financing, expenditures and the internal control environment. Entities access and complete the questionnaire using an online form of the CICEP.red or RED.coa reporting systems, depending on the type of entity. The tool used to generate the indicators and ratios from financial reporting data also aggregates the responses to the self-assessment questionnaires along with any supporting documentation provided by the entity. Analysts performing the SSC risk assessment access the responses and any evidence provided by the entities through Excel files. The leadership of each entity, which is subject to additional oversight and legal action, certifies the information that the ONA collects as part of the self-assessment. The ONA reviews responses for internal coherence, and if the entity is selected for a control review, it checks whether the information provided in the questionnaire is reliable and accurate.
Other risk factors
“Other qualitative risk factors,” a term used in the strategy and methodological documentation for the SSC, makes up 40% of the overall risk score for the automated review. These risk factors rely on several sources, including budget data, data from RED.coa and information in IGAE’s AUDINet, which is an application that acts as a central repository for control reports and information about auditing of public accounts. Like the self-assessment questionnaires, these risk factors vary by type of entity. For instance, autonomous bodies and state agencies are subject to the largest number of risk factors, including: how long the agency has been in existence; total expenditure of the entity; the volume of governmental transfers as a percentage of total revenues and income; and the audit opinion of the entity, among other factors. See Table 1.3 for a list of all “other risk factors” for autonomous bodies and state agencies. The ONA gives each of these risk factors a weight, which it uses to calculate an individual risk score. The risk scores are summed for a total valuation of “other qualitative risk factors” for each entity.
Financial indicators
The ONA’s financial indicators cover common areas of good practice in public financial management, such as solvency (the entity’s ability to meet obligations over the long term) and liquidity (its ability to meet current obligations)6 (IPSASB, 2014[10]). The indicators also consider aspects of operational performance, as well as productivity for entities that have commercial activities. For instance, the indicators for state agencies take into account the ability to cover debt commitments (solvency), but also operational and service delivery components (see Table 1.3).
Table 1.3. Financial indicators used to assess state agencies, among others
These metrics are also applicable to autonomous bodies, consortia, unincorporated funds and other public entities
| Solvency | Revenue and Expenditure (%) | Budget management | Liquidity | Activity (Average) |
|---|---|---|---|---|
| S.1 Debt over assets (%) | CREP.1 Tax revenues/Revenue from ordinary activities | PRE.1 Current budgeted expenditure | L.1 Current ratio | A.1 Units completed/planned units (activities) |
| S.2 Surplus from ordinary activities | CREP.2 Transfers/Revenue from ordinary activities | PRE.1B Current budgeted expenditure | L.2 Quick ratio | A.4 Population covered (activities) |
| S.3 Self-financing (%) | CREP.3 VN and PS/Revenue from ordinary activities | PRE.2 Average payment period (days) | | A.5 Waiting time for service (days) |
| S.4 Coverage (%) | CREP.4 Other income/Revenue from ordinary activities | PRE.3 Current budgeted revenue | | B.1 Cost of the activity/number of users (days) |
| S.5 Cash and cash equivalents | CREP.5 Staff costs/Administrative expenses | PRE.3B Current budgeted revenue | | B.2 Actual activity cost/projected cost (activities) |
| | CREP.6 Transfers/Administrative expenses | PRE.4 Average collection period (days) | | B.3 Cost of the activity/equivalent units (euros) |
| | CREP.7 Other expenses/Administrative expenses | | | C.1 Economic indicators (average) |
| | CREP.8 Administrative expenses/Revenue from ordinary activities | | | C.2 Economic indicators (euros) |
Source: (ONA, 2020[8]).
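As an illustration of how indicators of this kind are typically computed, the sketch below shows standard textbook formulas for two liquidity measures comparable to L.1 (current ratio) and L.2 (quick ratio) in Table 1.3. The exact definitions and data sources the ONA applies are set out in its own methodology, so the formulas and figures here are assumptions for illustration only.

```python
# Minimal sketch of two liquidity indicators comparable to L.1 (current ratio)
# and L.2 (quick ratio) in Table 1.3, using standard textbook definitions.
# The ONA's exact formulas and data sources are set out in its own methodology,
# so these calculations and figures are illustrative assumptions only.

def current_ratio(current_assets, current_liabilities):
    """L.1-style indicator: capacity to meet short-term obligations."""
    return current_assets / current_liabilities


def quick_ratio(current_assets, inventories, current_liabilities):
    """L.2-style indicator: liquidity excluding inventories."""
    return (current_assets - inventories) / current_liabilities


# Hypothetical figures (EUR thousands) reported by an autonomous body
print(current_ratio(12_500, 8_000))        # 1.5625
print(quick_ratio(12_500, 2_500, 8_000))   # 1.25
```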
The ONA considers additional metrics for entities that undertake commercial activities, such as state trading companies, public business entities and foundations. For example, as these entities can borrow from commercial lenders, bank borrowings as a percentage of liabilities is included as a financial indicator (see Table 1.4).
Table 1.4. Indicators and ratios for public sector entities undertaking commercial activities
Applicable to state trading companies, public business entities and foundations
| Financial Management (%) | Structure (%) | Productivity (in EUR 000) |
|---|---|---|
| SF.1 Liquidity or acid-test ratio | E.1 Grants/Turnover | P.1 Average staff costs |
| SF.2 Quick ratio | E.3 Grants/Equity | |
| SF.3 Solvency | E.4 Shareholder contributions to equity | |
| SF.4 Guarantees or Coverage | E.5 Related party debt/Capital | |
| SF.5 Fixed assets coverage | E.6 Grants/Operating profit | |
| SF.6 Indebtedness | | |
| SF.7 Long-term debt | | |
| SF.8 Short-term debt | | |
| SF.9 Bank borrowings/Liabilities | | |
| SF.10 Economic performance | | |
| SF.11 Financial performance | | |
Source: (ONA, 2020[8]).
Entities are also expected to provide information on annual financing needs (for entities classified as public administrations under the European System of Accounts), gross operating profit, sources of expenditure and investments or sustainability forecasts. Public bodies and state-run foundations are further required to provide annual expenditure and investment reports. Other entities are required to submit sustainability forecasts or at a minimum, a report on their capacity to finance current and long-term commitments within the applicable constraints on public debt.
Control reviews and evaluation reports
The ONA selects public entities for additional scrutiny through on-site “control reviews,” based on the risk scores calculated during the automated review process, as well as consideration of additional qualitative factors. For instance, the ONA will take into account whether the entity has recently gone through a restructuring process, or whether it failed to submit a self-assessment questionnaire. In such cases, entities are viewed as higher risk and are therefore more likely to be selected for a control review. As part of these reviews, the ONA or its delegates within the entity may assess whether the entity is achieving its objectives and factor this into the final determination. The reviews culminate in an evaluation report that conveys the ONA’s opinion on the rationality of the entity, as defined above, with one of three conclusions:
Maintain—the ONA recommends that the entity is maintained in its current form, with possible recommendations for improvement.
Merge—the ONA recommends that the entity merge with another entity with similar objectives and functions.
Dissolve—the ONA determines that the entity is financially unsustainable and should be dissolved.
The ONA reports annually to the Ministry of Finance on the results of individual actions following the control reviews. The Ministry of Finance, in conjunction with the relevant line ministries, tables the ONA’s recommendations to the Council of Ministers (Consejo de Ministros), which ultimately takes the decision (OECD, 2020[9]). This is a key characteristic of the SSC and its target audience: its effectiveness depends on the judgement and decisions of the political leadership who are responsible for the institutional arrangements of government and have the authority to either accept or reject the ONA’s recommendations.
Strengthening risk assessments for continuous supervision
The ONA could formalise the criteria for its automated reviews and clarify how indicators link to the strategy for continuous supervision
Since the ONA first began planning for the SSC in 2018, it has designed the methodology, developed tools to enable the process and completed a full cycle of reviews during a pilot phase in 2019. As the SSC evolves, and prior to its implementation at other levels of government, there are opportunities for the ONA to improve its approach. First, the ONA could formalise the criteria and justifications it has developed for automated reviews, including documenting its rationale for indicators and linkages to the strategy of continuous supervision. In doing so, the ONA would promote transparency of its processes, and improve understanding among entities subject to continuous supervision of how the ONA interprets and acts on risks. The ONA has chosen a comprehensive set of indicators for automated reviews, and it also recognised the need for tailored indicators according to the legal form of the public sector entity. The selection of indicators for financial sustainability in particular reflects a broad consensus on the effectiveness of measuring financial sustainability by evaluating expenditures, revenues, debt and cash management (Pina, Bachiller and Ripoll, 2020[11]). However, not all of the indicators are taken into consideration in the risk weighting for an entity. For instance, Table 1.5 shows the financial indicators for which the ONA collects information on public business entities, including those that contribute to 20% of the automated review risk score and those that are not considered as part of the weighted risk calculation. For purposes of this report, the OECD did not include the weights for each individual indicator, but it is this very information that could be useful for public entities to know.
Table 1.5. Financial indicators for public business entities by risk area
| | Structure | Productivity | Financial management |
|---|---|---|---|
| Included in the risk score | | N/A | |
| Not included in the risk score | | Average staff costs | |
Note: Although not shown in the table, the weights for public business entities apply to public sector entities classified as “other public entities” (otras entidades de derecho público) with the exception of the average staff costs indicator.
Source: The ONA, Excel file Indicadores Entes Públicos (ONA, 2020[12]).
The self-assessment questionnaires follow a similar design. Despite the breadth of the questions in the self-assessment, only some of the responses contribute to an entity’s overall risk score for the SSC. For instance, only 11 of the 36 questions for state-run foundations contribute to the overall risk score, as shown in Table 1.6. Other self-assessment questionnaires for different types of entity also draw from a subset of the responses as part of the calculation for 40% of the risk score. The ONA indicated that for the purposes of the pilot, the evaluation team leveraged professional experience and judgement in determining the metrics that would have an assigned risk weighting (OECD, 2020[9]).
Table 1.6. Self-assessment questions for state-run foundations that contribute to the risk score
| Question |
|---|
| Are the foundation’s objectives included in a strategic action plan that covers justification for its establishment, strategic objectives, description of key activities, timeframe for implementation, budget, system of internal control and performance indicators? |
| Does the foundation have any objectives or activities that are shared with other foundations or public bodies? |
| Qualitative value of the patronage received from outside the state public sector |
| Has the Board established an internal control model that aims to provide reasonable assurance of achieving its objectives? |
| Do you consider that changes have occurred that would justify a review of the foundation’s membership or patrons? |
| Do you believe that there have been changes that justify a review of the aims and objectives of the foundation? |
| In the last five years, has the foundation been subject to actions under Article 132 of the Public Administration Legal Regulation Act of 2015? |
| Was the strategic action plan of the foundation approved by the patron? |
| In your opinion, have there been changes that could justify the merger of the foundation with another entity that has similar objectives? |
| In the last five years, have there been significant changes in the circumstances of the foundation’s patrons? |
| In your opinion, to what extent does the foundation have the necessary resources (staff, materials, equipment etc.) to effectively achieve its objectives? |
Note: Each question receives a different weight determined by the ONA.
Source: Data files provided by the ONA to the OECD with extracts from the CICEP.red and RED.coa (2019).
The ONA’s methodology for automated reviews is rigorous and evidence-based, yet it is also complex and underpinned by numerous internal decisions. While the judgement may be sound, the ONA could further clarify why it chooses to include some questions or indicators and not others, as well as provide the rationale to stakeholders for how it determines specific parameters for individual indicators. For instance, ONA officials noted that the questions not used for the risk score are still important for assessing the internal coherence and reliability of the answers provided. The ONA could also provide additional details in existing methodology documents that explain the criteria for including specific indicators as part of its model, as well as its decisions about assigned weightings.
In addition to formalising the rationale for the metrics used and the corresponding weightings, the ONA could also further clarify the linkage between the rationality risk factors, the selected indicators and its methods for assessing risks via the automated and control reviews. Specifically, this could include explanations as to how the ONA uses information from the self-assessment questionnaires, other risk factors and financial indicators to inform decisions related to the aforementioned risk factors (i.e. compliance, financial sustainability and relevance), as well as the selection of entities for further control review. In its methodological documentation, the ONA could explain the linkage between the rationality risks and the risk areas identified for each type of entity. For example, in Table 1.4, the ONA could clarify how the indicators related to the risk areas of structure, productivity and financial management directly translate to the three areas of rationality risk: compliance, financial sustainability and relevance. This would help to promote transparency, as well as consistency, as the SSC matures. It would also address a need expressed by officials in interviews with the OECD for more information from the ONA about how it uses the information provided for continuous supervision.
For the ONA, the SSC is a new medium for communicating and applying standards in government related to financial management and control. According to ONA officials, the SSC is inspired by the Committee of Sponsoring Organizations of the Treadway Commission’s (COSO) 2013 Internal Control-Integrated Framework. Box 1.1 presents a self-assessment tool developed by the National Academy for Finance and Economy (NAFE) of the Dutch Ministry of Finance. It supports evaluations and self-assessment, but goes beyond a questionnaire. It offers a self-assessment matrix and clear explanations about financial management and internal control. This has the added benefit of supporting managers in government to learn and apply standards and good practices.
Box 1.1. The Netherlands’ Financial Management and Control Self-Assessment for Government
The National Academy for Finance and Economy (NAFE) of the Dutch Ministry of Finance developed a self-assessment tool to improve public governance, focusing on financial management control (FMC) as a key component of public internal control. The NAFE developed an FMC assessment matrix as a practical tool to support assessments of FMC policies and practices at an institutional level, as well as to aid follow-up evaluations and actions to strengthen FMC. According to the NAFE, reasons for developing such tools include:
FMC lags behind the development of internal audit.
Key elements of FMC are in place, such as financial departments and reporting systems, but operational and implementation challenges remain, including the following:
Excessive operational control by top management.
The second line of defence (i.e. risk management, oversight and monitoring) is underdeveloped.
Financial divisions do not support planning and control, except for control of the budget.
Lack of an entity-wide planning and control mechanism, as well as planning and control at the operational level.
Blurred lines of responsibility between the second and third lines of defence, i.e., between risk management and the internal audit function.
Lack of key performance indicators.
The NAFE’s FMC assessment matrix allows management to understand the design of their entity assessed against good practice criteria, drawing from the European Union’s principles of Public Internal Financial Control (PIFC). Assessors must have excellent knowledge of PIFC, including managerial accountability elements. In addition to managers using the matrix as a self-assessment for their department, internal auditors can make use of the matrix during an entity-wide assessment of currently running FMC systems. Effective implementation of the self-assessment methodology, including completion of the FMC matrix, results in insights about possible actions to improve the FMC configuration and practices. The matrix and results can be shared with management and staff. The table below shows the header row of the matrix followed by an example of how each column can be populated. An actual matrix would include all key components of the internal control system, such as the internal audit function, as well as many other key variables and assessment impacts.
Table 1.7. Illustrative example of select components of an FMC assessment matrix
| Key component of internal control | Key variables | Assessment aspects | Indicators | Sources | Methodological approach |
|---|---|---|---|---|---|
| FMC within the primary processes/programmes/projects (I) | Configuration of Managerial Accountability (composition of the accountability triangle: Responsibility, Accountability and Authority) (I.1) | Responsibility: there is a delegated mandate structure (tasks/obligations) described which is aligned with the organisational structure | | | |
| | Alignment of the managerial accountability configuration (I.2) | Responsibilities are well aligned and in balance with accountability obligations and granted authorities (I.2.1) | Alignment of the three elements of the accountability triangle | Internal regulations/process/programme descriptions | Study relevant internal regulations and assess to what extent the responsibilities, accountability and authorities are balanced with each other |
| FMC through supportive oversight/controlling/monitoring processes (II) | Managerial Accountability (II.1) | Responsibility: the division of tasks and responsibilities between supportive second-line functions and first-line departments is clear and unambiguous (II.1.1) | It is clear how the division of tasks and responsibilities between first-line primary processes and second-line supportive functions is set out | Internal regulations/procedures; operational management; management of supportive functions (e.g. financial department, planning department, HR, IT) | Check the internal regulations/procedures and see if a clear division of tasks between first and second line can be distinguished. Is it described at all? In interviews, try to determine whether the division of tasks matches the philosophy of first and second line; if the distinction between first and second line is blurry, describe it |
Finally, the FMC assessment matrix relies on the Institute of Internal Auditors’ Three Lines Model. In particular, according to this model, operational managers are the first line. They are responsible for implementing and maintaining effective internal control while assessing risks to operations and strategic objectives. The various oversight, risk management and compliance functions overseeing the operational management make up the second line. These functions are responsible for support, monitoring, oversight and control over the first line. The internal audit function is the third line, and it provides independent assurance on the functioning of the first two lines. Each of these three “lines” is reflected in the FMC assessment matrix, since they play distinct roles within the entity’s wider governance framework.
Source: The Dutch Ministry of Finance (March 2018[13]), Good Financial Governance and Public Internal Control, Presentation to the OECD.
Officials of public entities interviewed by the OECD were broadly supportive of the ONA and the recommendations it has made to date through the SSC process. In one interview, an entity raised the issue of the administrative burden the SSC creates, particularly the need to compile and respond to the questionnaires. To avoid duplication of effort and limit the administrative burden on entities, only information not readily accessible to the IGAE and the ONA is required of entities as part of the SSC process (Government of Spain, 2018[5]). Moreover, to promote efficiency, the technical requirements are the same for both financial reporting and the SSC. The self-assessment questionnaires themselves, conducted on an annual basis, consist of approximately 35 questions. These measures suggest that the ONA has taken into account the burden it places on public entities in the design of the SSC. Nonetheless, formalising its criteria and further documenting the rationale for its methodology could lead to an even leaner and less burdensome set of self-assessment questionnaires, which already do not use all of the questions for risk scoring.
The SSC makes effective use of financial indicators, but opportunities remain to leverage the process for a broader assessment of sustainability
As discussed, one of the three risk factors that the ONA focuses on when assessing rationality risk is financial sustainability (the others being compliance and relevance). The assessment of financial indicators and ratios is a useful starting point for the ONA to evaluate the financial sustainability of public sector entities in line with the Directive on continuous supervision. However, in other OECD countries, many audit bodies are looking beyond purely financial or economic metrics and increasingly considering the value of the environmental and social benefits that an entity provides. One way the ONA can advance its efforts to enhance the impact and effectiveness of the SSC is to consider broader notions of long-term financial sustainability as part of its risk assessment process. To do this, the ONA would require an amendment to the Public Administration Legal Regulation Act of 2015 (Ley 40/2015 de 1 de octubre de Régimen Jurídico del Sector Público). The amendment would help the ONA to further modernise the SSC for future iterations, and support a longer-term vision for government and society that goes beyond a narrow interpretation of sustainability as short-term financial concerns.
Definitions of financial sustainability in the public sector context can vary. One common factor in most definitions is the likelihood of the failure of public bodies with significant liabilities or debt burdens (Pina, Bachiller and Ripoll, 2020[11]). However, financial sustainability of public entities is more complex than this one factor. The International Public Sector Accounting Standards Board (IPSASB) defines long-term fiscal or financial sustainability as a circumstance in which a public sector entity is able to achieve its goals for service delivery and meet its financial commitments both now and in the future (IPSASB, 2013[14]). This definition is similar to that of financial sustainability in the Directive on the SSC but with an added emphasis on service delivery.
In the IPSASB’s recommended guidance for public sector entities, consideration of financial sustainability is broader than accounting information from financial statements. It includes projected cash inflows and outflows related to the provision of goods, services and programmes providing public services using current policy assumptions over a specified period. The IPSASB identified three inter-related dimensions of long-term financial sustainability—service, revenue and debt—as well as two aspects that affect each dimension: capacity, the entity’s ability to change or influence the dimension, and vulnerability, the extent of the entity’s dependence on factors outside its control or influence:
Service: the projected volume and quantity of public services and entitlements to beneficiaries that an entity can deliver. This view of financial sustainability takes into account policy assumptions related to revenue from taxation or other sources, as well as debt constraints, and considers the impact on the entity’s ability to deliver services.
Revenue: impact of taxation levels and other sources of revenue on the provision of services while staying within debt constraints. In this dimension, the entity considers its capacity to vary tax receipt levels, modify or add sources of revenue. It also considers the entity’s vulnerabilities to outside sources of revenue. For example, if an entity’s inter-governmental transfers are legally mandated, its revenue streams are likely to be more stable.
Debt: considers debt levels and projected debt levels over the length of the assessment period in light of expected service provision commitments. In this dimension, an entity considers its capacity to meet its financial commitments on time or to refinance or incur additional debt where necessary. The level of net debt, or the amount spent providing goods and services in the past that needs to be funded in the future, is a key indicator, as illustrated in Figure 1.3.
Assessments of financial sustainability can use a broad range of data such as financial and non-financial information about future economic and demographic conditions, assumptions about country and global trends such as productivity, relative competitiveness of the economy (at national, state or local levels) and demographic variables (IPSASB, 2013[14]). The ONA already considers several factors related to the revenue and debt dimensions and considers performance aspects of the entities it monitors as part of the SSC. Building on this, provided it obtains the legal mandate, the ONA could enhance its focus on the service dimension in the IPSASB model. For example, the ONA could incorporate questions and qualitative indicators into its self-assessment questionnaires and “other risk factors” that provide insights about the service dimension, such as:
Does the entity have the capacity to vary the volume or quantity of services it provides?
Is the entity vulnerable to factors such as an inability to vary service levels or the unwillingness of recipients to accept reduced services?
Are expenditures on specific programmes expected to increase at a higher rate than the general level of expenditure?
Do entities with capital-intensive activities account for the expected useful lives or replacement values of property, plant and equipment?
Such questions are useful for assessing medium- to long-term financial sustainability because they allow for comparison between an entity’s current or projected commitments and a future state, based on reasonable assumptions. The ONA could also add performance-related indicators that are relevant not only for the objectives of individual entities, but also for transversal policies that rely on the effectiveness and efficiency of multiple entities or programmes. This could include indicators related to health, education and welfare policies. For instance, going beyond the financial performance of an entity, the ONA’s indicators could reflect the cost of healthcare as a percentage of projected revenue from taxes and other sources, or as a percentage of changes in the estimated volume of beneficiaries. This analysis would provide insights about potential systemic challenges related to the sustainability of health provision. Such analysis would also provide the entities responsible, as well as consumers of the ONA’s reports, with deeper insights about the functioning of services and use of taxpayer money.
As a long-term objective, a renewed legal mandate would also allow the ONA to consider environmental (i.e. external context) challenges that could affect sustainability in the context of the SSC. Environmental/contextual indicators are crucial in the assessment of sustainability in local government entities (Pina, Bachiller and Ripoll, 2020[11]) and encompass factors such as community needs and resources, intergovernmental constraints, disaster risk, political culture and external economic conditions. These additional considerations help decision-makers to govern better, make predictive decisions and enhance policies, controls and resource investment to ensure the achievement of objectives in relation to the rationality risks of compliance, sustainability and relevance.
In addition, taking into account the broader context could also have practical implications for the ONA’s unit of analysis for the SSC. Currently, the SSC targets individual entities, according to the law and regulations that governs it. The automated reviews capture information related to specific entities and the subsequent control reviews are carried out on the entities that pose the highest risk. In the context of the SSC, the ONA has only reviewed entities one-by-one. The experience of the United Kingdom offers insights into an approach that offers options for monitoring individual institutions and across several institutions, taking into account the environmental context.
In the United Kingdom, the government established triennial reviews to ensure that non-departmental public bodies (NDPBs) were subjected to regular and robust monitoring.7 The purpose is similar to that of the ONA’s SSC, albeit with a greater focus on outcomes and impact. The reviews act as mechanisms to ensure the NDPBs exist for a clear purpose, deliver the services users want, maximise value for money for the taxpayer and confirm they have not outlived their useful purpose (UK Cabinet Office, 2015[15]). Since the launch of the programme in 2011, departments have reviewed hundreds of entities and recommended the dissolution of NDPBs. The success of the triennial review programme informed the design of the transformation methodology for the 2016-20 Public Bodies Transformation Programme (see Box 1.2).
Box 1.2. The 2016-20 Public Bodies Transformation Programme of the United Kingdom
In April 2011, the UK Cabinet Office announced triennial reviews for all non-departmental public bodies (NDPBs) still in existence following the reforms brought about by the Public Bodies Act. According to the Cabinet Office, the reviews led to “fewer, more accountable and more efficient” government; the triennial reviews brought together public bodies across departments to deliver greater transformation than departments could deliver alone. Building on this effort, the Cabinet Office established the 2016-20 Public Bodies Transformation Programme. The department-led reviews conducted as part of this programme provide regular assurance concerning the continuing need, efficiency and good governance of public bodies. The programme is two-pronged, consisting of “tailored reviews” and “functional reviews.”
Tailored Reviews
Tailored reviews extend the scope of the triennial review process to include executive agencies and non-ministerial departments. Each body is subject to a tailored review, which can be carried out in the context of departmental or functional reviews, described below. Their purpose is to challenge and provide assurance on the continuing need for an individual public entity in terms of both function and form. Reviews focus on the entity’s capacity for delivering more effectively and efficiently, including identifying the potential for efficiency savings, and where appropriate, its ability to contribute to economic growth. The Cabinet Office’s guidance indicates that the review “should include an assessment of the performance of the entity or assurance that processes are in place for making such assessments.” Reviews also take into account the control and governance arrangements in place to ensure compliance with principles of good corporate governance.
Functional Reviews
Functional reviews look across departments and examine holistically the functions of several public bodies in similar or related areas of government. This approach identifies opportunities for reform that cannot be revealed by reviewing bodies one by one. The first review covers bodies with regulatory functions. This and subsequent reviews will be delivered in partnership with public bodies and departments.
The Cabinet Office oversees the reviews. The guidance for the reviews establishes a principle of openness and encourages public entities to publish the results of reviews. In addition to transparency, other key principles for conducting the reviews include proportionality, challenge, being strategic, pace and inclusivity. The reports of the reviews include recommendations to improve effectiveness and efficiency for government, including evidence to substantiate judgements and consideration of the value for money for taxpayers.
As described in Box 1.2, the UK’s programme has a similar objective to the SSC as a way to provide continuous assurance that public entities remain relevant, needed and efficient in their operations. The UK’s guidance for tailored reviews provides some insights as to how the ONA could enhance future iterations of the SSC. The guidance calls for reviews that are challenging and take a “first principles” approach to whether each function is 1) still needed; 2) still being delivered; 3) carried out effectively; and 4) contributes to the core business of the entity, the sponsor department and to the government as a whole (UK Cabinet Office, 2019[16]). The environment (the external context discussed previously), as well as broader governance and financial issues like savings in relation to digital transformation, also are important considerations for the tailored reviews. For instance, the guidance highlights the following questions for assessing both efficiency and costs of digital transformation, as well as the impact on users of services:
What is the current spend in this area or areas?
How many transactions are received by the service per channel (online, phone, paper, face to face)?
How many of these transactions end in an outcome, and in the user’s intended outcome?
How many phone calls, letters or in-person visits are there to the service?
What are the reasons for those phone calls, letters or in-person visits (e.g. to get information, chase progress, challenge a decision)?
What will the expenditure be after transformation?
When will savings start to be realised?
What will be the reduction in average cost per transaction, service or channel?
Is there potential in other areas of the public body’s activities to consider digital work that will contribute to spending reductions and improved services? (UK Cabinet Office, 2019[16])
In addition to being more performance-oriented, another key difference, which could help the ONA develop the SSC further, is the UK’s model of finding opportunities for improvement that are not identifiable by assessing public bodies individually or one-by-one. The UK’s functional reviews are by definition cross-departmental, and therefore they provide a more comprehensive picture than the tailored reviews concerning issues that affect or implicate multiple entities. The ONA’s SSC provides a foundation for expanding the current self-assessment questionnaires, or alternatively the lines of inquiry as part of control reviews, in the same way. In addition, the SSC covers entities that are the equivalent of NDPBs in the UK, so many of the questions are directly relevant for the ONA’s approach.
The ONA could standardise and document its process for selecting entities for control reviews, as well as the process and use of the reviews themselves
As noted, the ONA uses automated reviews to identify entities that pose a higher risk in terms of the concept of rationality, in particular, the risk of the entity being non-compliant with laws, financially unsustainable, or irrelevant and duplicative relative to other entities. Based on this assessment and the resulting risk score, the ONA selects entities for further control reviews. During the pilot phase of the SSC, the ONA completed automated reviews of 421 entities and selected nine entities for control review. In interviews, the ONA explained this selection process, described below. However, the ONA could standardise and document its approach and criteria for decision-making to improve future supervision activities and promote greater transparency of the SSC.
In the SSC strategy, the ONA anticipated defining criteria to streamline and possibly automate the decision-making process for selecting entities to review. However, the ONA has yet to do this in its current methodology. The ONA indicated during interviews that the pilot phase of the SSC served to develop a baseline both for the metrics used for the automated review and for the criteria used in selecting entities for control review by an evaluation team. The evaluation team reviewed the risk score from the automated review with ONA analysts, and applied selection criteria to determine which entities warranted further review. The ONA indicated that the selection criteria included consideration of recommendations from other control reviews or public audits.
The ONA noted that consistency in the application of the selection criteria was achieved through discussion and agreement with the evaluation team for the nine entities reviewed during the pilot. As the SSC evolves from a pilot to full implementation, the ONA could benefit from formalising this process, including the selection criteria, in its methodology documents for the SSC. This will help future evaluation teams to carry out the SSC, and promote consistency and standardisation in the selection process. As the SSC matures, the ONA could periodically review and update this guidance and criteria for relevance and adjust as necessary. Box 1.3 provides some insights as to how the Federal Court of Accounts in Brazil (Tribunal de Contas da União, TCU) guides auditors in selecting entities for control actions.
Box 1.3. Assessing risks for audit selection by the Brazilian Federal Court of Accounts
To further align its practices with the International Standards of Supreme Audit Institutions (ISSAI), the Federal Court of Accounts in Brazil (Tribunal de Contas da União, TCU) developed a process for its audit teams to systematically assess risks and key challenges in order to select audit subjects. To support this initiative, the TCU developed guidance for its teams that explains the methodology and steps that auditors can take to conduct the risk assessments. The methodology encourages broad participation of internal stakeholders, including directors and auditors, as well as external experts.
The guidance provides auditors with insights for assessing risk factors, materiality, relevance and opportunities concerning audit subjects. The “relevance” element considers whether the audit subject implicates pressing issues of interest to society that are under public debate. The guidance outlines how assessments are conducted, and it describes the TCU’s severity index, represented as follows:
Severity Index = (Social Impact + Economic Impact) * Probability * Trend
The TCU scores the “social impact” and “economic impact” variables on a scale from 1 (low) to 5 (high). “Probability” and “trend” are both expressed as percentages. Multiplying by the perceived trend allows the TCU to decrease or increase the severity of a problem based on the auditors’ perceived direction of the problem (i.e. trending better or worse). The TCU recognises the need for individual audit teams to tailor the guidance, and it clearly articulates measures of quality for auditors. The methodology is designed to be flexible so that it is useful to different types of audit teams, as well as to account for different data requirements, availability and resources. To ensure the quality of the selection process, the guidance encourages audit teams to do the following:
obtain comprehensive and quality data on the universe of control under its jurisdiction
invite experts to assist in the analysis of related topics
seek internal guidance and support throughout the process
make appropriate use of the internal tools (i.e. a selection support system)
seek to involve the entire team of auditors in the discussion and analysis process
schedule sufficient resources and time to carry out the activities
properly document all stages of the process, so that the basis for the decisions adopted is demonstrated and that the information collected is preserved, enabling its eventual use in inspection processes or for planning work in future years.
The TCU considers the relevance criteria when assessing the economic and social impact. Determining the scoring for individual variables relies on the professional judgement and expertise of auditors, as well as data and information collected during the selection process. The TCU promotes other forms of analysis to complement its risk assessments, such as Ishikawa analysis and problem trees. Heads of audit units must approve the determinations and scoring related to risks. Once the TCU completes its analysis, it selects subjects for control. The guidance maps out each of these phases so that auditors have a step-by-step understanding of the entire process.
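To make the arithmetic of the severity index concrete, the sketch below applies the formula as described in Box 1.3. The impact scores, probability and trend values are hypothetical, and the interpretation of the trend multiplier is an assumption for illustration; the TCU’s own guidance governs how these variables are actually calibrated.

```python
# Illustrative application of the TCU severity index described in Box 1.3:
#   Severity = (Social Impact + Economic Impact) * Probability * Trend
# Impact scores run from 1 (low) to 5 (high); probability and trend are
# percentages. The trend multiplier interpretation and the values used below
# are assumptions for illustration; the TCU's guidance governs actual use.

def severity_index(social_impact, economic_impact, probability, trend):
    """Combine impact scores (1-5) with probability and trend (as fractions)."""
    if not (1 <= social_impact <= 5 and 1 <= economic_impact <= 5):
        raise ValueError("Impact scores must be between 1 and 5")
    return (social_impact + economic_impact) * probability * trend


# Hypothetical audit subject: high social impact (5), moderate economic impact (3),
# 70% probability, worsening trend weighted at 120%.
print(severity_index(5, 3, probability=0.70, trend=1.20))  # 6.72
```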
In addition, the control review follows a methodology specifically designed for the SSC (actuaciones de control individualizadas), which sets out the steps that an evaluator should follow. However, the methodology does not articulate the strategic elements to consider or the criteria underpinning the decision making for the opinion. As with the selection criteria for entities warranting further control review, the ONA indicated that the recommendations and opinions in the evaluation reports produced during the pilot were reached through discussion and agreement with the evaluation team on a case-by-case basis.
The ONA could benefit from further defining the principles, guidance and criteria that evaluators and auditors can consider when forming recommendations and decisions about the rationality of an entity. It could also clarify the process downstream, after the completion of the control review and the communication of the results in an evaluation report to the Ministry of Finance. For instance, the Directive (Orden HFP/371/2018) requires the ONA to include the response of an entity’s governing body or line ministry in its opinion; however, it does not provide guidance on the actions to be taken in the event of a difference of opinion between the ONA and the line ministry.
ONA officials informed the OECD that the public entity evaluates the results of the SSC and, if it disagrees with the IGAE’s conclusions, the matter is elevated to the Ministry of Finance, which advances the decision and may negotiate with the entity. Clarifying this process and the decision-making criteria, in co-ordination with policymakers, could have several effects. First, it would promote greater transparency of the SSC and of important decisions about how the government is structured and how taxpayer money is used. Second, it would help the subjects of the SSC to understand more clearly the rationale for the conclusions and recommendations proposed. Lastly, clarifying how the results of control reviews and the evaluation reports are and should be used, and further documenting the related expectations, roles and responsibilities, would help to promote political accountability for the SSC. In particular, enhanced transparency and a clearer process would help to promote ownership and responsibility within the Council of Ministers for decisions taken, or not taken, as a result of the ONA’s continuous supervision activities.
The ONA could enhance its approach to assessing duplication, including consideration of fragmentation and overlap
A key component of the CORA reforms was to reduce duplication and overlap within the general state administration as well as between the state administration and the autonomous or local governments. This is reflected in the SSC as a key risk factor, described as “relevance” for purposes of this report (see Figure 1.3 above). The CORA sub-commission defined “overlap” as different public entities providing identical services to identical recipients or public entities with similar missions acting on the same subjects. In conjunction with the Sub-Commission on Institutional Administration, the Sub-Commission on Overlap aimed to improve efficiency by streamlining the number of public sector institutions, companies and foundations.
To assess duplication, the ONA compares the powers assigned to the entity subject to the SSC with those of other entities that have similar objectives. The ONA verifies whether the entities operate in the same or a similar environment and scope. The ONA typically carries out this analysis manually, reviewing the relevant norms, statutes and other documentation establishing the entities. Budget information, activity codes and other information or data can also serve as inputs for the ONA to determine whether duplication exists.
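To illustrate how part of this comparison could be supported with data, the following is a minimal, hypothetical sketch in Python that flags pairs of entities sharing activity codes. The entity names, the activity codes and the similarity rule are illustrative assumptions; they do not represent the ONA’s actual data or method, and any flagged pair would still require the manual review of norms and statutes described above.

from itertools import combinations

# Hypothetical activity codes per entity (illustrative only, not ONA data).
entity_activities = {
    "Entity A": {"R&D grants", "training", "certification"},
    "Entity B": {"training", "certification"},
    "Entity C": {"official statistics"},
}

def overlap_candidates(activities_by_entity, threshold=0.5):
    """Return entity pairs whose shared activity codes exceed a similarity threshold."""
    flagged = []
    for (a, codes_a), (b, codes_b) in combinations(activities_by_entity.items(), 2):
        shared = codes_a & codes_b
        similarity = len(shared) / len(codes_a | codes_b)
        if similarity >= threshold:
            flagged.append((a, b, sorted(shared), round(similarity, 2)))
    return flagged

for a, b, shared, score in overlap_candidates(entity_activities):
    print(f"{a} / {b}: shared activities {shared} (similarity {score})")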
To enhance this approach, the ONA can draw lessons and inspiration from the work of supreme audit institutions (SAIs). In the United States, the Government Accountability Office (GAO) has developed a distinctive approach to assessing not only duplication but also fragmentation and overlap (see Box 1.4 for further explanation). GAO’s work promotes policy coherence in government, and it recognises that duplication or overlap is not always negative. For instance, complex policy issues involving multiple actors can benefit from multi-stakeholder insights and contributions, particularly in the case of transversal policies that cut across sectors and involve different entities. In 2015, the GAO issued a manual for assessing duplication, fragmentation and overlap. Notably for the ONA as it considers formalising its process, the manual includes guidance for auditors as well as for policymakers and managers in government. It highlights four key steps:
1. Identify fragmentation, overlap, and duplication among a selected set of programmes and understand how the selected programmes are related.
2. Identify the potential positive and negative effects of any fragmentation, overlap, or duplication found in Step 1.
3. Validate the effects identified in Step 2 and assess and compare the fragmented, overlapping, or duplicative programmes in order to determine their relative performance and cost-effectiveness.
4. Identify options to reduce or better manage the negative effects of fragmentation, overlap, and duplication (US Government Accountability Office, 2015[18]).
The ONA could consider explicitly incorporating fragmentation and overlap into its analysis. At a minimum, this could include adding questions to the self-assessment process, or to an internal guide for auditors conducting control reviews, that would enable them to identify duplication, fragmentation and overlap. The ONA already considers some of these key questions, although it could do so more formally and systematically:
How are entities or programmes related to each other?
Which entities or programmes are unnecessarily duplicating others?
Where can efficiencies be found between programmes with shared goals?
What relations do these programmes have with others?
Are there legitimate reasons for competition among or redundancies between entities or programmes?
A more robust assessment of duplication, fragmentation and overlap would likely require more resources than the ONA currently has for the SSC. However, the benefit from a government-wide perspective in the US context is considerable. As noted in Box 1.4, the GAO identified approximately USD 429 billion in total financial benefits as a result of actions taken to address GAO’s recommendations related to fragmentation, duplication and overlap.
Box 1.4. The US Government Accountability Office’s assessments of fragmentation, duplication and overlap
The Government Accountability Office (GAO) is required by law to conduct routine investigations to identify federal programmes, agencies, offices and initiatives with duplicative goals and activities, both within departments and government-wide. GAO also must report annually to Congress on its findings, including the costs of fragmentation, overlap and duplication in government, as well as recommendations for Congress to address them. The figure below shows how GAO defines these concepts.
The GAO collects and analyses data on costs and potential savings to the extent they are available. GAO uses the information to identify potential financial and other benefits that can result from addressing fragmentation, overlap, or duplication, or taking advantage of other opportunities for cost savings and enhancing revenues.
Ensuring the reliability of data
GAO assesses the reliability of any computer-processed data that materially affects findings, including cost savings and revenue enhancement estimates. GAO reports on data reliability for each source and area it assesses. The steps taken to assess the reliability of data vary, but generally aim at fulfilling auditing requirements that data be sufficiently reliable and fit-for-purpose. The steps GAO takes to assess data reliability for this work can include:
Reviewing published documentation about the data system, including reviews of the data by the inspector general or others.
Interviewing entity officials or external officials to better understand system controls and the process for producing the data, as well as any limitations associated with the data.
Testing the data electronically to check whether values conform to the valid values described in interviews or documentation (a brief illustrative sketch follows this list).
Comparing the data to source documents, as well as to other sources, for corroboration.
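As an indication of what such electronic testing might involve in practice, the following is a minimal sketch in Python that checks reported values against documented valid values. The field names, validity rules and data are hypothetical; they do not reflect GAO’s actual systems, tools or datasets.

# Hypothetical sketch of electronic data testing: checking that reported values
# conform to documented valid values. Field names and data are illustrative only.
records = [
    {"programme_id": "P-001", "status": "active", "outlay_musd": 12.5},
    {"programme_id": "P-002", "status": "closd", "outlay_musd": -3.0},
]

valid_status_values = {"active", "closed", "suspended"}

def test_records(records):
    """Return (programme, field, value) tuples for values that fail basic checks."""
    issues = []
    for rec in records:
        if rec["status"] not in valid_status_values:
            issues.append((rec["programme_id"], "status", rec["status"]))
        if rec["outlay_musd"] < 0:  # outlays should not be negative
            issues.append((rec["programme_id"], "outlay_musd", rec["outlay_musd"]))
    return issues

for programme, field, value in test_records(records):
    print(f"{programme}: suspicious value for {field}: {value!r}")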
Results of 2020 activities
GAO’s 2020 report identifies 29 new areas in which a broad range of government entities may be able to enhance efficiency or effectiveness. For each area, GAO suggests actions that Congress or executive branch agencies could take to reduce, eliminate or better manage fragmentation, overlap or duplication, or to achieve other financial benefits. GAO also monitors actions taken to address its previous recommendations. It identified approximately USD 429 billion in total financial benefits as a result of steps that either Congress or government entities have taken to address GAO’s recommendations related to fragmentation, duplication and overlap (US Government Accountability Office, 2020[19]).
Conclusion
This chapter describes Spain’s approach to continuous supervision and how the IGAE and the ONA have scoped the assessment of risks around the concept of “rationality risk.” Spain’s regulations provide the basis for how the ONA ultimately defines and interprets risk in the context of the SSC (Government of Spain, 2018[5]), calling for three levels of verification for the ONA to assess public entities with respect to compliance, financial sustainability and relevance. The ONA has developed a solid risk assessment methodology, which it first implemented in 2020. Building on this early model, the chapter offers recommendations for the ONA to continue advancing its risk assessment processes and methodology for continuous supervision.
The recommendations reflect opportunities for the ONA to formalise the criteria for its automated reviews and to clarify how its risk indicators link to the strategy for continuous supervision. In addition, while the ONA makes effective use of financial indicators as part of the SSC, it could further leverage the process for a broader assessment of sustainability, including greater consideration of long-term financial sustainability and performance-related indicators. The ONA could also emphasise standardisation as it moves ahead with the next iterations of the SSC. This could involve documenting the processes for selecting entities for control reviews and clarifying the conduct and use of the reviews themselves. Finally, the ONA could enhance its current approach to assessing duplication of government entities and programmes by incorporating analyses that also consider fragmentation and overlap. These are related but distinct challenges, and the SSC could be an effective tool for understanding and monitoring them in government.
References
[17] Federal Court of Accounts of Brazil (2016), Guidelines for Selecting Objects and Control Actions, https://portal.tcu.gov.br/fiscalizacao-e-controle/auditoria/selecao-de-objetos-e-acoes-de-controle/.
[5] Government of Spain (2018), Official Gazette (Boletín Oficial del Estado), https://www.boe.es/eli/es/o/2018/04/09/hfp371.
[4] Government of Spain (2015), Official Gazette (Boletín Oficial del Estado), https://www.boe.es/buscar/act.php?id=BOE-A-2015-10566.
[6] IGAE (2020), Memoria de actividades 2019.
[10] IPSASB (2014), The Conceptual Framework for General Purpose Financial Reporting by Public Sector Entities, International Public Sector Accounting Standards Board, https://www.ipsasb.org/publications/conceptual-framework-general-purpose-financial-reporting-public-sector-entities-3.
[14] IPSASB (2013), Recommended Practice Guideline: Reporting on the Long-Term Sustainability of an Entity’s Finances, International Public Sector Accounting Standards Board, https://www.ipsasb.org/publications/recommended-practice-guideline-1.
[1] OECD (2020), OECD Fact-Finding Interview with the Inspector General of Services for the Ministry of Finance (Inspección General de Servicios del Ministerio de la Hacienda).
[9] OECD (2020), OECD Fact-finding interviews with the National Audit Office (Oficina Nacional de Auditoría, ONA).
[3] OECD (2014), Spain: From Administrative Reform to Continuous Improvement, OECD Public Governance Reviews, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264210592-en.
[12] ONA (2020), Indicadores Entes Públicos.
[8] ONA (2020), Metodología relativa a las actuaciones de supervisión continua automatizadas.
[7] ONA (2018), Estrategia del Sistema de Supervisión Continua (2018-2020).
[11] Pina, V., P. Bachiller and L. Ripoll (2020), “Testing the Reliability of Financial Sustainability. The Case of Spanish Local Governments”, Sustainability, Vol. 12, http://dx.doi.org/10.3390/su12176880.
[13] Dutch Ministry of Finance (2018), Good Financial Governance and Public Internal Control, presentation to the OECD, March 2018.
[2] Tribunal de Cuentas (n.d.), Función de Fiscalización, https://www.tcu.es/tribunal-de-cuentas/es/fiscalizacion/funcion-de-fiscalizacion/ (accessed November 2020).
[16] UK Cabinet Office (2019), Tailored Reviews: Guidance on Reviews of Public Bodies, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/802961/Tailored_Review_Guidance_on_public_bodies_-May-2019.pdf.
[15] UK Cabinet Office (2015), Public Bodies 2015, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/506880/Public_Bodies_2015_Web_9_Mar_2016.pdf.
[20] UK Cabinet Office (2014), Triennial Reviews: Guidance on Reviews of Non-Departmental Public Bodies, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/332147/Triennial_Reviews_Guidance.pdf.
[19] US Government Accountability Office (2020), 2020 Annual Report: Additional Opportunities to Reduce Fragmentation, Overlap, and Duplication and Achieve Billions in Financial Benefits, GAO-20-440SP, https://www.gao.gov/assets/710/707031.pdf.
[18] US Government Accountability Office (2015), Fragmentation, Overlap, and Duplication: An Evaluation and Management Guide, https://www.gao.gov/assets/gao-15-49sp.pdf.
Notes
← 1. Part II, Chapter II, Organisation and Functioning of the state institutional public sector. Article 81.2 requires public administrations to establish a system of continuous supervision of their dependent entities, justifying the reasons for their existence and financial sustainability and including proposals to maintain, transform or dissolve the entity. Article 84 defines the categories of public sector entities in scope for continuous supervision and efficiency control reviews, while Article 85 defines the roles and responsibilities of the Ministry of Finance (Hacienda), the IGAE and the ministerial inspection units.
← 2. Chapter 2 Administrative Rationalisation and Multi-Level Governance.
← 3. Article 84.
← 4. Article 10, Continuous Supervision Activities (Actuaciones de supervisión continua), Orden HFP/371/2018.
← 5. Article 6, Role of the IGAE (Funciones de la Intervención General del Estado), Orden HFP/371/2018.
← 6. Definitions of solvency and liquidity are based on the Conceptual Framework for General Purpose Financial Reporting by Public Sector Entities issued by the International Public Sector Accounting Standards Board (IPSASB) (IPSASB, 2014[10]).
← 7. Non-Departmental Public Bodies (NDPB) is an administrative term for those public bodies that operate at arm’s length from Ministers, but for which Ministers are ultimately accountable. NDPBs can be statutory or non-statutory (UK Cabinet Office, 2014[20]).