Mobilising Evidence for Good Governance
2. Principles for the good governance of evidence
Abstract
Deliberation over the technical features of evidence is necessary but not sufficient for an evidence-informed approach to policy-making to deliver results. Actors making policy and practice decisions should weigh broader considerations, such as ethics, values and politics, to think critically about what evidence is needed in a particular context. This chapter presents existing principles for the good governance of evidence, including: appropriate evidence for the policy concern; ensuring integrity (honest brokerage); accountability; contestability; public representation in decision-making; transparency in the use of evidence; and building evidence through emerging technologies and mobilising data. Each subsection explains why these principles are important and summarises the existing approaches.
Appropriate evidence for the policy concern
Evidence should be: selected to address the multiple policy concerns at stake; useful for achieving policy goals; mindful of possible alternatives; and informed by an understanding of the local context. It should also support learning about how to improve policy design, identifying both positive and negative effects.
Why is it important?
The notion of appropriateness is intended to supplement considerations of evidentiary quality (discussed in the standards chapter). The quality of the evidence remains critical, but the appropriate way to judge that quality should be chosen only after considering which evidence is most useful for the policy concern (Parkhurst, 2017[1]). Ensuring that evidence is appropriate to the policy question at hand can also create a virtuous circle, increasing the demand for and use of policy-relevant evidence.
Summary of the mapping of existing approaches
Approaches to the good governance of evidence consider several components of ensuring the availability of appropriate evidence, notably that evidence addresses multiple policy concerns, that it is constructed in ways that best serve policy goals, and that it is applicable to the local context. These can be considered as three interrelated concepts that ensure evidence is appropriate to the policy question and context (Figure 2.1).
Evidence that addresses the multiple policy concerns at stake
Some approaches emphasise the importance of ensuring that evidence addresses the multiple relevant social and policy concerns at stake (Parkhurst and Abeysinghe, 2016[2]; Parkhurst, 2017[1]). Governments engage in a range of activities to improve the appropriateness of the evidence generated for policy. In an analysis of initiatives in the UK and the USA, Shaxson (2019[3]) identifies the importance of strategies, policies and plans at different levels of government which set out the priorities for evidence collection. Shaxson recommends that these:
Cover all types of evidence.
Demonstrate relevance to current and future policy directions and risks.
Are open and transparent to encourage public engagement around evidence.
Are updated, via broad-based engagement, on a regular basis.
The OECD’s ‘New Approaches to Economic Challenges’ initiative has carried out work on complexity and policy-making (Love and Stockdale-Otárola, 2017[4]). This work recognises that policy makers must pay attention to the interlinkages between policy areas and related objectives, whilst improving evidence on the simultaneous movement of various targets and policy levers. At the same time, scientists and researchers must provide policy makers with evidence that makes them aware of the complexity of the systems they are dealing with, whilst avoiding the sentiment that these systems are so complex that no one can possibly understand or influence them.
Evidence that best informs policy goals
Many approaches also emphasise that evidence should be sourced, created and analysed in ways that help decision makers reach policy goals. Parkhurst (2017[1]) argues that what constitutes ‘good’ evidence for policy-making needs to be reframed as a question of policy appropriateness. This avoids over-simplistic applications of evidence hierarchies to questions they were not designed to address, and helps reconsider which evidence is most important to inform policy decisions.
The US Office of Management and Budget (2017[5]) highlights the importance of ‘relevance’, meaning that evidence must actually be used to inform decision-making. This means that findings should be presented in a format that is easy to interpret and apply. Ensuring relevant evidence can be facilitated by strong partnerships and collaborations among evidence producers, policy makers and service providers. The UK Government Office for Science (2010[6]) also highlights the importance of early engagement between evidence experts and partner organisations in framing appropriate and relevant research questions.
A range of strategies for ensuring that information is easy for policy makers to interpret and apply comes from the fields of knowledge translation and knowledge brokerage. These include decision-making toolkits, guidelines and recommendations designed to facilitate solutions to particular policy problems (Langeveld, Stronks and Harting, 2016[7]). Many of the organisations reviewed in chapter 3 make use of standards of evidence to present evidence in an attractive and understandable way. For example, Health Evidence (Canada) (2018[8]) offers a quality assessment tool to help public health services identify the best available evidence. The tool contains a questionnaire/checklist for reviewing studies, a dictionary defining terms, instructions for completing each section, and an overall assessment of the methodological quality of the review.
Evidence that is applicable to the local policy context
The applicability or transferability of evidence to a local context receives attention from a large variety of sources. Most stress the fact that ‘context matters’ so that even the most carefully studied and well-evidenced policies may not be successful in every context (Wandersman et al., 2000[9]; Leviton and Trujillo, 2017[10]). The case of Nurse Family Partnership illustrates the importance of context, with a heterogeneous pattern of results across different implementation contexts.
Box 2.1. Context Matters: Transporting Nurse Family Partnership to different contexts
Nurse Family Partnership is a licensed intensive home-visiting intervention developed in the United States and introduced into practice in England as Family Nurse Partnership.
The intervention has been shown to be effective in numerous RCTs in the USA (Eckenrode et al., 2010[11]; Olds et al., 2003[12]) and has also shown positive results in a trial in the Netherlands (Mejdoubi and Midwifery, 2014[13]). However, when transferred to England it showed disappointing results, with no additional short-term benefits to primary outcomes over and above those associated with usually provided health and social care (Robling et al., 2016[14]).
Why might this be the case? In the Netherlands, the intervention was modified to focus on high-risk families. In the United Kingdom, usual care already includes universal provision in the form of the Healthy Child Programme. It has been recommended that the Family Nurse Partnership in England be refocused on higher-risk families and incorporate other activities to improve the effectiveness of the intervention (Barlow et al., 2016[15]).
Source: Adapted from OECD (2020[16]), Delivering evidence based services for all vulnerable families: A review of main policy issues.
These issues of applicability or transferability of evidence are closely related to themes discussed in the section on implementation and scale-up of interventions. Several clearinghouses discussed in that section provide resources that enable an assessment of evidence produced in one context and its applicability to a different context. For example, the California Evidence-Based Clearinghouse for Child Welfare has produced a practical guide that enables organisations to evaluate their needs, decide which new interventions might be suitable, and plan for implementation and sustainability (2018[17]). The Norwegian Institute of Public Health has recently developed the TRANSFER approach to support the assessment of systematic review findings against the context of interest, based on a comprehensive list of factors that influence the transferability of evidence (Munthe-Kaas, Nøkleby and Nguyen, 2019[18]).
Other important tools also exist to help decision makers apply evidence in context (The GRADE Working Group, 2015[19]; Lavis et al., 2010[20]; Moberg, Alonso-Coello and Oxman, 2015[21]). One specific approach is the Conclusions on Minimum Quality Standards in Drug Demand Reduction from the Council of the European Union (2015[22]), which suggest that appropriate evidence-based treatment should be tailored to the characteristics and needs of service users and respectful of the individual’s dignity, responsibility and preparedness to change.
Key questions
Does the evidence selected address the multiple policy concerns at stake?
Are there plans, which set out government priorities for data collection, to generate appropriate evidence, either through administrative or survey data?
Does the evidence selected best serve the policy goals?
Does the evidence collected help to determine a range of policy options including alternatives and the most appropriate option(s)?
Is the evidence collected applicable in the local policy context?
Ensuring integrity (honest brokerage)
Individuals and organisations providing evidence for policy-making need processes to ensure the integrity of such advice, managing conflicts of interest and ethical conduct in order to avoid policy capture and maintain trust in evidence-informed processes for policy-making. This notion of integrity covers both the processes used to select and analyse the evidence and the processes through which the advice is then provided to policy makers.
Why is it important?
Ensuring the integrity of the interface between evidence production and decision making is critical to avoid the potential manipulation of evidence. Therefore, knowledge brokers need to develop a credible, legitimate and respected position in the eyes of both decision makers and evidence producers (Rantala et al., 2017[23]).
The term “honest broker” refers to producers of evidence facilitating complex political decisions from a neutral position, aiming to understand and expand the scope of choice available rather than advocating a particular policy (Wilsdon, Saner and Gluckman, 2018[24]; Gluckman, 2014[25]; Rantala et al., 2017[23]).
Ensuring integrity is particularly important in the 21st century as an increasing number of organisations use emerging technologies to help them make better-informed decisions. Artificial Intelligence (AI) systems rely heavily on training data to develop solutions, which may reduce the space for human decision making. Handling the preparation of data with integrity is critical to achieving more accurate results and to creating trust in the application of AI. The OECD Principles on Artificial Intelligence promote AI that is innovative and trustworthy and that respects human rights and democratic values. These principles complement existing OECD standards in areas such as privacy, digital security risk management and responsible business conduct (OECD, 2019[26]).
Summary of the mapping of existing approaches
Approaches to the good governance of evidence consider several facets of integrity, notably honest brokerage, humility in the provision of advice, ethical conduct and the management of conflicts of interest.
Honest brokerage
Many jurisdictions have guidance concerning appropriate conduct for public officials, which include material relevant to integrity and honest brokerage. For example, the UK’s Committee on Standards in Public Life (2019[27]) stipulates that public officials must avoid placing themselves under any obligation to people or organisations that might try inappropriately to influence them in their work. It is also important that those providing evidence for policy-making do so in a way that protects their independence from political interference (Gluckman, 2014[25]).
Other approaches consider integrity specifically in relation to those providing evidence to decision makers. A case study of political decision-making on water supply in Finland identified four key roles and thirteen critical attributes in managing the challenges of honest brokerage (Table 2.1).
Table 2.1. Honest brokerage roles and their attributes in Finland
| Honest brokerage role | Rationale | Specific attributes of the brokerage roles |
| --- | --- | --- |
| Knowledge provisioning | Need to provide clear, objective, neutral, and timely information | Information provider: to provide timely information and decision support to stakeholders during the process. Interdisciplinary: to provide holistic picture about complex issues from various disciplines. Clear communicator: to use clear language, illustrative methods, and concrete policy paths in making uncertainties and risks tangible. Neutral: to be humble and objective by avoiding conveying researcher’s own values and unintentionally promoting an agenda preferred by the broker. |
| Trust and relationships | Need to build trust and relationships between brokers and stakeholders | Empathetic: to listen to all parties patiently and show understanding for their concerns and values. Trustworthy: to treat confidential information ethically and to encourage transparency. Reciprocal: to attract busy politicians to discuss and exchange ideas with the broker by providing useful information in return. |
| Leadership | Need to keep the process on track and drive issues forward | Facilitative: to make sure the policy discussion evolves by providing clear answers for all open questions and queries. Courageous: to expose broker’s own credibility to open criticism by leading discussions and correcting misunderstandings. |
| Intermediary | Need to balance and mediate between various positions | Mediative: to sketch alternative solutions and search for mutual gains. Deliberative: to provide opportunities for stakeholders and the public to engage in dialogue and joint knowledge construction in order to balance the prevailing interest-based positions. Detective: to be cognizant about hidden motives, political games, and interpersonal ties behind advocacy strategies. Adaptive: to be sensitive and ready to |
Source: Adapted from (Rantala et al., 2017[23]).
Humility
The need for humility in the provision of advice is recognised as a critical component of the role of honest brokers. Humility is defined as striving for an accurate view of oneself and one’s role, combined with an appropriate respect for both the power and the limitations of science and technology. The International Network for Government Science Advice (INGSA) provides guidelines on three aspects of humility (Wilsdon, Saner and Gluckman, 2018[24]):
Humility of competence respects the limits of knowledge and understanding. It requires those engaged in scientific advice to be open to new ways of knowing and to learning from others with different disciplines, traditions, ages, genders, or geopolitical perspectives.
Humility of motive respects that advisers have their own history, culture, and political motives. Bias and conflict of interest are inescapable.
Humility of role respects that a division of labour in decision-making is legitimate and that the process is non-linear. Scientific and technical input is often essential but rarely sufficient for complex, high-stakes decisions. Policy-making always involves considerations beyond the evidence.
Ethical conduct
Ensuring that there are robust procedures for maintaining the integrity of advice provided to decision makers, including ethical conduct and the management of conflicts of interest, is a further critical aspect of ‘honest brokerage’.
Many of the approaches consider how ethical values and safeguarding can be assured. The criteria for selecting best practices adopted by the European Commission’s Directorate General for Health and Food Safety cover such ethical aspects, including ensuring that conflicts of interest are clearly stated along with the measures taken to address them. In the US, the Society for Prevention Research has recently extended its work to consider the ethical challenges inherent in the promotion of evidence-based interventions (see Box 2.2). The OECD itself has general recommendations on ‘Managing Conflict of Interest in the Public Service’ (OECD, 2003[28]) and ‘Principles for Managing Ethics in the Public Service’ (OECD, 2000[29]). Ethical conduct principles also extend to relationships between policy makers and providers of evidence, as well as to effective communication with the public (Mair et al., 2019[30]). This can help limit undue pressure from policy makers on research providers that might compromise the quality of the evidence and distort trust in evidence-informed policy-making processes.
Box 2.2. Ensuring the ethical actions of researchers providing advice on policy implementation
The ethical challenges involved in the implementation of evidence-based interventions have been explored by the Society for Prevention Research in the US. Seven value statements were developed to promote the ethical conduct of implementation scientists, who should:
Be guided fundamentally by intent to maximize benefits and prevent harm to both individuals and public wellbeing and welfare.
Respect the rights of those whose lives they hope to improve and empower them to make decisions concerning issues that affect them.
Maintain high standards of transparency in representing themselves to stakeholders and in disseminating scientific findings related to evidence-based practices.
Provide accurate and complete information about the generalizability of available evidence, available choices, and costs of evidence-based policies to enhance the capacity of communities to make informed decisions regarding their adoption or scale-up of preventive interventions.
Disclose financial and professional conflict of interests or limitations of expertise when presenting the scientific findings to stakeholders that affect programme adoption, dissemination, and implementation strategies.
Promote ongoing communication, transparency, accountability, reliability, and reciprocity in relationships with all partners across the phases of implementation of preventive interventions.
Anticipate and respect diverse values, beliefs, and cultures of the community or population engaged in implementing an intervention.
Source: Adapted from Leadbeater (2018[31]).
Many approaches focus on ensuring that policy evaluations and other policy-related research are conducted in ways that safeguard the dignity, rights, safety and privacy of participants (e.g. OMB guidance). Given the increasing use of administrative data for evaluation purposes, issues of informed consent arise when information provided by citizens is used for purposes other than those to which the citizens originally consented. For example, some secondary analyses use individuals’ identifiers to link public data to existing longitudinal datasets. When consent was given for the original purposes of collecting the data, but not for the use of the linked data, the need to balance the potential benefits of the research with respect for individuals’ autonomy and self-determination becomes a concern (Leadbeater et al., 2018[31]).
Conflicts of interest
Conflicts of interest are addressed in existing principles both as a general feature of those involved in public service and more specifically in relation to those providing evidence for decision making. A key part of standards of public life is that officials do not act or take decisions so as to gain financial or other material benefits. The UK Committee on Standards in Public Life stipulates that officials must declare and resolve any interests and relationships that could give rise to conflicts of interest. In France, the High Authority for the Transparency of Public Life ensures that officials carrying high-level political responsibilities cannot be placed in a situation of conflict of interest, with a wide range of application.1 There is also a literature on managing conflicts of interest in the evidence synthesis process (Healy, 2019[32]; Sturmberg, 2019[33]), and many clearinghouses and evidence repositories that conduct evidence synthesis have policies and procedures for managing conflicts of interest. The OECD’s approach is described in Box 2.3.
Box 2.3. OECD Guidelines for managing conflict of interest in the public service
1. Identify relevant conflict of interest situations
2. Demonstrate leadership commitment
a. Organisations should take responsibility for the effective application of their Conflict of Interest policy.
3. Create a partnership with employees: awareness, anticipation and prevention
a. Ensure wide publication and understanding of the Conflict of Interest policy.
b. Review ‘at-risk’ areas for potential conflict of interest situations.
c. Identify preventive measures that deal with emergent conflict situations.
d. Develop an open organisational culture where dealing with conflict of interest matters can be freely raised and discussed.
4. Enforce the Conflict of Interest policy
a. Provide procedures for establishing a conflict of interest offence, and proportional consequences for non-compliance with Conflict of Interest policy including disciplinary sanctions.
b. Develop monitoring mechanisms to detect breaches of policy and take into account any gain or benefit that resulted from the conflict.
c. Co-ordinate prevention and enforcement measures and integrate them into a coherent institutional framework.
5. Initiate a new partnership with the business and non-profit sectors
a. Create partnerships for integrity with the business and non-profit sectors by involving them in the elaboration and implementation of the Conflict of Interest policy for public officials.
b. Anticipate potential conflict of interest situations when public organisations invite the involvement of persons representing businesses and the non-profit sector.
c. Raise awareness of the Conflict of Interest policy when dealing with other sectors, and include safeguards against potential conflict of interest situations when co-operating with the business and non-profit sectors.
Source: Adapted from OECD (2003[28]), Recommendation of the council on guidelines for managing conflict of interest in the public service.
Addressing the risk of capture and bias in the policy process
There is also a need to address the potential for bias, as external voices often try to intervene in the policy-making process to preserve or promote specific interests. The OECD’s ‘Guidelines for Managing Conflict of Interest in the Public Service’ respond to a growing demand to ensure integrity and transparency in the public sector (OECD, 2003[28]). The primary aim of the Guidelines is to help countries, at central government level, consider Conflict of Interest policy and practice relating to public officials. Demands for transparency in public decision-making have also led to concerns over lobbying practices. In 2009, the OECD reviewed the data and experiences of government regulation, legislation and self-regulation, leading to the ‘10 Principles for Transparency and Integrity in Lobbying’. These issues are also pertinent to evidence-informed policy-making and capacity building. Commercialisation of capacity-building activities can create pressure to overstate benefits, leading to an erosion of confidence if expectations are not met (Leadbeater et al., 2018[31]).
Moreover, the rapid spread across governments of emerging technologies such as AI systems in decision-making processes raises pressing issues in acknowledging and mitigating bias in algorithms to ensure integrity and maintain trust. To illustrate this point, the federal government of Canada has explored the responsible use of AI in government, establishing an Algorithmic Impact Assessment (AIA) tool to help designers evaluate the suitability of their AI solutions, and creating a set of guidelines to complement it (Government of Canada, 2019[34]). This is discussed further in the section on building evidence through emerging technologies and mobilising data.
Given that misperceptions affect the success of policy interventions, as well as the eventual trust and buy-in of stakeholders and the public (Mair et al., 2019[30]), it is vital that perceived and alleged conflicts of interest are also proactively tackled and clarified for the public.
Key questions
Are there provisions to ensure appropriate conduct of public officials involved in providing evidence for decision-making?
Are procedures in place to ensure the impartiality of those providing evidence to decision makers?
Are robust procedures in place for disclosing and managing potential perceived and real conflicts of interests of those providing evidence to decision makers?
Are experts providing evidence able to provide objective, clear, neutral and timely information?
Are procedures in place to ensure that evidence experts respect the limits of their own knowledge and understanding and communicate the limitations of evidence?
Accountability
Accountability in decision-making means that those who set the rules and shape of the official evidence advisory systems used to inform policy-making should have a formal public mandate, and that final decision authority for policies informed by evidence should lie with democratically representative and publicly accountable officials.
Why is it important?
Accountability reflects the extent to which advisers and public officials are answerable for their actions. This also matters from the perspective of open and inclusive government, as reflected in the OECD Recommendation of the Council on Open Government (OECD, 2017[35]).
Summary of the mapping of existing approaches
Approaches to accountability in decision-making have focused on creating clear distinctions between the roles of advisors and public officials, and ensuring elected officials have leadership over the evidence advisory process.
Clear roles of policy makers and advisors in the evidentiary advisory process
Creating clear distinctions between the roles of policy makers and advisors within the policy-making process ensures that the final decision rests with those who are democratically elected and whose actions are accountable to the public. As governments have a democratic mandate, public officials must take into account “a wide range of factors and recognise that science is only part of the evidence government must consider in developing policy” (Government Office for Science, 2010[36]).
The role of advisors is to present evidence to policy makers, whereas the role of policy makers and elected officials is to define policy and choose between options with different trade-offs (Gluckman, 2014[25]). To ensure accountability, the Nuffield Council on Bioethics states that “there should be explicit acceptance and acknowledgement of where responsibility for governance lies and how it might legitimately and democratically be influenced” (Nuffield Council on Bioethics, 2012[37]).
In many countries governments will appoint ministerial advisors to “increase the responsiveness of government and help address strategic challenges faced by government leaders” (OECD, 2011[38]). Ministerial advisors will gather evidence and co-ordinate with stakeholders within and outside government, and will advise the elected officials based on the evidence. For more information on the ministerial advising position and the associated challenges, see Box 2.4.
Box 2.4. Creating a Governance Framework for Ministerial Advisors
The OECD conducted a survey of 27 countries on their ministerial advising systems and found that there remains considerable room for improvement in developing a clear governance framework for ministerial advisors. The survey findings point to the following as potential avenues of reform:
Clearly stating advisors’ job descriptions, delineating their power and functions as distinct from those of senior public servants and setting the boundaries they may not overstep;
Setting clear standards of integrity specifically for ministerial advisors and ensuring that they disclose their private interests pro-actively to identify and prevent conflict of interest;
Increasing transparency not only in respect to the number of advisors but of their overall cost, profiles and competencies;
Clarifying the accountability structure governing ministerial advisors.
Source: Adapted from OECD (2011[38]), Ministerial Advisors: Role, Influence and Management.
Leadership in the evidentiary advisory process
In the evidentiary advisory process, democratically elected officials should have leadership over the structure and process. The National Institute for Health and Care Excellence highlights the importance of effective leadership for accountability in its guidelines: this means having ‘visible, proactive and inspiring leadership’ (National Institute for Health and Care Excellence, 2013[39]). Policy makers and elected officials should build a sense of trust while maintaining clear boundaries between those who supply evidence and those who make decisions based on the evidence supplied. It should be clear to the public and to all of those involved in the policy-making process that policy makers and elected officials have authority over the policy-making process and the final policy decisions.
The process through which evidence is provided to policy makers should be created by those who are democratically elected (Shaxson, 2019[3]). The process should be clearly defined, with a clear starting and ending point for evidence provision and policy advising, after which policy makers deliberate and make policy decisions for which they are accountable to the public.
The ministerial advisory system should be managed by government officials, who can do so by playing a role in appointing advisors and by providing guidance on ministerial advisors’ functions and terms of employment. Government officials can also set clear standards of integrity for ministerial advisors (OECD, 2011[38]). In enacting these measures, they can ensure that the advisory system providing evidence is structured and managed by democratically elected officials.
Aside from individual ministerial advisors, many governments also receive advice from advisory bodies and have sought to create clear roles and a separation between government and advisory bodies to ensure the independence of policy advice. A survey of 15 countries conducted by the OECD found that many countries have put measures in place to regulate the division of roles between advisory bodies and elected officials, including regulations on conflict of interest, ethics and corruption (OECD, 2017[40]).
While the separation of responsibilities and the independence of advisory bodies are crucial, the right balance must be struck between the extremes of isolation and dependence. Co-creation, iteration and effective operational relationships help evidence providers understand the needs of policy-making and respond better to the complexity of the tasks (Mair et al., 2019[30]; OECD, 2017[40]).
Key questions
Are there clear policies and procedures in place clarifying the responsibilities of those providing evidence to the policy-making process?
Are the roles of advisors and policy makers clearly defined and understood?
Are those involved in the structures and processes of the evidence advisory system obliged to give an account of and report on their activity?
Contestability
Evidence must be open to critical questioning and appeal, which can include enabling challenges over decisions about which evidence to use.
Why is it important?
The need for contestability – in the form of appeals processes and opportunities for public debate – is premised on the principle that having data, evidence and arguments open to questioning and challenge is a key element of the scientific process (Hawkins and Parkhurst, 2016[41]) and a critical part of good governance to ensure trustworthy participatory processes. Furthermore, evidence and its policy implications are typically uncertain and open to debate and disagreement, suggesting that a range of voices and views needs to be heard in the process of moving from the evidence base to tangible policy solutions.
Summary of the mapping of existing approaches
Processes for when evidence is uncertain and contested
Most of the approaches dealing with contestability highlight the importance of ensuring an openness to critical questioning, covering both the evidence that has been used and the processes through which it has informed decision-making (Shaxson, 2019[3]).
Many of the evidence-based clearinghouses and What Works centres have some form of stakeholder input to provide critical contribution and scrutiny of the evidence that underpins an assessment. This can be formalised through entities such as the ‘Stakeholder Advisory Review Groups’ that exist for the reviews carried out by the Campbell Collaboration (2019[42]). The identification and involvement of a Stakeholder Advisory Review Group can have a number of benefits, including assisting the review team by critically challenging decisions taken during the systematic review process – such as the inclusion and exclusion criteria – in order to maintain relevance to the stakeholder audience.
The European Union has a number of approaches that use stakeholder engagement to tackle the contestability and uncertainty of evidence used in decision-making2. The Commission’s Principles and guidelines on the collection and use of expertise (European Commission, 2002[43]) note that many policy decisions must be made on contentious issues in the face of uncertainty. This can leave decision makers confronted with conflicting expert opinions, coming variously from within the academic world and stemming from different starting assumptions and objectives. The Commission provides several practical questions to help Commission departments design arrangements appropriate to the circumstances of specific cases. These practical questions include the following:
Is the advice properly substantiated and documented?
Should the advice be submitted to other persons for comments or validation?
Will this be a scientific peer review?
Is it appropriate to submit the advice to scrutiny and comments from a wider circle of experts and interested parties?
Have arrangements been put in place to record and assess unsolicited comments once advice has been published?
Does the issue require interaction between the experts, interested parties and policy-makers?
The European Food Safety Authority (European Food Safety Authority, 2014[44]) has produced its own detailed guidance on ‘expert knowledge elicitation’ (see Box 2.5).
Box 2.5. The European Food Safety Authority’s Guidance on Expert Knowledge Elicitation in Food and Feed Safety Risk Assessment
In the EU, risk assessment in food and feed safety is the responsibility of the European Food Safety Authority (EFSA). In 2012, a working group was established to develop guidance on expert knowledge elicitation appropriate to EFSA's remit.
In an ideal world, quantitative risk models should be informed by systematically reviewed scientific evidence. In practice, empirical evidence is often limited and in such cases, it is necessary to turn to expert judgement. However, psychological research has shown that unaided expert judgement of the quantities required for risk modelling - and particularly the uncertainty associated with such judgements - is often biased, thus limiting its value.
To address this issue, methods have been developed for eliciting knowledge from experts in as unbiased a manner as possible. EFSA’s guidance first presents expert knowledge elicitation as a process beginning with defining the risk assessment problem, moving through preparation for elicitation (e.g. selecting the experts and the method to be used) and the elicitation itself, culminating in documentation.
Three detailed protocols for expert knowledge elicitation – which can be applied to real-life questions in food and feed safety – are provided, and the pros and cons of each are examined:
The Sheffield Protocol with group interaction of experts (behavioural aggregation)
The Cooke Protocol with use of seed questions for the calibration of experts (mathematical aggregation)
The Delphi protocol on written individual expert elicitation (i.e. remote) with feedback loops (mixed behavioural and mathematical aggregation).
The guidance also contains principles for overcoming the major challenges to expert knowledge elicitation: framing the question; selecting the experts; eliciting uncertainty; aggregating the results of multiple experts; and documenting the process.
Source: Adapted from European Food Safety Authority (European Food Safety Authority, 2014[44]), Guidance on Expert Knowledge Elicitation in Food and Feed Safety Risk Assessment.
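The mathematical aggregation mentioned in the protocols above can be illustrated with a linear opinion pool, one simple way to combine experts’ probability judgements. The function and the calibration weights below are hypothetical: EFSA’s actual protocols are considerably more elaborate, and the Cooke protocol derives its weights from performance on seed questions rather than the fixed values used here.

```python
# Illustrative sketch only: a linear opinion pool for combining expert
# probability judgements. The weights are hypothetical calibration scores;
# this is not drawn from EFSA's guidance itself.

def pool_expert_probabilities(estimates, weights):
    """Combine experts' probability estimates for the same event
    using a weighted (linear) opinion pool."""
    if len(estimates) != len(weights):
        raise ValueError("one weight per expert required")
    total = sum(weights)
    if total <= 0:
        raise ValueError("weights must sum to a positive value")
    # Normalise the weights so they sum to 1, then take the weighted average.
    return sum(w / total * p for w, p in zip(weights, estimates))

# Three experts judge the probability of a contamination event;
# the weights reflect (hypothetical) calibration performance.
pooled = pool_expert_probabilities([0.10, 0.25, 0.40], [0.5, 0.3, 0.2])
print(round(pooled, 3))  # 0.205
```

Better-calibrated experts receive larger weights and therefore pull the pooled estimate towards their judgement, which is the intuition behind mathematical aggregation in the Cooke protocol.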
A further set of issues about the contestability of evidence concerns the erosion of the authority of science in the context of social media use, the increased diffusion of fake news, and an increasingly polarised, values-based political debate (Allcott and Gentzkow, 2017[45]). Social media outlets such as Facebook and Twitter have a fundamentally different structure from earlier media technologies. It is now possible for content to be relayed among users without significant third-party verification, fact checking or editorial process (Allcott and Gentzkow, 2017[45]). This unverified material makes it easier both to diffuse ‘fake news’ and to cast doubt on legitimate news. Such news imposes both private and societal costs by making it more difficult for citizens to infer the true state of the world. This is compounded by the so-called ‘echo chamber effect’, whereby online communication leads to selective exposure and ideological segregation (Barberá et al., 2015[46]). Initiatives such as the International Fact Checking Network use algorithms to identify fake content and to validate information sources – two approaches developed to address these issues (Figueira and Oliveira, 2017[47]).
Other initiatives work on bridging the gap between the cultures of the various stakeholders and providing insight into the use of evidence in political behaviour. Examples include France Stratégie, which examines the difficulties involved in the production, distribution and circulation of ‘correct’ information from the point of view of elected representatives, researchers, administrators, community leaders, journalists and think tanks (France Stratégie, 2019[48]), and the French Council of State, which recommends using evaluations as part of democratic deliberations (Conseil d'Etat, 2020[49]). Similarly, the European Commission also examines how evidence-informed policy-making processes influence political behaviour (Mair et al., 2019[30]).
Processes for stakeholder engagement in participatory evidence assessments and recommendations
Many of the evidence-based clearinghouses conducting health technology assessments and assessments of social services have procedures for relevant stakeholders to request a reassessment of the judgment reached about an intervention. For example, the US Clearing House HomVEE has a detailed process that enables stakeholders to request a reconsideration of ‘evidence determinations’ (see Box 2.6). The UK Early Intervention Foundation has a similar process, allowing stakeholders to request a reassessment of an evidence rating if they feel that the standards of evidence have not been properly applied (Early Intervention Foundation, 2017[50]).
The notion of contestability is also relevant to evaluations carried out by other elements of the government ecosystem. The International Standards of Supreme Audit Institutions (INTOSAI, 2016[51]) aim to help Supreme Audit Institutions and other entities in charge of evaluation to analyse neutrally and independently the utility of a public policy. This guidance recommends a ‘Clearing Stage’ in which a draft report containing the results and analyses is shared and discussed with stakeholders of the policy evaluated. The purpose of this exercise is to ensure that the provisional analyses and conclusions are accurate and that the opinions of stakeholders can be gathered and integrated where appropriate, thus ensuring accountability (INTOSAI, 2016[51]). This form of stakeholder engagement is also a critical part of Regulatory Impact Assessment, in order to elicit feedback from citizens and other affected parties so that regulatory proposals can be improved and broadly accepted by society (OECD, 2019[52]; OECD, 2018[53]).
Box 2.6. HomVEE’s ‘Requests for Reconsideration of Evidence Determinations’
In the event that a US State, a researcher, an intervention developer or another stakeholder believes that the application of the Department of Health and Human Services (HHS) criteria for evidence of effectiveness to a particular intervention contains an error, this can be raised with the review team.
A ‘request for reconsideration of an evidence-based determination’ can be submitted on the grounds of misapplication of the HHS criteria, missing information, or errors on the HomVEE website.
The HHS review team considers the request and, if approval is granted, a re-review team composed of members external to the original team conducts a new, independent review. The re-review team provides assurance that its members have no actual or perceived conflicts of interest, and is certified and trained in the HomVEE standards.
The re-review team uses the original empirical articles (see the model reports), any information submitted by the individual raising the concern and the original review team’s reports, and makes any needed queries to the original team. The goal is to issue a final decision on whether the standards were accurately applied within 60 days of the submission of the request for review. Following the decision, the requester is notified and, if necessary, adjustments are made to the model reports or the HomVEE website.
Source: Adapted from HomVEE (2018[54]), Assessing Evidence of Effectiveness.
In the health area in the UK, the National Institute for Health and Care Excellence (NICE) has developed a set of principles that guide the development of NICE guidelines and standards. One of these principles stipulates that people interested in a topic area should be given the opportunity to comment on and influence NICE’s recommendations (see Box 2.7). In the area of sustainable development, the French National Institute for Industrial Environment and Risks (INERIS/CDDEP) has also developed guidelines for engaging with stakeholders3.
Box 2.7. Stakeholder engagement in NICE recommendations
NICE recommendations are based on complex considerations of the evidence by NICE committees, and it is important therefore that a wider group of stakeholders also have the opportunity to comment.
This wider consultation helps ensure the validity of the final recommendations. The principles of the NHS Constitution also require NICE to be accountable to the public, communities and patients that NICE serves, and to make decisions in a clear and transparent way.
NICE’s guidance and standards are therefore developed using a process that takes into account the opinions and views of the people who will be affected by it. NICE consults openly with organisations that represent people using services, carers and the wider public as well as health and social care professionals, NHS organisations, industry, social care businesses and local government.
NICE’s advisory committees consider and respond objectively to comments and, where appropriate, amend the recommendations.
Source: adapted from NICE (National Institute for Health and Care Excellence, 2013[39]).
Key questions
Is the evidence used in decision making open to critical questioning, as well as the underlying process through which it was used?
Are there opportunities for opening up processes through stakeholder engagement, to ensure participatory processes, including with experts, in policy areas where the evidence is contentious and uncertain?
Are there procedures to ensure that knowledge coming from experts on contested issues is as unbiased as possible?
Are there opportunities for stakeholders to comment on and influence the recommendations produced on the basis of evidence?
Public representation in decision making
There should be public engagement that enables stakeholders and members of the public to bring their multiple, competing values and concerns into the evidence utilisation process, even if not all of these concerns can be reflected in the final policy decision.
Why is it important?
Governments have long used engagement and participation to earn trust and overcome complexity. Traditionally, engagement has focused on securing buy-in from communities for a policy. But engagement should not focus solely on buy-in, or on ‘managing’ citizens, stakeholders and their expectations in order to minimise opposition. Rather, public servants should see the public as a source of expertise; engaging with the public, particularly through innovative engagement methodologies such as public deliberation, can forge a partnership to overcome complexity. Finally, engagement supports countries’ strategies and initiatives towards open government (i.e. transparency, integrity, accountability and stakeholder participation), in line with the OECD Recommendation of the Council on Open Government (OECD, 2017[35]), as a component of good governance and of a democracy that promotes inclusiveness (OECD, 2016[55]).
Summary of the mapping of existing approaches
Guidelines and standards that discuss public deliberation focus on stakeholder and public consultations, and public inclusion in evidence synthesis.
Stakeholder and public consultations
Many of the approaches focus on the importance of including a variety of stakeholders in the policy-making process. The OECD encourages public deliberation that includes a wide range of stakeholders in public consultations to ensure that policy “serves the public interest and is consistent with societal values” (OECD, 2017[56]). Guidelines published by Oxford University also emphasise including a larger range of stakeholders to increase public participation in the policy process as well as in the creation and use of evidence (Oliver and Pearce, 2017[57]). Like these initiatives, other sources specify that consultations enable stakeholders to reach consensus on research questions and shape research agenda decisions, and can be a factor in successful applications (Ferri et al., 2015[58]).
Stakeholder engagement can include a variety of groups within the public, with different guidelines emphasising the importance of different groups. The European science organisation EuroScientist highlights the need to include stakeholders from the private sector, civil society organisations and NGOs in public deliberations (EuroScientist, 2017[59]). Ferri et al. (2015[58]) explore interventions’ effectiveness together with a stakeholder consultation to ensure broader participation in the definition of research needs. In this in-depth analysis, the authors report on international initiatives such as the James Lind Alliance (2019[61]), a non-profit initiative dedicated to stakeholder involvement in research priority setting, which enables patients, carers and clinicians to reach consensus on research questions and explicitly excludes the pharmaceutical industry and non-clinician researchers.
It is also important to ensure that the people receiving services are consulted, including marginalised groups, to understand how they feel about the policy and the ways in which they feel it will affect them (Bond, 2018[62]). The Society for Prevention Research also includes the need for consultations with communities, institutions and public agencies (Leadbeater et al., 2018[31]).
Public deliberation can allow politicians to understand what the public thinks on different issues, and allows the public to discuss their views and what they want more thoroughly than is possible through polls or referendums (Chwalisz, 2017[63]). One example of a forum for public deliberation is the mini-public, in which a group of randomly selected individuals (usually 24 to 48 people) come together, deliberate on an issue for a period of time (usually two to four months) and try to come up with a solution (Chwalisz, 2017[63]). Another interesting example is the Citizens’ Assemblies that have been put in place in France to identify ways to address climate change4.
For more information on mini publics, see Box 2.8.
Box 2.8. Using Mini-Publics for Deliberation
In the public deliberation process, some governments will form mini-publics, where a group of citizens will come together and deliberate on a certain policy or issue. The organisation New Democracy has collected data on mini-publics in Australia through participant surveys and has received very positive feedback.
In the deliberation process, the mini-public members establish their own agreed behavioural guidelines, set the criteria for evaluation, gather and test information, brainstorm solutions, prioritise the possibilities, agree on recommendations, account for minority opinions when consensus is not found, and collectively write a report.
Post-deliberation, many departing citizens speak of challenging but surprisingly respectful conversations, despite individual differences; deep exploration of issues, with a shared motivation to solve a problem; and an enhanced ability to think critically.
In anonymous feedback after deliberation, participants usually say they would do it again and that they want decision makers to make many more similar opportunities available to their fellow citizens.
Source: Adapted from Carson (2017[64]), New Democracy Research and Development Note: Deliberation
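The random selection step that underpins a mini-public can be sketched as a stratified civic lottery. This toy example, with a hypothetical `draw_mini_public` function and a single stratification dimension (age band), is only an illustration; real selection processes stratify across several dimensions such as gender, region and education.

```python
import random

# Illustrative sketch only: stratified random selection ("sortition") of a
# mini-public, so that the panel roughly mirrors the population across
# strata. The register, strata and function are hypothetical.

def draw_mini_public(register, panel_size, seed=None):
    """Select panel members from a register of (person, stratum) pairs,
    allocating seats to each stratum in proportion to its share."""
    rng = random.Random(seed)
    strata = {}
    for person, stratum in register:
        strata.setdefault(stratum, []).append(person)
    panel = []
    for stratum, members in sorted(strata.items()):
        # Proportional allocation, with at least one seat per stratum.
        seats = max(1, round(panel_size * len(members) / len(register)))
        panel.extend(rng.sample(members, min(seats, len(members))))
    return panel

# A toy register of 200 citizens split evenly across two age bands.
register = [(f"citizen-{i}", "18-39" if i % 2 else "40+") for i in range(200)]
panel = draw_mini_public(register, panel_size=36, seed=7)
print(len(panel))  # 36
```

Seeding the generator makes the draw reproducible for audit purposes, while sampling without replacement within each stratum keeps the panel demographically balanced.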
There are many benefits to public deliberation; however, policy makers should be aware of and acknowledge the “potential for bias and vested interests” (EuroScientist, 2017[59]). Public deliberation entails listening to and including the views of the public; policy makers must take into account many diverse views, but they do not necessarily need to agree with or integrate the opinions of public groups into the final policy. The appropriate choice of stakeholder engagement methods will depend, therefore, on the rigour and relevance of the knowledge already available about a specific issue and its context at the implementation stage (Oliver et al., 2018[65]).
Stakeholder Engagement in Evidence Synthesis
Evidence syntheses, such as systematic reviews, attempt to summarise the best available evidence on a specific issue, using transparent procedures to locate, evaluate and integrate the research findings. The technical aspects of carrying out evidence synthesis are described in chapter 3, but a further issue that has received attention is how to ensure appropriate stakeholder engagement in the process of creating evidence synthesis. Stakeholder engagement in the evidence synthesis process can ensure that research findings are more effectively put into practice (Pollock et al., 2018[66]). Stakeholder engagement can also ensure that the research has ‘real-world’ relevance and applicability. Nonetheless, other authors suggest that the appropriate methods for stakeholder engagement with knowledge production will depend on the clarity of and consensus about the core concepts shaping the research, and on whether the purpose is to generate research findings that are generalisable beyond the context of the research setting (Oliver et al., 2018[65]).
Stakeholder engagement can be done through several different means. In a review that analysed 291 systematic reviews, the most common ways that stakeholders were involved in the systematic review process were through one-off face-to-face meetings with targeted participants, general meetings open to the public, and online surveys (Pollock et al., 2018[66]). The systematic reviews varied in how they engaged stakeholders. Some reviews included stakeholders throughout the review process or during different stages of it, while others only included stakeholders in interpreting the results after the evidence was synthesised (Pollock et al., 2018[66]). Overall, the evidence is still unclear as to the best ways of including stakeholder participation in evidence synthesis.
Key questions
Are a variety of stakeholders given the opportunity to give input or feedback during the policy-making process?
Are there mechanisms to ensure meaningful engagement, so that stakeholders can discuss their views, including deliberative techniques such as mini publics?
When engaging with stakeholders, are policy makers aware of the potential biases and vested interests of the different stakeholders?
Are there mechanisms in place to ensure that the evidence collected is presented in a format that is easy to interpret and apply?
Transparency in the use of evidence
Evidence should be clearly visible and open to public scrutiny. The public should be able to see how the evidence base informing a decision is identified and utilised and for what purpose.
Why is it important?
Transparency in the gathering and use of evidence is an essential part of the democratic process. Ensuring transparency can help to build trust with the public and experts in the policy-making process (McDonald, 2010[67]). Transparency also enables accountability and facilitates stakeholder participation (OECD, 2019[68]).
Summary of the mapping of existing approaches
Transparency is well covered in existing approaches to the use of evidence, which includes the transparency of decision making and the transparency in disclosing information.
Transparency of decision making
Many guidelines and principles align with Parkhurst’s definition, emphasising that policy-makers should make information available to the public, in a form that is meaningful and understandable to the citizens and stakeholders concerned. In order to be transparent, governments need to disclose and make accessible to the public relevant government information and data through explicit open government data strategies and through securing access to information (OECD, 2016[55]).
Transparency in decision-making requires policy makers to explain and justify the process and purpose of evidence gathering. This includes disclosing how policy makers decide what evidence to use or not to use in policy creation (Ministry of Health). The Commission of the European Communities, in its report on principles and guidelines for evidence use in policy-making, stresses that policy makers should be able to explain and justify their decisions on “the way issues are framed, experts are selected, and results handled” (Commission of the European Communities). This includes disclosing the framing and methodology of the evidence, its assumptions and limitations, and any complexities or contentions within the findings used in the decision-making process. Policy makers should also disclose how they weigh the evidence and determine what evidence is useful, based on its transparent presentation. For more information on transparency in presenting evidence, see Box 2.9.
Box 2.9. Transparency in the Presentation of Evidence
The Royal Society of the UK, in their principles on the use of evidence in policy-making, outline key ways to ensure that those who are advising policy makers on research are transparent in their portrayal of evidence. This is important because evidence that is transparent in its sources and creation “is likely to be more credible, replicable and useful.”
In presenting evidence, it should:
Clearly describe the research question, methods, sources of evidence and quality assurance process,
Communicate complexities and areas of contention,
Acknowledge assumptions, limitations and uncertainties, including any evidence gaps,
Declare personal, political and organisational interests and manage any conflicts.
Source: Adapted from The Royal Society (2018[69]), Evidence synthesis for policy a statement of principles
When policy makers are not the ones who selected the evidence to consider, but are instead presented with synthesised evidence, they should be transparent about the evidence synthesis process. Transparency in evidence synthesis ensures that it is made clear how the research question was formulated and how evidence was selected for inclusion in the synthesis. For more information on transparency in evidence synthesis, see Box 2.10.
Box 2.10. Transparency in Evidence Synthesis
The European Food Safety Authority created a list of principles for the synthesis of evidence and outlined how to ensure transparency in the synthesis process. Those who synthesise information should have a protocol that explains and defines the following:
The review question and objective
The criteria for study inclusion or exclusion
The methods for searching for research studies, assessing the methodological quality of the included studies, and synthesising the data from the included studies.
Source: Adapted from European Food Safety Authority (2010[70]), Application of systematic review methodology to food and feed safety assessments to support decision making.
Many of the guidelines emphasise the importance of policy makers being transparent in their reasoning for the policy decisions they make. The EU, in its Declaration on Ethics and Principles for Science and Society Policy-Making, states that transparency within the decision-making process includes policy makers disclosing all sources of input, including non-scientific ones, that were considered when making policy decisions (EuroScientist, 2017[59]). The policy-making process is complex, and policy makers must consider many inputs, including community desires, public opinion, budget and timeline. The different inputs that factored into the decision-making process, and the way in which these factors were weighed, should be disclosed to the public so that it can understand how a final decision was made.
Transparency in disclosing information
To be transparent, governments need to take steps to ensure that information is available and accessible to the public. To ensure that the public can access the information, policy makers should use a variety of channels to disseminate it. A number of guidelines propose ways for policy makers to disseminate information, and the OECD has been generally proactive in this regard. The OECD work on Enabling the strategic use of data for productive, inclusive and trustworthy governance explores how to make sure data and digital technologies (existing and emerging) are used to enact openness – for example, using trends and patterns to mitigate emerging risks and respond to developing crises, and using data to understand problems, engage the public and provide access to insights for improving public services that meet user needs, while creating the conditions for robust, evidence-based policy making (van Ooijen, Ubaldi and Welby, 2019[71]).
The Commission of the European Communities recommends that policy makers be proactive in their communication and “constantly seek ways to better publicise and explain its use of expertise to interested parties and the public at large” (European Commission, 2002[43]). The US Office of Management and Budget advises that information should be made easily accessible by posting it online (Office of Management and Budget, 2018[72]). The OECD’s report on Scientific Advising found that many advisory boards tend to use more traditional forms of communication, such as printed reports or online editions. However, to reach a broader audience, policy makers and advisory boards should also use sources that are more frequently read by the public, including traditional media (newspapers and television) as well as social media (OECD, 2015[73]). Using these sources can facilitate a larger discussion around the policy and include a broader section of society.
In creating transparency, policy makers need to ensure that the information they make available is understandable to the public. To make information more understandable, The Royal Society recommends that policy makers use plain language, avoid jargon, and present information in a clear, concise and objective manner (The Royal Society, 2018[69]). This makes the information more accessible to the public, including individuals who are not scientific experts. However, policy makers still also need to make available the more technical and detailed information about the evidence, so that members of the public who have a greater understanding of the topic can engage with its technical aspects.
Key questions
Is the evidence underpinning policy advice clearly accessible?
Is the underlying data made publicly available through explicit open government data commitments?
Is it possible to justify decisions with reference to how issues were framed, how experts were selected and how evidence was interpreted?
Is information presented to the public in a clear and understandable fashion?
Building evidence through emerging technologies and mobilising data
Why is it important?
Due to the fast advancement of digital technology, a wealth of standards currently being developed aims to encourage and guide governments’ use of emerging technologies and data to develop, govern and improve data, and hence evidence generation, in the public sector, relying on the new power of data (e.g. Big Data, Open Data) and artificial intelligence. This chapter offers a preliminary overview of these issues, while acknowledging that this remains a fast-moving field. The goal is mainly to ensure that the reader is broadly aware of both the opportunities and the challenges offered by these technologies and by the new data environment, and of how these can be appropriately mobilised to feed into policy-relevant, evidence-informed decision-making processes.
Although these standards can be seen as a “softer” option than regulation (Wagner, 2018[74]), setting and following them is particularly important because it ensures the efficiency, consistency and ethical behaviour of public servants when they handle government-held data, as well as of those in charge of processes where these data are fed into algorithms and emerging technology systems that support policy processes and inform policy decision making. This is important to maintain trustworthy processes in an area of rapid technological development.
Summary of the mapping of existing approaches
Achieving the promise of emerging technologies through appropriate institutional set ups
With emerging technologies playing an important role in decision-making processes and in the generation, access, sharing and use of data and evidence, organisations need to earn citizens’ trust in their responsible behaviours and practices. For this, governments have often established data governance arrangements, including independent bodies, and specific legislation and frameworks covering four dimensions: ethics, privacy and consent, transparency, and security. The goal is that the outcomes of emerging technologies can represent valuable contributions to evidence-informed policy-making while adhering to values and respecting the rights of individuals.
Since what is lawful is not necessarily ethical, many governments rely on independent bodies and frameworks to ensure that data and information are generated, accessed, shared and used responsibly. Bodies operating at arm’s length from government support good data governance practices across public sector entities, building their capability to use data and to manage it as a valuable strategic asset: for instance by reducing barriers to data access, implementing data standards, and experimenting with new methodologies and with the data itself in a safe environment, while using the data ethically and responsibly. Such bodies have been set up in many OECD countries. In Portugal, for example, the National Commission for Data Protection (CNPD) is an independent entity whose authority extends throughout the country. It supervises and monitors compliance with the laws and regulations on personal data protection, with strict respect for human rights and the fundamental freedoms and guarantees enshrined in the Constitution and the law. Public and private entities must notify the CNPD of any processing of personal data they undertake. Another example is the CNIL (National Commission on Information and Liberty) in France. The latest OECD Digital Economy Outlook offers an overview of recent developments (OECD, 2020[75]): a majority of countries reported having some form of legislation in place for privacy and personal data protection. Even so, the report highlights the efforts of many countries to address the collection, processing and sharing of personal data in support of the COVID-19 response by endorsing privacy-enhancing solutions, such as systems that manage encrypted data to prevent access to personal data.
Countries face a number of key challenges in achieving a data-driven public sector (OECD, 2019[76]). One of them is guaranteeing the interoperability of data, a critical issue at both the national and the regional level for maximising the potential of data. The European Interoperability Framework seeks to establish common data standards and promote their adoption across the member states of the European Union. In Korea, this is done through the Enterprise Architecture for managing data resources, and in Italy the National Digital Data Platform (PDND) aims to improve the interoperability of national public information assets.
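In practice, interoperability rests on agencies agreeing common field names, types and codes before exchanging data. A minimal sketch in Python illustrates the idea; the schema and field names here are hypothetical, not taken from any of the frameworks above.

```python
# Hypothetical shared data standard: the field names and types two agencies
# agree to validate against before exchanging records.
SHARED_SCHEMA = {"person_id": str, "birth_year": int, "region_code": str}

def conforms(record, schema=SHARED_SCHEMA):
    """A record is interoperable if it carries exactly the agreed fields,
    each with the agreed type."""
    return set(record) == set(schema) and all(
        isinstance(record[name], expected) for name, expected in schema.items()
    )
```

A record that renames a field, omits one, or encodes a year as text would fail the check and be rejected before exchange, which is the essence of what a shared standard buys.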
Developing frameworks and principles for data governance
Data privacy, consent and agency are also fundamental for public trust when data are handled to inform evidence. With the increasing amount of data, and particularly sensitive data, being used and held by governments to create evidence for better policy-making, the careful, safe and responsible treatment of those data is essential (OECD, 2019[76]). In 2008, the OECD Recommendation for Enhanced Access and More Effective Use of Public Sector Information addressed issues of openness, access and transparent conditions for use, as well as quality and integrity, among others5. The Recommendation provided a framework for the broader and more effective use of public sector information, including: the presumption of openness as the default rule to facilitate access and re-use; integrity; access and transparent conditions for re-use; and the sharing of best practices to educate users and re-users. In 2014, the OECD Recommendation on Digital Government Strategies (2014[77]) made explicit reference to the need to develop frameworks that enable, guide and foster access to, and the use and re-use of, the increasing amount of evidence, statistics and data concerning operations, processes and results in the public sector.
Given the growing use of emerging technologies in the public sector to generate information, transparency about the data used, the purpose of their use, the algorithmic systems they inform and the decisions made on that basis is essential for accountability. Indeed, citizens may not be informed about which data are being used, how and by whom (Saidot, 2019[78]). This is why transparency in the use, access and sharing of data matters for ensuring quality and reliability, contributing to trust in the evidence generated from such data (Ubaldi et al., 2019[79]). Failing to make data or algorithms transparent can also make it difficult to standardise evidence, with possibly biased results.
Data governance instruments such as frameworks, guidelines and principles provide users with the information, resources and approaches to help them achieve ethical practices and decision-making when handling data, particularly at the practitioner level, as they promote self-regulation and control in practice. They are not intended to be prescriptive, but aim to widen common understanding and to work through ethical concerns.
Within the European Union, the General Data Protection Regulation (GDPR) came into force in May 2018. The regulation guarantees data subjects’ rights to be informed about the use of their data; to access, rectify, erase and restrict the use of their data; and to the portability of their data. It is intended to protect citizens (data subjects) with regard to the processing of their data.
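The data-subject rights listed above map naturally onto a handful of operations on the data a body holds. The following is a purely illustrative sketch: the class and method names are hypothetical, not a real API, and a production system would add authentication, logging and lawful-basis checks.

```python
import json
from dataclasses import dataclass, field

@dataclass
class SubjectRecord:
    data: dict = field(default_factory=dict)
    restricted: bool = False  # processing paused at the subject's request

class DataRegistry:
    """Hypothetical registry honouring the data-subject rights named above."""

    def __init__(self):
        self._records = {}

    def register(self, subject_id, data):
        # Data held on a subject (a lawful basis for processing is assumed).
        self._records[subject_id] = SubjectRecord(data=dict(data))

    def access(self, subject_id):
        """Right of access: a copy of everything held on the subject."""
        return dict(self._records[subject_id].data)

    def rectify(self, subject_id, key, value):
        """Right to rectification: correct a stored value."""
        self._records[subject_id].data[key] = value

    def erase(self, subject_id):
        """Right to erasure: remove the subject's record entirely."""
        self._records.pop(subject_id, None)

    def restrict(self, subject_id):
        """Right to restriction: keep the data but stop processing it."""
        self._records[subject_id].restricted = True

    def export(self, subject_id):
        """Right to data portability: a machine-readable export."""
        return json.dumps(self._records[subject_id].data)
```

The point of the sketch is that each legal right corresponds to a concrete, auditable operation, which is what makes compliance testable rather than aspirational.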
The United Kingdom developed the Data Ethics Framework to guide policy-makers and data analysts through the ethical implications of the work they are undertaking. The framework provides a foundation for work in the field of data science, requiring that all activity be as open and accountable as possible (GOV.UK, 2019[80]). The US Federal Data Strategy builds on the Evidence-Based Policymaking Act of 2018 (OECD, 2019[76]). Its mission is to enhance the value of federal data for the mission, service and the public good by guiding federal agencies in the areas of ethical governance, conscious design and learning culture. The strategy comprises principles, practices and an annual action plan to guide federal data management and use; the practices include guidance on issues such as how to build a culture that values data and how to promote efficient and appropriate data use. Similarly, the Canadian Data Strategy intends to support federal public services in making more strategic use of data while protecting citizens' privacy, and builds on current federal data initiatives to ensure complementarity, coherence and transparency, so that emerging opportunities are understood and quickly acted upon. New Zealand has established Principles for the safe and effective use of data and analytics, supported by the Government Chief Data Steward (see Box 2.11). Other countries have established specific posts responsible for helping government make better use of data as a resource across government, including:
A nominated Data Protection Officer forms part of the regulatory provisions of the UK's data protection legislation. There are sanctions for organisations if the Data Protection Officer is not properly resourced or supported.
In France, the Lemaire Act serves the purpose of promoting greater transparency: it aims to ensure a trustworthy public data service by encouraging innovation and building a framework of trust that guarantees the rights of users while protecting their personal data (Dreyfus, 2019[81]).
In Ireland, the National Research Ethics Committees Bill (2019[82]), along with parallel secondary legislation on clinical trials, promotes a streamlined, regulated and fit-for-purpose model for the ethical review of health research projects, considering for example “whether there are adequate safeguards in place to protect the privacy of individuals participating in the health research and the confidentiality of their personal data” (Head 23) and cases where it is not a requirement to provide personal data (Head 33). Equivalent laws exist in many EU and OECD countries.
Although making data access, sharing and use open and transparent is important for maintaining public trust, it is equally crucial to ensure data security, and efforts to secure the processing and protection of data are essential. The growing number of sophisticated attackers is a challenge that needs to be addressed by strengthening digital security in a comprehensive manner, both within the public sector and in engagement with citizens. Government needs not only to protect itself but also to help citizens understand how to keep themselves safe in their online interactions.
Box 2.11. New Zealand Principles for the safe and effective use of data and analytics
The Government Chief Data Steward (GCDS) role supports the use of data as a resource across government to help deliver better services. The GCDS aims to facilitate and enable a joined-up approach across government. As well as developing policy and infrastructure, the GCDS provides support and guidance so agencies can use data effectively, while maintaining the trust and confidence of citizens.
As part of its lead role on data, the Government Chief Data Steward, together with the Privacy Commissioner, has developed the Principles for the safe and effective use of data and analytics. These principles support the development of government agencies’ guidance on the use of data and analytics for decision-making, including:
Deliver clear public benefit: The use of data and analytics must have clear benefits for the population;
Focus on people: Keep in mind the people behind the data and how to protect them against misuse of information;
Ensure data is fit for purpose: Using the right data in the right context can substantially improve decision-making and analytical models, and will avoid generating potentially harmful outcomes;
Maintain transparency: Transparency supports collaboration, partnership, and shared responsibility, and is essential for accountability;
Retain human oversight: Ensure that significant decisions based on data involve human judgement and evaluation, and that decision-making processes are regularly reviewed to make sure they are still fit for purpose; and
Understand the limitations: Developing data capability helps to create depth of understanding and to implement the most useful data tools while keeping any limitations in mind.
Source: New Zealand Government (2020[83]), Government Chief Data Steward (GCDS), https://www.data.govt.nz/about/government-chief-data-steward-gcds/; Principles for the safe and effective use of data and analytics, https://www.stats.govt.nz/about-us/data-leadership/ (Data.gov.nz, 2019[84]).
Balancing the dynamics of innovation and new technologies through good governance
Recent OECD work on the regulation and governance of innovation6 recognises that quickly evolving technologies are shifting control away from governments. The challenge is to maintain a balance between innovation and other values, such as privacy, transparency and accountability, in a complex and rapidly evolving context. This shift in power from government to a broad range of non-governmental actors requires multi-stakeholder involvement, as these actors play increasingly important roles in representing a variety of interests, and in setting norms, in the digital society. The inherent challenge is that social and economic actors are increasingly influenced by processes that are invisible to governments. For example, the proliferation of algorithms, with their ‘black box’ nature, creates information asymmetries between the public and private sectors, which raise complex accountability, transparency and regulatory concerns. This is all the more the case as such algorithms affect the data, and the evidence, that are produced as a result. Where companies use machine learning algorithms, it can be extremely difficult for regulators to determine what an algorithm is doing and why. One possible solution is to archive versions of the algorithm and make them available to regulators under a non-disclosure agreement. Because non-government actors represent different interests that need to be reconciled, multi-stakeholder approaches are important in setting societal and regulatory goals.
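The idea of archiving algorithm versions for later regulatory inspection can be made concrete with content addressing: each snapshot of an algorithm's parameters is stored under its cryptographic hash, so a regulator can verify which version was in production when a given decision was made. This is a minimal sketch under stated assumptions (the class name and the parameter-dictionary representation are illustrative, not an established standard).

```python
import datetime
import hashlib
import json

class AlgorithmArchive:
    """Illustrative sketch: content-addressed snapshots of an algorithm's
    parameters, which a regulator (under a non-disclosure agreement) could
    use to verify which version produced a given decision."""

    def __init__(self):
        self._versions = {}  # digest -> (timestamp, serialised parameters)

    def archive(self, parameters):
        """Store a snapshot and return its content hash (the version id)."""
        blob = json.dumps(parameters, sort_keys=True).encode()
        digest = hashlib.sha256(blob).hexdigest()
        stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
        self._versions[digest] = (stamp, blob)
        return digest

    def verify(self, digest, parameters):
        """Check that claimed parameters match the archived snapshot."""
        blob = json.dumps(parameters, sort_keys=True).encode()
        return hashlib.sha256(blob).hexdigest() == digest
```

Because the version identifier is derived from the content itself, neither the operator nor the archive can silently substitute a different version after the fact, which is what gives the arrangement its accountability value.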
The use of advanced data analytics, with machine learning and artificial intelligence, will require objectives to be made explicit, exposing policy trade-offs that had previously been implicit and obscured (Coyle and Weller, 2020[85]). Many machine learning systems work as black boxes, with an implicit bias that results from the way they have been specified in terms of users’ experience. The question is the extent to which weighting decisions and trade-offs can be left to systems that can only maximise broad utilitarian goals; this hampers the demands for explaining, justifying and maintaining accountability for the evidence that such approaches produce. These are frontier issues in developing policies fit for the times, but they will have to be resolved if these technologies are to play a substantive role in the future beyond mere enforcement decisions in areas such as health, policing or justice.
The development of principles for the use of emerging technologies and new data and their impact on evidence-informed policymaking
The demand for principles has grown massively with the increasing use of emerging technologies, which also directly affects the generation and use of evidence for policy-making. Notable is the increasing focus on establishing ethical principles as a supporting framework for developing public policy in a way that avoids setting regulations that might, in turn, have a negative impact on data access and sharing. Many private organisations use self-regulation to address these ethical concerns.
Work is also undertaken at the sectoral level, particularly in health and social services, where data are critical for evidence-informed decision-making and yet where many ethical issues arise. The Recommendation of the OECD Council on Health Data Governance (OECD, 2017[56]) calls on countries to develop and implement frameworks that secure privacy while enabling uses of health data that are in the public interest. It recommends that governments establish and implement a national health data governance framework that considers informed consent and appropriate alternatives: clarifying whether individuals’ consent to the processing of their personal health data is required and, if so, the criteria used to make this determination, what constitutes valid consent, and how consent can be withdrawn. The New Zealand Social Investment Agency and Statistics New Zealand are developing a shared set of rules for the safe, ethical and transparent use of social sector data (Cabinet Social Policy Committee, 2017[86]). To generate useful data for evidence-informed policy-making, analysts and researchers often need to access personal data across integrated data infrastructures. Such linking, including integration through unique identifiers, needs to happen while protecting the privacy and consent of users.
Given the fast development of these areas, it is essential, beyond privacy and consent, for organisations to be transparent and accountable about the way new technologies are used, the way data are treated, and how they contribute to policy-making. To this end, governments develop frameworks or principles to advise public servants on their behaviour and on best practices when feeding data into algorithmic systems.
As mentioned above, some countries have established their own principles to ease information flows and to set standards that enable data to be collected, analysed and stored in consistent ways. Data principles aim to guarantee that citizens’ data are treated responsibly, which increases the reliability of evidence, encourages more accurate policy-making and builds public trust. The OECD has also developed a set of principles for artificial intelligence that are practical and flexible enough to stand the test of time in a rapidly evolving field (Box 2.12). They complement existing OECD standards in areas such as privacy, digital security, risk management and responsible business conduct (OECD, 2019[26]). These principles serve not only to promote international standards for developing and using AI systems (which lead to standardised methods for generating evidence) but also to strengthen the trustworthiness of AI outcomes and of the policy decisions they inform.
Box 2.12. The OECD Principles for Responsible Stewardship of Trustworthy AI
The Recommendation identifies five complementary values-based principles for the responsible stewardship of trustworthy AI:
1. AI should benefit people and the planet by driving inclusive growth, sustainable development, and well-being.
2. AI systems should be designed in a way that respects the rule of law, human rights, democratic values and diversity, and they should include appropriate safeguards – for example, enabling human intervention where necessary – to ensure a fair and just society.
3. There should be transparency and responsible disclosure around AI systems to ensure that people understand AI-based outcomes and can challenge them.
4. AI systems must function in a robust, secure, and safe way throughout their life cycles, and potential risks should be continually assessed and managed.
5. Governments, organisations, and individuals developing, deploying, or operating AI systems should be held accountable for their proper functioning in line with the above principles.
Source: (OECD, 2019[26]).
In addition to data and AI principles, it is equally important to equip public servants with guidelines for handling data in flexible and agile ways while meeting high ethical standards. Assuring citizens that data about them are handled by public servants who act responsibly and accountably, according to published guidelines, will increase public trust. The OECD has therefore been developing draft Good Practice Principles for Data Ethics in the Public Sector (Box 2.13).
Aimed at policy-makers, statisticians, analysts, data scientists and any public officials handling data, these guidelines provide public servants with a framework for the appropriate processing of data. The proposed guidelines in the box below are intended to promote the ethical behaviour of public servants and to protect the rights of data subjects (citizens). The standardisation of behaviour and the consistency of conduct that they ensure help maintain public trust.
Box 2.13. Good Practice Principles for Data Ethics in the Public Sector
Draft Good Practice Principles for Data Ethics in the Public Sector have been developed by the OECD Working Party on Senior Digital Government Officials in the context of the OECD work on digital government. As part of broader considerations, the current draft suggests that:
Data use by governments should serve the public interest.
Data use by governments should deliver public good.
There is a need to explore the collective and community nature of data governance, the environmental implications of data infrastructure, and the risks for abuse in the use of data, where risks go beyond the public sphere.
The Good Practice Principles invite public officials to:
Use data with integrity. Government should not abuse its position, the data at its disposal, or the trust of the public.
Be aware of and observe relevant arrangements for trustworthy data access, sharing and use. Public officials should build knowledge of the specific governance arrangements framing data access, sharing and use, to ensure they are respected and applied.
Incorporate data ethical considerations into governmental, organisational and public sector decision-making processes. Public officials should consider incorporating data ethics into the generation of public sector data and into decisions on data collection, the funding of data projects, and the use of data by third parties.
Safeguard the agency of end-users of AI systems to make the final determination on the action taken following a machine-based recommendation. Public officials should retain control over the data they access, share and use, including to help inform the development and training of those systems.
Be specific about the purpose of data use, especially in the case of personal data. Make sure that data use has a clearly articulated purpose that explains the reason why data are being used and that addresses the concerns of different stakeholders.
Define boundaries for data collection, access, sharing and use. Make sure that your design considers balanced data use by weighing relevant societal costs and benefits, with data minimisation as the norm for personal data. This ensures the quality of the design and the ability to explain how data are being used.
Be clear, inclusive and open. The applied use of data should recognise, and mitigate, any potential bias, so that it never leads to discrimination and so that people in similar cases are always treated equally.
Broaden individuals’ and collectives’ control over their data. Citizens should be empowered and able to make decisions regarding the sharing of their personal data within, or outside of, government.
Be accountable and proactive in managing risks. Governments should design mechanisms for giving citizens insight into, and consent over, the use of their personal data, by organising internal and external accountability. Stakeholders should know where to address questions, remarks or mistakes, and governments should be responsive to citizens’ input.
Source: Adapted from OECD, (Forthcoming[87]), Good Practice Principles for Data Ethics in the Public Sector.
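The data-minimisation norm in the draft principles above can be pictured as a simple filter: only the fields required for a declared purpose leave a record. The purposes and field names in this sketch are hypothetical, chosen purely for illustration.

```python
# Hypothetical purpose register: which fields each declared purpose may use.
PURPOSE_FIELDS = {
    "vaccination_outreach": {"person_id", "region_code"},
    "service_audit": {"person_id", "service_used", "date"},
}

def minimise(record, purpose):
    """Return only the fields needed for the declared purpose,
    treating data minimisation as the norm for personal data."""
    allowed = PURPOSE_FIELDS[purpose]  # KeyError if the purpose is undeclared
    return {k: v for k, v in record.items() if k in allowed}
```

Requiring a declared purpose before any field is released is what ties the "be specific about the purpose of data use" and "define boundaries" principles together in practice.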
Principles on the standardisation of data use, AI systems and public servants’ behaviour make it possible to evaluate the quality of the commitments made. Indeed, well-designed and responsible principles enable further collaboration, easier collection and sharing of data and information, and more effective and reliable analyses for developing evidence-based contributions to policy-making.
Key questions
How can evidence be collected following national and/or international principles that coordinate the key dimensions of ethics, privacy, transparency and security in the use of, and access to, data?
How can citizens be given the opportunity to consent to the ethical use of their data to inform evidence, as a way of fostering trust in the results?
What is the role of standards in ensuring that evidence drawn from AI processes is generated in ethical ways that foster public trust?
What are the respective roles of ethical principles versus regulation to protect the core elements in the use of, and access to, data?
References
[45] Allcott, H. and M. Gentzkow (2017), “Social Media and Fake News in the 2016 Election”, Journal of Economic Perspectives, Vol. 31/2, pp. 211-236, http://dx.doi.org/10.1257/jep.31.2.211.
[46] Barberá, P. et al. (2015), “Tweeting From Left to Right”, Psychological Science, Vol. 26/10, pp. 1531-1542, http://dx.doi.org/10.1177/0956797615594620.
[15] Barlow, J. et al. (2016), “Questioning the outcome of the Building Blocks trial”, The Lancet, Vol. 387/10028, pp. 1615-1616, http://dx.doi.org/10.1016/S0140-6736(16)30201-X.
[62] Bond (2018), An introduction to the principles for assessing the quality of evidence, Bond.
[86] Cabinet Social Policy Committee (2017), “Implementing Social Investment: Report Back”, Office of the Minister of Finance and Office of the Minister of State Services, Wellington.
[42] Campbell Collaboration (2019), Campbell Collaboration Guidance for establishing and managing a Stakeholder Advisor Review Group 1, https://campbellcollaboration.org/guidance-on-establishing-managing-stakeholder-advisory-review-groups.html.
[64] Carson, L. (2017), NewDemocracy Research and Development Note: Deliberation, newDemocracy, http://www.newdemocracy.com.au/research-note-deliberation (accessed on 2 May 2019).
[63] Chwalisz, C. (2017), The People’s Verdict: Adding Informed Citizen Voices to Public Decision-making.
[49] Conseil d’Etat (2020), Conduire et partager l’évaluation des politiques publiques.
[22] Council of the European Union (2015), “Council conclusions on the implementation of the EU Action Plan on Drugs 2013-2016 regarding minimum quality standards in drug demand reduction in the European Union”, https://www.emcdda.europa.eu/system/files/attachments/8043/INT19_EU%20Min%20Quality%20Standards_ST11985.EN15.pdf.
[85] Coyle, D. and A. Weller (2020), ““Explaining” machine learning reveals policy challenges”, Science, Vol. 368/6498, pp. 1433-1434, http://dx.doi.org/10.1126/science.aba9647.
[84] Data.gov.nz (2019), Algorithm review underway to increase transparency and accountability, https://www.data.govt.nz/blog/algorithm-review-underway-to-increase-transparency-and-accountability/.
[81] Dreyfus (2019), France: Public service and processing of personal data, https://dreyfus.fr/en/2019/08/05/public-service-and-processing-of-personal-data/.
[50] Early Intervention Foundation (2017), Getting your programme assessed | EIF Guidebook.
[11] Eckenrode, J. et al. (2010), “Long-term Effects of Prenatal and Infancy Nurse Home Visitation on the Life Course of Youths”, Archives of Pediatrics & Adolescent Medicine, Vol. 164/1, pp. 9-15, http://dx.doi.org/10.1001/archpediatrics.2009.240.
[43] European Commission (2002), Communication from the Commission on the collection and use of expertise by the Commission: principles and guidelines, Commission of the European Communities, http://ec.europa.eu/governance/docs/comm_expertise_en.pdf (accessed on 22 March 2019).
[44] European Food Safety Authority (2014), “Guidance on Expert Knowledge Elicitation in Food and Feed Safety Risk Assessment”, EFSA Journal, Vol. 12/6, p. 3734, http://dx.doi.org/10.2903/j.efsa.2014.3734.
[70] European Food Safety Authority (2010), “Application of systematic review methodology to food and feed safety assessments to support decision making”, EFSA Journal, Vol. 8/6, p. 1637, http://dx.doi.org/10.2903/j.efsa.2010.1637.
[59] EuroScientist (2017), The Brussels declaration on ethics and principles for science and society policy-making, EuroScientist, http://www.euroscientist.com/policy-making-manifesto-squaring-science-human-factor (accessed on 22 March 2019).
[58] Ferri, M. et al. (2015), “What is needed in future drug treatment research? A systematic approach to identify gaps on effectiveness of drug treatment from the EMCDDA”, Drugs: Education, Prevention and Policy, Vol. 22/1, pp. 86-92, http://dx.doi.org/10.3109/09687637.2014.954988.
[47] Figueira, Á. and L. Oliveira (2017), “The current state of fake news: challenges and opportunities”, Procedia Computer Science, Vol. 121, pp. 817-825, http://dx.doi.org/10.1016/j.procs.2017.11.106.
[48] France Stratégie (2019), Expertise and democracy: Coping with mistrust, https://www.strategie.gouv.fr/english-articles/expertise-and-democracy-coping-mistrust (accessed on 25 October 2019).
[25] Gluckman, P. (2014), “Policy: The art of science advice to government”, Nature, Vol. 507/7491, pp. 163-165, http://dx.doi.org/10.1038/507163a.
[80] GOV.UK (2019), Guidance Data Ethics Framework, https://www.gov.uk/government/publications/data-ethics-framework/data-ethics-framework.
[34] Government of Canada (2019), Algorithmic Impact Assessment (AIA), https://www.canada.ca/en/government/system/digital-government/modern-emerging-technologies/responsible-use-ai/algorithmic-impact-assessment.html.
[82] Government of Ireland (2019), National Research Ethics Committees Bill.
[36] Government Office for Science (2010), Principles of scientific advice to government - GOV.UK, https://www.gov.uk/government/publications/scientific-advice-to-government-principles/principles-of-scientific-advice-to-government (accessed on 2 April 2019).
[6] Government office for Science (2010), The Government Chief Scientific Adviser’s Guidelines on the Use of Scientific and Engineering Advice in Policy Making, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/293037/10-669-gcsa-guidelines-scientific-engineering-advice-policy-making.pdf.
[41] Hawkins, B. and J. Parkhurst (2016), “The ’good governance’ of evidence in health policy”, Evidence & Policy: A Journal of Research, Debate and Practice, Vol. 12/4, pp. 575-592, http://dx.doi.org/10.1332/174426415X14430058455412.
[8] Health Evidence (2018), Quality Assessment Tool, https://www.healthevidence.org/documents/our-appraisal-tools/quality-assessment-tool-dictionary-en.pdf (accessed on 8 March 2019).
[32] Healy, D. (2019), “The crisis in Cochrane: Evidence Debased Medicine”, Indian Journal of Medical Ethics, Vol. IV/1, pp. 52-54, http://dx.doi.org/10.20529/ijme.2018.091.
[54] Home Visiting Evidence of Effectiveness (2018), Assessing Evidence of Effectiveness, https://homvee.acf.hhs.gov/Review-Process/4/Assessing-Evidence-of-Effectiveness/19/7 (accessed on 19 February 2019).
[51] INTOSAI (2016), Guidelines on the Evaluation of Public Policies, INTOSAI.
[61] James Lind Alliance (2019), About the James Lind Alliance, http://www.jla.nihr.ac.uk/ (accessed on 22 April 2020).
[7] Langeveld, K., K. Stronks and J. Harting (2016), “Use of a knowledge broker to establish healthy public policies in a city district: a developmental evaluation”, BMC Public Health, Vol. 16/1, p. 271, http://dx.doi.org/10.1186/s12889-016-2832-4.
[20] Lavis, J. et al. (2010), SUPPORT Tools for evidence-informed health Policymaking (STP) 9: Assesing the applicability of the findings of a systematic review, BioMed Central, http://dx.doi.org/10.1186/1478-4505-7-S1-S9.
[31] Leadbeater, B. et al. (2018), “Ethical Challenges in Promoting the Implementation of Preventive Interventions: Report of the SPR Task Force”, Prevention Science, pp. 1-13, http://dx.doi.org/10.1007/s11121-018-0912-7.
[10] Leviton, L. and M. Trujillo (2017), “Interaction of Theory and Practice to Assess External Validity”, Evaluation Review, Vol. 41/5, pp. 436-471, http://dx.doi.org/10.1177/0193841X15625289.
[4] Love, P. and J. Stockdale-Otárola (eds.) (2017), Debate the Issues: Complexity and Policy making, OECD Insights, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264271531-en.
[30] Mair, D. et al. (2019), Understanding our political nature: How to put knowledge and reason at the heart of political decision-making, Publications Office of the European Union, http://dx.doi.org/10.2760/374191.
[67] McDonald, R. (2010), The Government Chief Scientific Adviser’s Guidelines on the Use of Scientific and Engineering Advice in Policy Making, Government Office for Science, https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/293037/10-669-gcsa-guidelines-scientific-engineering-advice-policy-making.pdf (accessed on 22 March 2019).
[13] Mejdoubi, J. et al. (2014), “Effects of nurse home visitation on cigarette smoking, pregnancy outcomes and breastfeeding: a randomized controlled trial”, Midwifery, http://www.nursingplus.com/article/S0266-6138(13)00243-X/abstract (accessed on 31 January 2018).
[21] Moberg, J., P. Alonso-Coello and A. Oxman (2015), GRADE Evidence to Decision (EtD) Frameworks Guidance. Version 1.1, The GRADE Working Group, https://ietd.epistemonikos.org/#/help/guidance (accessed on 21 April 2020).
[18] Munthe-Kaas, H., H. Nøkleby and L. Nguyen (2019), “Systematic mapping of checklists for assessing transferability”, Systematic Reviews, Vol. 8/1, http://dx.doi.org/10.1186/s13643-018-0893-4.
[39] National Institute for Health and Care Excellence (2013), How NICE measures value for money in relation to public health interventions, https://www.nice.org.uk/Media/Default/guidance/LGB10-Briefing-20150126.pdf (accessed on 1 May 2019).
[83] New Zealand Government (2020), Government Chief Data Steward, https://www.data.govt.nz/about/government-chief-data-steward-gcds/.
[37] Nuffield Council on Bioethics (2012), Emerging biotechnologies: technology, choice and the public good, Nuffield Council on Bioethics.
[16] OECD (2020), Delivering evidence based services for all vulnerable families, http://www.oecd.org/officialdocuments/publicdisplaydocumentpdf/?cote=DELSA/ELSA/WD/SEM%282020%298&docLanguage=En.
[75] OECD (2020), OECD Digital Economy Outlook 2020, OECD Publishing, https://doi.org/10.1787/bb167041-en.
[52] OECD (2019), OECD Best Practice Principles for Regulatory Policy: Regulatory Impact Assessment, OECD, Paris.
[26] OECD (2019), OECD Principles on Artificial Intelligence, https://www.oecd.org/going-digital/ai/principles/ (accessed on 13 January 2020).
[68] OECD (2019), Openness and Transparency - Pillars for Democracy, Trust and Progress, https://www.oecd.org/fr/corruption/opennessandtransparency-pillarsfordemocracytrustandprogress.htm (accessed on 2 May 2019).
[76] OECD (2019), The Path to Becoming a Data-Driven Public Sector, OECD Digital Government Studies, OECD Publishing, Paris, https://dx.doi.org/10.1787/059814a7-en.
[53] OECD (2018), OECD Regulatory Policy Outlook 2018, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264303072-en.
[40] OECD (2017), “Policy Advisory Systems: Supporting Good Governance and Sound Public Decision Making”, https://www.oecd-ilibrary.org/docserver/9789264283664-en.pdf (accessed on 3 May 2019).
[35] OECD (2017), Recommendation of the Council on Open Government, http://www.oecd.org/gov/Recommendation-Open-Government-Approved-Council-141217.pdf.
[56] OECD (2017), Recommendation of the OECD Council on Health Data Governance, OECD, http://www.oecd.org/health/health-systems/Recommendation-of-OECD-Council-on-Health-Data-Governance-Booklet.pdf (accessed on 2 April 2019).
[55] OECD (2016), Open Government: The Global Context and the Way Forward, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264268104-en.
[73] OECD (2015), “Scientific Advice for Policy Making: The Role and Responsibility of Expert Bodies and Individual Scientists”, OECD Science, Technology and Industry Policy Papers, No. 21, OECD Publishing, Paris, https://dx.doi.org/10.1787/5js33l1jcpwb-en.
[77] OECD (2014), “Recommendation of the Council on Digital Government Strategies”.
[38] OECD (2011), Ministerial Advisors: Role, Influence and Management, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264124936-en.
[88] OECD (2008), Recommendation of the Council for Enhanced Access and More Effective Use of Public Sector Information.
[28] OECD (2003), Recommendation of the council on guidelines for managing conflict of interest in the public service, http://www.oecd.org/governance/ethics/2957360.pdf (accessed on 3 October 2018).
[29] OECD (2000), Trust in Government: Ethics Measures in OECD Countries, http://www.oecd.org/puma (accessed on 3 October 2018).
[87] OECD (Forthcoming), Good Practice Principles for Data Ethics in the Public Sector.
[72] Office of Management and Budget (2018), Building Capacity to Produce and Use Evidence, Office of Management and Budget, http://www.whitehouse.gov/omb/evidence. (accessed on 2 April 2019).
[5] Office of Management and Budget (2017), Analytical Perspectives, Budget of the United States Government: Building the Capacity to Produce and Use Evidence, Office of Management and Budget, Washington, https://obamawhitehouse.archives.gov/omb/budget/Analytical_Perspectives (accessed on 28 May 2019).
[12] Olds, D. et al. (2003), “Taking preventive intervention to scale: The nurse-family partnership”, Cognitive and Behavioral Practice, Vol. 10/4, pp. 278-290, http://dx.doi.org/10.1016/S1077-7229(03)80046-9.
[57] Oliver, K. and W. Pearce (2017), “Three lessons from evidence-based medicine and policy: increase transparency, balance inputs and understand power”, Palgrave Communications, http://dx.doi.org/10.1057/s41599-017-0045-9.
[65] Oliver, S. et al. (2018), Stakeholder Engagement for Development Impact Evaluation and Evidence Synthesis.
[1] Parkhurst, J. (2017), The politics of evidence : from evidence-based policy to the good governance of evidence, Routledge, London, http://researchonline.lshtm.ac.uk/3298900/ (accessed on 23 November 2018).
[2] Parkhurst, J. and S. Abeysinghe (2016), “What Constitutes “Good” Evidence for Public Health and Social Policy-making? From Hierarchies to Appropriateness”, Social Epistemology, Vol. 30/5-6, pp. 665-679, http://dx.doi.org/10.1080/02691728.2016.1172365.
[66] Pollock, A. et al. (2018), “Stakeholder involvement in systematic reviews: a scoping review”, http://dx.doi.org/10.1186/s40900-017-0060-4.
[23] Rantala, L. et al. (2017), “How to Earn the Status of Honest Broker? Scientists’ Roles Facilitating the Political Water Supply Decision-Making Process”, Society & Natural Resources, Vol. 30/10, pp. 1288-1298, http://dx.doi.org/10.1080/08941920.2017.1331484.
[14] Robling, M. et al. (2016), “Effectiveness of a nurse-led intensive home-visitation programme for first-time teenage mothers (Building Blocks): a pragmatic randomised controlled trial”, The Lancet, Vol. 387/10014, pp. 146-155, http://dx.doi.org/10.1016/S0140-6736(15)00392-X.
[78] Saidot (2019), A Consortium of Finnish organisations seeks for a shared way to proactively inform citizens on AI use, https://www.saidot.ai/post/a-consortium-of-finnish-organisations-seeks-for-a-shared-way-to-proactively-inform-citizens-on-ai-use.
[3] Shaxson, L. (2019), “Uncovering the practices of evidence-informed policy-making”, Public Money & Management, Vol. 39/1, pp. 46-55, http://dx.doi.org/10.1080/09540962.2019.1537705.
[33] Sturmberg, J. (2019), “Evidence‐based medicine—Not a panacea for the problems of a complex adaptive world”, Journal of Evaluation in Clinical Practice, http://dx.doi.org/10.1111/jep.13122.
[17] The California Evidence-Based Clearinghouse for Child Welfare (2018), Scientific Rating Scale, http://www.cebc4cw.org/ratings/scientific-rating-scale/ (accessed on 25 January 2019).
[19] The GRADE Working Group (2015), Key DECIDE tools, http://www.decide-collaboration.eu/ (accessed on 21 April 2020).
[69] The Royal Society (2018), Evidence synthesis for policy a statement of principles, https://royalsociety.org/-/media/policy/projects/evidence-synthesis/evidence-synthesis-statement-principles.pdf (accessed on 2 April 2019).
[79] Ubaldi, B. et al. (2019), “State of the art in the use of emerging technologies in the public sector”, OECD Working Papers on Public Governance, No. 31, OECD, Paris.
[27] UK Committee on Standards in Public Life (2019), The 7 Principles of Public Life.
[71] van Ooijen, C., B. Ubaldi and B. Welby (2019), “A data-driven public sector: Enabling the strategic use of data for productive, inclusive and trustworthy governance”, OECD Working Papers on Public Governance, No. 33, OECD Publishing, Paris, https://dx.doi.org/10.1787/09ab162c-en.
[74] Wagner, B. (2018), Ethics as an Escape from Regulation: From ethics-washing to ethics-shopping?, https://www.privacylab.at/wp-content/uploads/2018/07/Ben_Wagner_Ethics-as-an-Escape-from-Regulation_2018_BW9.pdf.
[9] Wandersman, A. et al. (2000), “Getting to outcomes: a results-based approach to accountability”, Evaluation and Program Planning, Vol. 23/3, pp. 389-395, http://dx.doi.org/10.1016/S0149-7189(00)00028-8.
[60] What Works Clearinghouse (2020), Standards Handbook (Version 4.1), https://ies.ed.gov/ncee/wwc/Docs/referenceresources/WWC-Standards-Handbook-v4-1-508.pdf (accessed on 5 February 2019).
[24] Wilsdon, J., M. Saner and P. Gluckman (2018), “INGSA Manifesto for 2030: Science Advice for Global Goals”, INGSA, http://dx.doi.org/10.1057/palcomms.2016.77.
Notes
← 1. See French Law 2013-907, which complements Law 83-634 of 1983. The earlier law concerned only senior civil servants; the 2013 law extends the scope of control to members of government and the heads of many agencies and public organisations.
← 2. Issues around uncertainty create particular challenges for conducting economic evaluation, as discussed in section 7.
← 3. See www.ecologie.gouv.fr/sites/default/files/CDDEP_Guide%20du%20dialogue%20avec%20les%20parties%20prenantes.pdf.
← 6. The 2020 OECD Global Conference on Governance Innovation addressed the need for agile regulatory frameworks in the context of the fourth industrial revolution, in order to promote outcome-focused, anticipatory approaches and enhance accountability. See www.oecd.org/fr/reformereg/politique-reglementaire/oecd-global-conference-on-governance-innovation.htm.