2. What to evaluate? Framework, criteria, and measurement methods
2.1 Three-step evaluation cycle
A comprehensive evaluation comprises three essential steps: evaluating the integrity of the process design, the deliberative experience, and the pathways to impact of a deliberative process. Together, these three steps give an evaluator a full-cycle view of a deliberative process.
Process design integrity: Organisers frame the policy question and design a deliberative process before people gather in the room to deliberate. Evaluators will ask how these decisions were reached, whether the process has clear and legitimate objectives, whether the design choices are in line with those objectives, and whether the process design allows members enough time to learn and deliberate.
Deliberative experience: Once the deliberative process begins, everything that happens “in the room” and “outside the room” matters. Important aspects include the breadth, diversity, and clarity of the evidence and stakeholders presented; the quality of facilitation; opportunities to speak; the removal of participation barriers; and the mitigation of undesired attention and/or attempts at interference.
Pathways to impact: Once a deliberative process is completed and recommendations have been produced, the spotlight turns to the uptake of those recommendations by the commissioning body. Responses and justifications are expected for all recommendations. Depending on the type of deliberative process, it may be necessary to measure its uptake by the broader public (for example, when it is followed by a referendum).
2.2 Framework
The following table sets out the evaluation framework based on the three steps of the evaluation cycle and provides an overview of the key criteria for evaluating each of them.
Table 2.1. Framework for evaluating a representative deliberative process
|  | Process evaluation |  | Outcome evaluation |
| --- | --- | --- | --- |
|  | Process design integrity | Deliberative experience | Pathways to impact |
| Objective | Evaluating the design process that set up the deliberation | Evaluating how a deliberative process unfolds “in the room” and “outside the room” | Evaluating influential conclusions and/or actions of a deliberative process |
| Criteria | Clear and suitable purpose; clear and unbiased framing; suitable design; procedural design involvement; transparency and governance; representativeness and inclusiveness | Neutrality and inclusivity of facilitation; accessible, neutral, and transparent use of online tools; breadth, diversity, clarity, and relevance of the evidence and stakeholders; quality of judgement; perceived knowledge gains by members; accessibility and equality of opportunity to speak; respect and mutual comprehension; free decision making and response; respect for members’ privacy | Influential recommendations; response and follow-up; member aftercare |
Source: Author's own creation
2.3 Evaluation criteria
The evaluation criteria outlined in the framework are detailed below:
1) Process design integrity
Clear and suitable purpose
The deliberative process was commissioned for a suitable purpose, addressing a policy issue. (For guidance, see the Catching the Deliberative Wave report (OECD, 2020a), Chapter 4, section “Scope of the remit”.)
The mandate was clear, and it was clear how the recommendations would be used.
The deliberative process was connected to the broader political system or policy-making cycle.
Clear and unbiased framing
The question addressed by the deliberative process was framed in a non-leading, unbiased, clear way, easily understandable to the wider public.
Suitable design
The design choices of a deliberative process were aligned with its objectives.
The resulting process was in line with the OECD Good Practice Principles (see Annex B), for example regarding sufficient length of the process and group deliberation.
Procedural design involvement
Organisers had an established process to call for, respond to, and recognise comments from stakeholders regarding the deliberative process design.
A wide range of stakeholders representing diverse views had an opportunity to review the deliberative process design.
Experts in the policy area were consulted over the questions and the choice of evidence provided.
Deliberative democracy experts (in-house or external) were consulted on process design.
Transparency and governance
There were clear terms of reference, rules of engagement, codes of conduct, or ethical frameworks that govern the process. They were followed throughout the process.
Information about the goals, design, and governance of the process, the funding source, and the civic lottery, as well as any other relevant materials, was published publicly.
The design of the process was free of external interference.
Representativeness and inclusiveness
Everyone had an equal opportunity, via civic lottery, to be selected as a member of a deliberative process. (For example, all residents or eligible voters.)
The final group of members was a broadly representative sample of the general public (reflecting the demographic composition of a community, city, region, or country). (Anyone looking at the members could see ‘someone like me’ within the process.)
Efforts were made to involve under-represented groups. (In some instances, it is desirable to over-sample certain demographics during the random sampling stage of recruitment to help achieve representativeness; a simplified stratified draw is sketched after this list.)
Efforts were made to remove barriers to participation. The OECD Good Practice Principles identify remuneration of the members, covering their expenses, and/or providing or paying for childcare and eldercare as helpful ways to encourage and support participation.
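To make the over-sampling and stratification ideas above concrete, here is a minimal Python sketch of a stratified civic-lottery draw. The respondent pool, the single “age” attribute, and the target counts are invented for illustration; real civic lotteries stratify on several attributes simultaneously and use dedicated selection tools.

```python
import random

# Hypothetical respondent pool from the random-invitation stage, each
# tagged with one demographic attribute; 'targets' is the desired panel
# composition (here, invented counts per age group).
pool = [{"id": i, "age": random.choice(["18-34", "35-54", "55+"])} for i in range(500)]
targets = {"18-34": 10, "35-54": 14, "55+": 12}

def draw_panel(pool, targets, seed=42):
    """Draw a stratified random sample matching the target counts per group.

    A fixed seed makes the draw reproducible and therefore auditable.
    Assumes each group in the pool has at least as many candidates as
    its target count.
    """
    rng = random.Random(seed)
    panel = []
    for group, n in targets.items():
        candidates = [p for p in pool if p["age"] == group]
        panel.extend(rng.sample(candidates, n))
    return panel

panel = draw_panel(pool, targets)
print(len(panel))  # 36 members, matching the target composition
```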
2) Deliberative experience
Neutrality and inclusivity of facilitation
The facilitation ensured inclusiveness, equal access to speaking opportunities, and appropriate balance of small group and panel discussions throughout deliberation.
Sufficient consideration was given to ensuring that marginalised communities were heard. (For example, via supportive and mindful facilitation, creating a safe space for expression, and devising specific strategies to encourage participation by those who are not used to speaking in public or who may feel intimidated.)
Facilitation was neutral regarding the issue addressed.
Accessible, neutral, and transparent use of online tools
Any online tools used throughout the process were equally accessible to all members, with assistance, training, equipment, and an internet connection offered and/or provided. (For some members who are unfamiliar with the internet or online tools, it may be necessary to provide one-on-one support during the process.)
The design of the online tools used was neutral and transparent (for example, the algorithms or formulas used for preference or vote counting were explicit and clear, online tools ensured anonymity of members when needed, and the results calculated/aggregated using online tools were auditable).
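To illustrate what explicit and auditable counting can look like, here is a minimal Python sketch of a plurality tally that publishes its counting rule and intermediate counts alongside the result. The ballot values and the plurality rule are assumptions for the example, not a reference to any specific online tool.

```python
from collections import Counter

def tally_votes(ballots):
    """Tally one-choice ballots and return the result together with an
    audit trail that lets anyone re-verify the aggregation."""
    counts = Counter(ballots)  # explicit counting rule: plurality
    audit_trail = {
        "rule": "plurality (option with most votes wins; ties need an explicit rule)",
        "total_ballots": len(ballots),
        "counts": dict(counts),
    }
    winner, _ = counts.most_common(1)[0]
    return winner, audit_trail

# Hypothetical anonymised ballots on one recommendation
ballots = ["accept", "accept", "reject", "accept", "abstain"]
winner, audit = tally_votes(ballots)
print(winner)  # accept
print(audit)   # published alongside the result so it can be audited
```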
Breadth, diversity, clarity, and relevance of the evidence and stakeholders provided
Members were provided with a solid and accessible information base featuring a wide range of accurate, relevant, and clear evidence and expertise, sufficient for effective participation and for addressing the remit set.
The information base as a whole was neutral, with a breadth of diverse viewpoints represented. (Ensured, for example, through mapping all the arguments of the issue with stakeholders to see whether all relevant areas and viewpoints are reflected in the information base.)
The information base was accommodating to members with different learning styles and included materials in a variety of forms (written, video, in-person expert presentations etc.).
There was a wide range of stakeholder views. (This could include an element of public submission.)
The selection of sources was transparent, revealing the curator and the basis for selecting the content. People in charge of preparing the information base had declared any potential conflict of interest.
Members had the opportunity to submit evidence for consideration and to request additional information.
Quality of judgement
There was consideration of conflicting values and structural issues underlying the question at hand.
There was an emphasis on diversity of viewpoints, weighing of alternatives and trade-offs, exploring uncertainties, and exposing assumptions.
Members provided justifications for their viewpoints.
Members approached the process with open-mindedness.
Members considered and integrated a range of evidence in their judgements.
Perceived knowledge gains by members
Members exercised and gained empathy by developing mutual understanding and considering different views and experiences.
Members gained a clearer understanding of each other’s opinions.
After deliberation, members had a better understanding of both the policy issue and the public decision-making process in general.
The opinion of each member became clearer through deliberation and moved towards informed judgement.
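Opinion movement of this kind is often approximated by asking members the same attitude questions before and after deliberation. The sketch below is a minimal illustration only; the paired pre/post design, the 1-5 scale, and the coding of “don’t know” as None are assumptions, not part of these guidelines.

```python
def opinion_shift(pre, post):
    """Compare paired pre/post Likert answers (1-5, None = 'don't know').

    Returns the mean absolute shift among members who answered both
    times, and the drop in 'don't know' answers (a rough proxy for
    opinions becoming clearer)."""
    paired = [(a, b) for a, b in zip(pre, post) if a is not None and b is not None]
    mean_shift = sum(abs(b - a) for a, b in paired) / len(paired) if paired else 0.0
    dont_know_drop = sum(a is None for a in pre) - sum(b is None for b in post)
    return mean_shift, dont_know_drop

# Hypothetical answers from five members to one attitude question
pre  = [3, None, 2, 5, None]
post = [4, 4, 2, 4, 3]
print(opinion_shift(pre, post))  # (0.666..., 2)
```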
Accessibility and equality of opportunity to speak
All members had equal speaking opportunities, opportunity to influence the discussions, and equal access to any necessary support, tools, or resources during the process.
Members had the opportunity to provide ongoing feedback and suggest modifications of the process (such as asking for more time or reporting experienced bias).
Respect and mutual comprehension
Interactions amongst members were respectful.
There was careful and active listening, as well as interactive deliberation that allowed members to weigh each other's views.
All members felt heard in the process.
Free decision making and response
The implementation of the process was free of interference beyond set roles and processes (e.g. intrusions by experts or steering group members).
The final recommendations represent what the members actually think (i.e. members had a final say over the wording of the recommendations).
The final decision making was non-coercive, using democratic decision-making rules (e.g. consensus, majority rule, ranking).
The report fully reflects the judgement of the group, including views that were not supported by the majority. Members were free and supported to contribute a minority report, which appears in an appendix to the main report.
Respect for members’ privacy
Members’ privacy was protected. For more details, see Annex B – Principle 10: Privacy.
There was no undesired attention or attempt at interference from the media, stakeholders, or other actors.
3) Pathways to impact
Influential recommendations
The commissioners of the process identified and pursued a set of plausible pathways to immediate policy impact.
The impact (influential conclusions and/or actions) of the deliberative process corresponds to the mandate it was given.
The report of the deliberative process was released publicly.
Efforts were made to disseminate the report widely.
The members’ recommendations had an opportunity to influence opinions and decisions made by a commissioning body, other public institutions, or the broader public.
Response and follow-up
The government or equivalent commissioning body responded to members of the deliberative process and/or to the general public. (Ideally, such a body would accept the recommendations or provide a public justification for why not.)
The implementation of all accepted recommendations was monitored with regular public progress reports.
Member aftercare
The members of the deliberative process were provided information on how to follow the uptake of their recommendations and further engage in the policy-making process.
The members had the necessary support to speak about their experiences and recommendations to their communities or the broader public.
Communication channels were established for members to maintain their connection amongst themselves after the deliberative process.
2.4 Measuring the evaluation criteria
There are different ways for evaluators to assess how well a deliberative process meets the criteria outlined above. It is important to balance subjective measures (such as surveys and interviews) with more objective ones (such as document review) to help maintain objectivity. This section covers possible approaches and methods for measuring the evaluation criteria at each of the three steps of the evaluation cycle.
Table 2.2. Overview of the applicability of measurement methods for assessing evaluation criteria
| Step | Criteria | Member survey | Public survey | Organiser or expert witness survey | Document review | Deliberation observation | Open-ended interviews | Media coverage review | Policy analysis |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Process design integrity | Clear and suitable purpose | X | X | X | X |  |  |  |  |
|  | Clear and unbiased framing | X | X | X | X |  |  |  |  |
|  | Suitable design | X | X | X |  |  |  |  |  |
|  | Procedural design involvement | X | X | X | X |  |  |  |  |
|  | Transparency and governance | X | X | X | X | X | X |  |  |
|  | Representativeness and inclusiveness | X | X | X | X |  |  |  |  |
| Deliberative experience | Neutrality, inclusivity, and balance of facilitation | X | X | X | X | X |  |  |  |
|  | Accessible, neutral, and transparent use of online tools | X | X | X | X | X |  |  |  |
|  | Breadth, diversity, clarity, and relevance of the evidence and stakeholders | X | X | X | X |  |  |  |  |
|  | Quality of judgement | X | X | X | X |  |  |  |  |
|  | Perceived knowledge gains by members | X | X | X | X |  |  |  |  |
|  | Accessibility and equality of opportunity to speak | X | X | X |  |  |  |  |  |
|  | Respect and mutual comprehension | X | X | X | X |  |  |  |  |
|  | Free decision making and response | X | X | X | X | X |  |  |  |
|  | Respect for members’ privacy | X | X | X | X | X |  |  |  |
| Pathways to impact | Influential recommendations | X | X | X |  |  |  |  |  |
|  | Response and follow-up | X | X | X | X | X | X |  |  |
|  | Member aftercare | X | X | X |  |  |  |  |  |
Source: Author's own creation
Member survey
Surveying the members of a deliberative process with a standard evaluation survey is a recommended minimum measurement method to evaluate some elements of process design integrity and all elements of the deliberative experience. Annex C contains an evaluation questionnaire that should be used to elicit relevant information from members at the end of the process. It is important to ensure that members of the deliberative process are able to answer the questionnaire confidentially to ensure honesty and openness.
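As an illustration of how such survey data might be summarised while preserving confidentiality, here is a minimal Python sketch that averages anonymised answers per criterion. The criterion keys and the 1-5 scale are hypothetical and do not reproduce the Annex C questionnaire.

```python
from statistics import mean

# Hypothetical anonymised responses: one dict per member, keyed by
# criterion, with answers on a 1-5 agreement scale. The criterion
# names are illustrative, not the Annex C questionnaire wording.
responses = [
    {"equal_opportunity_to_speak": 5, "respect": 4, "felt_heard": 4},
    {"equal_opportunity_to_speak": 4, "respect": 5, "felt_heard": 3},
    {"equal_opportunity_to_speak": 5, "respect": 5, "felt_heard": 5},
]

def summarise(responses):
    """Return the mean score per criterion; only aggregates are
    reported, which helps preserve members' confidentiality."""
    criteria = responses[0].keys()
    return {c: round(mean(r[c] for r in responses), 2) for c in criteria}

print(summarise(responses))
# {'equal_opportunity_to_speak': 4.67, 'respect': 4.67, 'felt_heard': 4.0}
```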
Public survey
A public survey can also be helpful for evaluating some elements of impact, such as the government response or awareness of the deliberative process amongst the broader public. It can also help assess elements of process design integrity, such as the clarity of purpose and framing, and transparency.
Organiser or expert witness survey
The organiser survey can complement the answers provided by members of the deliberative process and offer the perspective of people who have organised deliberative processes before and can compare the process with their prior experience. As process organisers and facilitators, they also interact with the commissioning body and have insight not only into the deliberative experience but also into the process design. Annex D contains an evaluation questionnaire that can be used to elicit relevant information from organisers. These questions can be used as a survey, or as guidance for a collective self-evaluation and reflection session. If expert witnesses are present in the process, most of these questions can also be put to them.
Document review
Document review is a method that helps to gather objective information about a deliberative process and can be put to good use to help validate some of the more perception-based information sources, such as surveys and interviews. Evaluating process design integrity relies heavily on document review: for example, an evaluator can assess the representativeness of the panel by comparing member demographics with census data, transparency by verifying the public availability of key documents, and the suitability of the design by examining the timeline, the agenda, and the various stages of the process. One such comparison is sketched below.
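A minimal sketch of the census comparison, assuming invented demographic shares and an arbitrary 5-percentage-point tolerance:

```python
# Hypothetical shares (proportions) of a demographic attribute among
# panel members versus the census benchmark for the same population.
panel  = {"18-34": 0.25, "35-54": 0.40, "55+": 0.35}
census = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

def representativeness_gaps(panel, census, tolerance=0.05):
    """Return the panel-vs-census gap per group and flag any group
    whose share deviates by more than the chosen tolerance."""
    gaps = {g: round(panel[g] - census[g], 3) for g in census}
    flagged = [g for g, gap in gaps.items() if abs(gap) > tolerance]
    return gaps, flagged

print(representativeness_gaps(panel, census))
# ({'18-34': -0.05, '35-54': 0.05, '55+': 0.0}, [])
```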
Document review can also cover some elements of the deliberative experience, such as the evidence base presented to the members (to evaluate its breadth and diversity), any online tools used, or the final set of member recommendations. Transcripts of the deliberation can also be reviewed to evaluate how much speaking time members of different groups take, as sketched below.
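A minimal sketch of such a transcript review, assuming turns have already been attributed to speakers and timed (the transcript format, speaker IDs, and group labels are all illustrative):

```python
from collections import defaultdict

# Hypothetical transcript entries: (speaker_id, seconds spoken), plus a
# mapping from speaker to demographic group; real transcripts would
# need to be parsed into this form first.
turns = [("m1", 120), ("m2", 45), ("m1", 60), ("m3", 90), ("m2", 30)]
group_of = {"m1": "men", "m2": "women", "m3": "women"}

def speaking_time_shares(turns, group_of):
    """Return each group's share of total speaking time, which can then
    be compared with the group's share of panel members."""
    totals = defaultdict(int)
    for speaker, seconds in turns:
        totals[group_of[speaker]] += seconds
    grand_total = sum(totals.values())
    return {g: round(t / grand_total, 2) for g, t in totals.items()}

print(speaking_time_shares(turns, group_of))
# {'men': 0.52, 'women': 0.48}
```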
Document review is also crucial for evaluating impact – such as assessing the commissioning authority’s response and follow-up to the members.
Deliberation observation
Observation is essential for evaluating the deliberative experience of the process. By having access to the sessions, evaluators can form a judgement of how well each criterion was met. This is especially important for criteria such as equal opportunity to speak, respect amongst the members, and quality of judgement.
However, it is important to ensure that there are not too many observers, especially during small group discussions, and that they do not interfere in any way with the process.
Open-ended interviews
Qualitative interviews with representatives of the commissioning authority, relevant stakeholders, policy makers, expert witnesses, journalists, or members are another useful method to complement evaluation efforts. Interviews with policy makers can help identify the extent to which members’ recommendations were influential. Interviews with stakeholders can shed light on the openness and transparency of the deliberative process design. Interviewing the commissioning body can help identify the motivations behind initiating a deliberative process. Interviews with a selection of members can provide additional insights alongside the quantitative survey results.
Media coverage review
Reviewing media coverage of the deliberative process, along with coverage of the policy issue addressed, can be useful in evaluating the extent of the process’s influence on both public decision making and the broader public. Changes in the discourse around the policy issue, in how it is framed, and in its prominence in traditional and social media can indicate shifts in public perception and provide details on the government’s response to the recommendations. Media coverage review is most useful for evaluating pathways to impact.
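One simple, partial way to quantify such a review is to count coverage of the policy issue before and after the process. The article dates below are invented; real coverage data would come from a media database or archive.

```python
from datetime import date

# Hypothetical articles mentioning the policy issue, with publication
# dates; in this invented example, the process ended on 30 June 2023.
articles = [date(2023, 5, 2), date(2023, 5, 20), date(2023, 7, 1),
            date(2023, 7, 8), date(2023, 7, 15)]
process_end = date(2023, 6, 30)

def coverage_before_after(article_dates, cutoff):
    """Count articles published before vs after the process ended,
    a crude signal of whether the process raised the issue's profile."""
    before = sum(d <= cutoff for d in article_dates)
    after = len(article_dates) - before
    return {"before": before, "after": after}

print(coverage_before_after(articles, process_end))  # {'before': 2, 'after': 3}
```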
Policy analysis
Policy analysis also helps in evaluating pathways to impact. It can sometimes be difficult to attribute policy changes to deliberative processes, but at other times the links are clear. Identifying these links can help highlight the value of the deliberative process to public decision making. Policy analysis can include document review and interviews with stakeholders and policy makers. It can also be comparative, looking at relevant changes in policy, legislation, and/or institutional structures before and after a deliberative process takes place.