Claudia Chwalisz
Ieva Česnulaitytė
This chapter relies on the evidence gathered for this report and the wider academic literature to assess the elements that contribute to what could be considered a ‘successful’ representative deliberative process. The framework for analysis has four principles of evaluation:
1) design integrity: the procedural criteria which ensure that a process is perceived as fair by the public and in line with principles of good practice;
2) sound deliberation: the elements that enable quality deliberation that results in public judgement;
3) influential recommendations and actions: the evidence of impact on public decision making, and
4) impact on the wider public: the secondary and long-term effects on efficacy and public attitudes.
Through this analysis, this chapter considers the key benefits and challenges of deliberative processes for public decision making.
How a representative deliberative process is designed and run, and the impact that it has on policy and the wider public are all questions that arise when determining whether it has been a success. For instance, if participants have been chosen through a fair and transparent recruitment process, then the wider public is more likely to see someone like them as part of it and trust the outcome (newDemocracy Foundation and UN Democracy Fund, 2019). If there is enough time devoted to learning and participants hear from a wide range of perspectives, then they are able to formulate more than just opinions; they develop informed policy recommendations. If the process has been well-communicated to the public before, during, and after, then it is arguable that public decision makers are more likely to be held accountable to respond to and implement a resulting public decision. If participants have heard from a wide range of experts and stakeholders and this information is transparently available to the wider public, then it is more likely that the public will be aware of and trust the recommendations of the deliberative process (Suiter, 2018). This chapter tests whether these assumptions are substantiated by the international data.
Drawing on the new empirical comparative research collected by the OECD for this report and wider theoretical research on deliberation, this chapter seeks to assess the different approaches and designs of deliberative processes. Nabatchi et al. (2012) have outlined evaluation principles for the practice and impact of deliberative civic engagement, covering four aspects. The OECD draws inspiration from this framework for analysis in this chapter, adapted to the specific focus on representative deliberative processes and the type of data collection that was feasible for this report (see Figure 4.1):
1. Design integrity: the procedural criteria which ensure that a process is perceived as fair by the public and in line with principles of good practice;
2. Sound deliberation: the elements that enable quality deliberation that results in participants’ arriving at sound public judgement;
3. Influential recommendations and actions: the evidence of impact on public decision making, and
4. Impact on the wider public: the secondary and long-term effects on efficacy and public attitudes.
Overall, the evidence shows that:
Random selection attempts to overcome the shortcomings and distortions of “open” and “closed” calls for participation. It ensures that nearly every person has an equal chance of being invited to participate and that the final group is a microcosm of society. It can also insulate the process from an overwhelming influence of vested interests. While it is not a statistically perfect method, it delivers a more mixed and diverse sample than any other recruitment process.
The most popular random participant selection method for representative deliberative processes to date has been two-stage selection (59%), commonly called a “civic lottery”. This method has mostly been used in Germany, Australia, Canada, and the United States (US), although there are also a handful of examples from other countries.
When stratifying the final sample of citizens, all deliberative processes select participants according to demographic selection criteria that match the general makeup of the wider population (such as that available in a census). These usually include at least four criteria: gender; age; geography; and socioeconomic factors (a variable that captures disparity in income and education levels).
While demographic stratification is enough to ensure diversity and representativeness, in some circumstances it may not be enough to ensure credibility, requiring discursive or attitudinal representation as well.
Participants are compensated in one way or another (either through remuneration or costs being covered) 57% of the time.
Time is one of the factors that distinguishes representative deliberative processes from most other types of stakeholder and citizen participation. Half (48%) of the cases with known duration of preparation required 12 weeks or more of preparation before the first participant meeting took place and almost all (98%) of these cases involved a minimum of five weeks of preparation.
While the minimum timeframe required to be included in this report was one full day of face-to-face deliberation between participants, the average duration was 3.7 full meeting days, spread out over the course of 6.6 weeks.
The average duration varies greatly depending on the model of deliberative process. The most common model of the Citizens’ Jury/Panel lasts for 4.1 days over five weeks on average.
Having strong political and/or institutional commitment is important for giving the process credibility and motivating people to invest their time by participating. Evidence suggests that the commitment of public decision makers is one of the key factors for why response rates are high and dropout rates are low amongst participants in representative deliberative processes for public decision making.
The learning stage tends to include: inviting issue experts to give presentations and answer questions at the meetings (79%); providing introductory reading material before the first meeting (48%); learning sessions, such as field trips (43%); the right for participants to request information and invite speakers, stakeholders, and experts (35%); and providing participants with clear and extensive reading material in between meetings (31%).
There are two key aspects of information sources: 1) diversity of information and 2) importance of giving citizens control. The independent team responsible for designing and organising the deliberative process chooses the experts and informational material. Having a wide breadth ensures that participants encounter and consider different points of view. The type of information provided also matters in terms of public perceptions of fairness (i.e. this cannot be government brochures highlighting their successes or arguing for certain solutions). Allowing citizens to ask for information is therefore a crucial aspect of winning public trust in the process.
A key difference between representative deliberative processes and other forms of citizen participation is that the outcome is not many individual views, but a collective and considered view. Citizens are tasked with finding common ground on the recommendations they provide to public decision makers.
At the end of a deliberative process, the citizens’ recommendations are delivered to the commissioning public authorities. Participants sometimes accept or amend the proposals of the experts from whom they hear, particularly when it comes to more technical proposals. The good practice principle is that the participants should have control of the recommendations.
Once the final recommendations are delivered to the public authority, it is their responsibility to act. In a representative democracy, the authority is not obliged to accept all recommendations. There is, however, a responsibility to respond and to explain the rationale for accepting or rejecting any proposals.
In two-thirds (66%) of examples, the public authority discussed the final recommendations face-to-face with participants. In four in ten (42%) cases, the public authority communicated a public response through government channels (such as a website, social media) and traditional media (newspapers, radio), but it did not take place in person with the participants. In one quarter (24%) of the cases, the commissioning authority followed up directly with the participants to let them know about the response to their recommendations, in addition to the public response.
The OECD tried to collect as much data as possible about the implementation of commitments made based on citizens’ recommendations. There was data available for 55 cases. In three quarters (76%) of these cases, the public authorities implemented over half of the recommendations. In over a third (36%) of these cases, they implemented all of them. Only in six (11%) of these 55 cases were none of the recommendations implemented.
The limited impact data suggest that when presented with informed and considered proposals, public authorities are likely to act on them, as they include sensible recommendations that can lead to more effective public policies. More data is needed for this to be a robust conclusion, but it sheds some preliminary light on an issue that is much discussed and of great importance.
The most common method of evaluating representative deliberative processes (67%) has been an anonymous survey of participants. Seventeen per cent have had an academic analysis, and only seven per cent have had an independent evaluation, usually by a private consulting company or a non-governmental organisation with expertise in citizen participation.
With effective public communication, a deliberative process can be a mechanism for the broader public to learn about an issue as well as encourage it to participate more in public life in general.
Empirical research has also shown that strong public communication about representative deliberative processes can be a tool to help counteract disinformation and polarisation related to the issue being addressed in the process.
Representative deliberative processes are not typically used in isolation; rather, they are a central part of a wider strategy of citizen participation around a specific policy issue. The most common types of stakeholder participation used in conjunction with deliberative processes are online calls for proposals/submissions (used in 33 cases), surveys (29 cases), public consultations (19 cases) and roundtable discussions (16 cases).
Design integrity refers to the rigour and fairness of how the representative deliberative process is organised to ensure that it stands up to public scrutiny, is trusted by the public, and is in line with good practice principles (see Chapter 5). The elements discussed in this report are: scope of the remit; random selection methods; duration; and commitment by decision makers.
The very first stage involves identifying the problem and deciding whether a representative deliberative process is the right type of process to help address the issue (see Chapters 1 and 2 for guidance on identifying whether this is the case). If that is the case, setting out the scope and defining clearly the task at hand are important for ensuring that the process is worthwhile in the eyes of both the organisers and participants. Deliberative processes are a tool for public problem-solving and “good problem-solving requires having a clearly defined problem” (MASS LBP Reference Panel Playbook).
Considering the right question for a deliberative process should only take place after the public problem or dilemma has been clearly identified. The Kettering Foundation (2015) provides a useful list of considerations for the appropriateness of an issue before beginning a process of public deliberation (Box 4.1).
An issue IS appropriate for public deliberation if:
Broad concern exists within a community;
Choices must be made, but there are no clear “right” answers;
A range of people and groups must act in order for the community to effectively move forward;
Additional perspectives and ideas may help the community to move forward;
Citizens have not had the opportunity to consider the different courses of action and their long-term consequences; and
The decision-making of officeholders and other leaders needs to be informed by public judgment, as well as experts’ views.
An issue is NOT appropriate for public deliberation if it:
Is solely technical and requires a technical solution;
Needs only a “yes” or “no” answer;
Has a specific solution that has already been decided, and the public’s role would only be seen as a “rubber stamp”;
Requires an immediate response;
Is relevant only to a narrow interest group; or
Is one for which your group has a particular approach to advocate.
Source: Pratt, Julie (2005), A Guidebook for Issue Framing, Kettering Foundation.
Once the appropriate issue is identified, it is then necessary to frame the question for the deliberative process in order to define its remit. There is a fine line between a remit that is too broad and one that is too narrow. It should be sufficiently broad as to allow for numerous recommendations to be possible, but should be narrow enough to avoid the group side-tracking into irrelevant discussions (Carson, 2018; MASS LBP Reference Panel Playbook; Gerwin, 2018). Moreover, it is important that the question encapsulates the trade-offs or constraints involved. In order to avoid confusion and ambiguity, using simple and clear language is advisable.
Setting the remit is a crucial step as it is one of the key distinctions of a deliberative process. It is not merely a consultation exercise, where people are asked for feedback or input. In a deliberative process, participants have a mandate to address genuine challenges and provide practical recommendations. The newDemocracy Foundation has provided some dos and don’ts when it comes to defining the remit, listed in Box 4.2.
Start with a question, not merely a subject description.
Ensure that it is a neat fit for what the decision maker will ultimately decide.
Aim for brevity and clarity.
Make it neither too broad nor too narrow.
Do not lead the participants toward a pre-determined answer or even give the unintended impression that you are.
Sometimes it will be useful to precede or follow a question with an explanatory statement.
Embed the trade-offs in either the question or supporting statement.
Test your remit with someone outside the organising group – check that it makes perfect sense to an everyday citizen.
Share the problem/dilemma; don’t sell a solution.
Don’t frame a question that can be answered with either ‘yes’ or ‘no’.
Avoid compound questions (two questions in one). Keep each question separate.
Avoid words like ‘should’ or have a good reason for using them.
Source: Carson, Lyn (2018), “Framing the Remit”, newDemocracy Research and Development Note, available at: https://newdemocracy.com.au/wp-content/uploads/2018/07/docs_researchnotes_2018_July_RampD_Note_-_Framing_the_Remit.pdf
Finally, a decision needs to be made about whether the recommendations produced by the participants in the deliberative process should be advisory or binding. In the vast majority of cases in this report, the remit is advisory. This seems to be in line with citizens’ preferences for the use of deliberative processes to be advisory (see Chwalisz, 2015 and Goldberg, 2020), as well as the normative theory of deliberative democracy, which suggests such processes should be advisory to decision makers and a conversation-starter with the broader public.
Even with advisory processes, there are nonetheless varying levels of commitment possible. It can be a legal obligation for public authorities to respond (publicly or not), which does not necessarily mean there is a commitment to accept all recommendations. Or it can be a prior political commitment from public authorities to respond to or take into account the recommendations.
In a democratic system, there is a question about the legitimacy of randomly selected participants producing recommendations that carry an obligation to be enacted. Ultimately it is the role of decision makers to accept accountability for their decisions. However, determining what will be done with the recommendations is a key consideration at the stage of setting the remit. Participants devote a significant amount of time and effort to learn, deliberate, reach consensus, and write a report. As such, they will want to know that their time is valued and receive assurance that their recommendations will be taken seriously. A commitment to heed and respond to the recommendations is therefore important. MASS LBP has outlined a set of questions to consider about planning the response, detailed in Box 4.3.
How will the recommendations fit within the existing policy development process?
Who will respond to the recommendations? What form will this take?
Will officials publicly receive the participants’ report?
How will officials communicate their decisions and progress on their implementation to the participants?
How will the hard work of the participants be recognised and celebrated?
Source: Adapted from MASS LBP, “The Reference Panel Playbook”, available at: https://www.masslbp.com/the-reference-panel-playbook.
In many traditional consultation processes, there is often an “open call” to recruit participants, either to an in-person meeting or to participate in an online consultation or forum. Participation is usually encouraged by advertising the opportunity via a variety of channels (online, social media, post, posters). Participation is open, so anyone who wishes to take part can attend in person or contribute online. In other instances, participants may be chosen by an institution through an application or selection process, such as before a committee hearing. There is a wealth of research demonstrating that certain demographics tend to participate disproportionately, notably those who are older, male, well-educated, affluent, white, and urban (Dalton, 2008; Olsen, Ruth and Galloway, 2018; Smith, Schlozman, Verba and Brady, 2009).
Public authorities may also conduct consultations through a “closed call” for participants, meaning that politicians and/or civil servants might choose specific members of a community who have a particular expertise or experience needed to address a policy issue. In these instances, participation could be based on merit, experience, affiliation with an interest group, or because of their role in the community (see MASS LBP, 2017).
Both open and closed calls result in non-representative samples of the community, meaning a group that does not mirror the wider population in terms of gender, age, socio-economic status, and other criteria. These processes also tend to be dominated by stakeholders and advocacy groups who are most affected by a decision and potentially have the most to lose (newDemocracy Foundation and UN Democracy Fund, 2019: 49). Depending on their purpose, these processes may thus result in outcomes that are not perceived as fair or legitimate, since not everyone has an equal opportunity to be selected.
As provision 8 of the OECD Recommendation on Open Government (2017) emphasises, public authorities should “grant all stakeholders equal and fair opportunities to be informed and consulted and actively engage[d]”; representativeness and inclusiveness were therefore central to the processes studied for this research. For these reasons, all of the deliberative processes in this report recruit participants through different random selection (sortition) methods, often called a civic lottery (MASS LBP, 2017; newDemocracy Foundation and UN Democracy Fund, 2019).
Random selection attempts to overcome the shortcomings and distortions of “open” and “closed” calls for participation described earlier. It ensures that nearly every person has an equal chance of being invited to participate in a deliberative process and that the final group is a microcosm of society. It can also insulate the process from an overwhelming influence of partisan, wealthy, or special interests (MASS LBP, 2017).
While it is not a statistically perfect method – which is why the result is a sample that one can say is broadly representative of the wider population – it delivers a more mixed and diverse sample than any other recruitment process. This is particularly true in the context of its use for deliberative processes for public decision making. Receiving an invitation to participate from someone with authority (like a minister or mayor) encourages people who have never voted, never attended a town hall meeting, or never participated in an online consultation to consider the opportunity seriously. This brings new voices into the room that are often under-represented in “open” and “closed call” processes.
Representative processes are thus able to garner greater legitimacy and ensure a diversity of participants that are not achievable to the same extent through other recruitment mechanisms. Research suggests that non-participants’ legitimacy perceptions increase when deliberative forums are maximally representative and inclusive (Goldberg, 2020).
Diverse groups also result in better outcomes. Greater cognitive diversity leads to better decisions than those made by more homogeneous groups (e.g. groups of experts), since the latter tend to have access to similar types of information and are more likely to reinforce one another’s views than to introduce completely new ideas (Landemore, 2012; Page, 2007).
Participants randomly selected to be broadly representative are also more likely to win citizens’ trust: people trust random draws in other situations, such as the jury system in many countries or draws in sporting events and competitions, because cheating is very difficult (newDemocracy Foundation and UN Democracy Fund, 2019). Moreover, people are more likely to trust a process where they see ordinary people, reflecting all parts of society, engaging in the complex trade-offs required for public decision making.
Finally, it is important to emphasise that stakeholders and experts play a key role in deliberative processes. They are offered an opportunity to make their case and have a fair hearing by a randomly selected group of participants who are broadly representative of the wider population. As a result, such processes can empower elected representatives and civil servants to put forward solutions to complex public problems that have received citizen input, informed by stakeholders and experts. It complements their role in representative democratic institutions to improve the democratic process more broadly.
The principle of random selection can be operationalised in various ways (Figure 4.2), each with advantages and disadvantages to be acknowledged. The participants for the cases in this report have been recruited in one of four ways: two-stage random selection (59%); single-stage random selection (22%); a mix of random and targeted selection of hard-to-reach groups (4%), and three-stage random selection (less than 1%). In fifteen per cent of cases, notably the oldest ones, the details of the random selection process were not available, but a general description of random selection in the reports about these cases confirmed that one of the methods described in this chapter was employed.
The most popular random participant selection method for deliberative processes to date has been two-stage selection. It means there is randomisation at multiple stages of the participant recruitment and selection process. This method has predominantly been used in Germany, Australia, Canada, and the US, although there are also a handful of examples from other countries (Figure 4.3).
In Germany, the Nexus Institute has been running Planning Cells that also use a two-stage methodology, although it differs in many ways from the approach in the other three countries. In Australia, the non-profit research and development organisation newDemocracy Foundation was set up in 2007 and has been running Citizens’ Juries that employ a civic lottery method very similar to the one developed in Canada by the democracy organisation MASS LBP during the same period (the two organisations did not know about one another for many years). The participants for a Citizens’ Initiative Review in the United States are also selected through a civic lottery.
In a two-stage random selection process, the first stage involves sending a large number of invitations to randomly selected individuals or households. This entails first deciding on four criteria:
1. the population that will be represented through the civic lottery;
2. the number of individuals to be selected;
3. the stratification criteria – meaning the demographic criteria that will be used to ensure the selected group broadly represents the wider community (e.g. gender, age, geography), as well as attitudinal criteria if appropriate for the context, and
4. the method for inviting that set number of randomly chosen individuals from within that population to participate (MASS LBP, 2017: 9).
Depending on the size of the wider population (i.e. if it is a small town, a big city, a region, or a state), the size of the initial round of random invitations varies. For small populations, usually there are at least 2,000 people initially contacted; for national-level processes, a first round of random invitations can go out to around 30,000 depending on the population size. One of the differentiators between the civic lottery method (used mostly in Australia, Canada, and the US, among other places) and the approach in Germany is that in the former the initial selection pool is much larger (usually at least 10,000).
According to interviews with practitioners in different countries, the number of people to contact to have the desired number of participants depends on the anticipated response rate. This will vary depending on the level of government (due to the size of the population affected), issue salience, level of commitment required from participants, and other contextual factors. Response rates vary due to these factors, plus other aspects such as mode of invitation (i.e. by post, telephone, online), invitation wording, who sends the invitation (i.e. whether it is from someone with authority, like a mayor or a minister), and other design elements. The larger the overall population and the lower the anticipated response rate, the larger the initial invitation pool should be.
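To make this pool-sizing logic concrete, the short sketch below (in Python) shows one possible back-of-the-envelope calculation. It is illustrative only: the function, response rate, and safety margin are assumptions, not figures drawn from the cases studied for this report.

```python
# Illustrative sketch (not from the OECD data): estimating how many invitations
# to send, given the desired number of volunteers and an anticipated response rate.
import math

def invitation_pool_size(desired_volunteers: int,
                         anticipated_response_rate: float,
                         safety_margin: float = 1.5) -> int:
    """Rough number of invitations to send in the first stage.

    desired_volunteers: positive responses needed so that the second-stage
        stratified draw has enough people in every demographic category.
    anticipated_response_rate: share of invitees expected to volunteer
        (e.g. 0.03 for 3%), which varies with the level of government,
        issue salience, mode of invitation, and who signs the letter.
    safety_margin: buffer so that smaller strata are not left empty.
    """
    return math.ceil(desired_volunteers / anticipated_response_rate * safety_margin)

# Example: a city-level process aiming for roughly 400 volunteers at a 3% response rate
print(invitation_pool_size(400, 0.03))  # about 20 000 invitations
```

In practice, organisers also check that the expected pool of volunteers is large enough to fill every stratification cell used in the second-stage draw.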
Sometimes email invitations are used, but in those cases, they are usually complemented by post or phone invitations to help ensure that older age groups are reached. The trade-offs of different methods for distributing invitations are discussed in the following section.
The convenors will need to have a universal contact list, which can vary from the electoral register (in places where registration is compulsory or automatic) to the national post database, registry of landline and mobile numbers, or other similar resource. In many places, a universal contact list is not available, or not always available to the organisers of deliberative processes due to data privacy rules. Many data sources thus miss part of the population, so it is important to acknowledge this shortcoming or to combine sources. The principle should be to ensure that the largest number of people can be eligible to receive an invitation in the first place.
The invitation typically contains an introduction to the process, an information sheet, and a response form and envelope if by post (or a phone number or a link to an online registration form). Based on their experience of having conducted over 30 civic lotteries, MASS LBP (2017) has identified seven important pieces of information that the invitation should contain:
1. An introduction to the convening public institution;
2. An introduction to the problems or issues;
3. An introduction to the selection and engagement process;
4. An outline of the rules and exclusions of the selection process;
5. An introduction to the specific issue to be addressed;
6. The request to volunteer, which includes: volunteer dates; deadlines; methods of registration; and other information pertaining to the process; and
7. An outline of the responsibilities of volunteers if selected by the lottery (MASS LBP, 2017: 20-21).
In the case of the 2004 British Columbia Citizens’ Assembly on Electoral Reform, as it was the first time that a process of such scale was undertaken, those who were interested in participating after receiving an initial invitation in the post were invited to a meeting where they learned more about the initiative before confirming their interest. This is not a common practice, however.
The second stage of the civic lottery relates to the stratification by demographic criteria of all the individuals who volunteer to participate in the deliberative process. Stratification criteria are essential for bringing together a group of citizens that broadly mirrors the composition of society. From the individuals who volunteer, a second random draw is made, this time using the stratification criteria, to compose the final sample. In most cases, there are four standard variables of stratification:
1. age;
2. gender;
3. geographic locality, and
4. a demographic indicator that ensures a mix of income and education levels (this will vary depending on the context) (newDemocracy Foundation and UN Democracy Fund, 2019; Gerwin, 2018).
For more technical details about how to run a two-stage random selection process, please see the newDemocracy Foundation and UN Democracy Fund handbook on democracy beyond elections (2019), MASS LBP’s guide on how to run a civic lottery (2017), and Marcin Gerwin’s guide to organising Citizens’ Assemblies (2018).
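As a complement to those guides, the sketch below illustrates, in simplified Python, how a second-stage draw could fill demographic quotas at random from the pool of volunteers. The record fields, quota cells, and function are hypothetical assumptions; actual civic lotteries typically balance more criteria at once (geography, socioeconomic indicators, and sometimes attitudes) and need rules for handling under-represented cells.

```python
# Illustrative sketch of a second-stage civic lottery draw: volunteers are grouped
# into quota cells and a random sample is drawn from each cell. The fields and
# quotas below are hypothetical, not taken from any case in this report.
import random
from collections import defaultdict

volunteers = [
    # each record would come from the returned response forms
    {"id": 1, "gender": "female", "age_group": "18-34"},
    {"id": 2, "gender": "male", "age_group": "35-54"},
    {"id": 3, "gender": "female", "age_group": "55+"},
    # ... many more volunteers in a real process
]

# Number of seats per (gender, age group) cell, derived from census proportions
quotas = {
    ("female", "18-34"): 1,
    ("male", "35-54"): 1,
    ("female", "55+"): 1,
}

def civic_lottery_draw(volunteers, quotas, seed=None):
    """Randomly fill each quota cell from the volunteers belonging to that cell."""
    rng = random.Random(seed)
    pools = defaultdict(list)
    for v in volunteers:
        pools[(v["gender"], v["age_group"])].append(v)
    panel = []
    for cell, seats in quotas.items():
        candidates = pools.get(cell, [])
        if len(candidates) < seats:
            raise ValueError(f"Not enough volunteers in cell {cell}")
        panel.extend(rng.sample(candidates, seats))
    return panel

print([p["id"] for p in civic_lottery_draw(volunteers, quotas, seed=42)])  # e.g. [1, 2, 3]
```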
While two-stage random selection – and notably the rigorous method of a civic lottery – has been employed most often, one in five cases (22%) have relied on single-stage random selection. Geographically, there is a wide spread of where this approach has been used. It has been used more for certain models than others, however. Many of the Citizens’ Juries/Panels convened in France, Spain, and the United Kingdom (UK), Citizens’ Dialogues in France, Consensus Conferences outside of Denmark, Deliberative Polls, G1000s, and the Citizens’ Assemblies in Ireland have relied on single-stage random selection to recruit and select participants (Figure 4.4).
What this most often entails in practice is that a polling company is commissioned to recruit a stratified random sample (based on the same process described in the previous section of identifying the key demographic criteria that the final sample needs to match).
In very few cases, especially international examples like World Wide Views (described in Chapter 2), a mix of random and targeted selection to reach vulnerable groups is used. Typically, the vast majority of participants are randomly selected and a smaller proportion are targeted. Targeting a specific group can be useful when the issue relates strongly to a specific segment of society, although it should be considered with caution. This is not a common practice, as the types of issues that deliberative processes are well-suited to address (values-based dilemmas, complex policy problems that require trade-offs, and long-term issues) are ones that affect the entire population. Targeting certain demographics occurs during the random sampling phase to increase their response rate, rather than over-representing certain demographics within the group itself, which can distort the dynamic.
There was one example where a three-stage random selection process was used to recruit and select the participants: the 2017 Yarra Valley Water Citizens’ Jury in Australia. In this Jury, citizens were tasked with providing recommendations to the public water authority regarding its five-year costed plan.
The three-stage random selection process entailed first sending out an electronic invitation to participate to a random sample of 240,000 of Yarra Valley Water’s customers whose email addresses were available. This database was sufficiently large (one third of their customers) to avoid any skew.
To avoid a digital skew, the second stage involved randomly drawing 10% of Yarra Valley Water’s overall database (not just digital subscribers) and sending an invitation by post to 5,000 randomly selected addresses. This made it possible to reach those with limited or no digital access.
Finally, the third stage involved randomly selecting a group of 35 participants for the Citizens’ Jury from the pool of expressions of interest, stratifying for gender, age, geography, and tenancy (owner or renter). More details about the deliberative process and the random selection process are available on the newDemocracy Foundation’s project webpage about the Yarra Valley Water Citizens’ Jury (2017).
Various databases can be used to carry out the random selection process depending on the country and the available access. Some examples include: the voters registry; the census (national population registry); the national survey database; the civil registration number register; the national post address register, and the Vote Compass (a voting advice application). Other tactics include random digit dialling, ensuring a mix of landline and mobile phone numbers.
Depending on the database used, there are risks of excluding residents who are not citizens, people without a permanent address, or people who are not registered to vote. Sometimes due to legislation or rules limiting access to certain types of databases to service providers, it is not possible to access a complete registry. It is important to consider the limitations of the database to be used and make active efforts to make the process as inclusive as possible.
When stratifying the final sample of citizens for a deliberative process, all deliberative processes select participants according to demographic selection criteria that match the general makeup of the wider population (such as that available in a census), usually including at least four criteria: gender; age; geography; and socioeconomic factors (a variable that captures disparity in income and education levels). This is done to ensure descriptive representation, meaning that the final group broadly mirrors the composition of society. The rationale is that, if conducted properly and rigorously, the random selection process will result in a group that reflects a wide diversity of perspectives on an issue, deriving from different life experiences and interests (Davies, Blackstock, and Rauschmayer, 2005).
There is an argument, however, that social characteristics are not necessarily strongly correlated to attitude, so a well-stratified demographic sample will not necessarily provide adequate diversity of viewpoints (Davies, Blackstock, and Rauschmayer, 2005). Some scholars have also advocated stratifying participants according to their discourse regarding the policy issue to be discussed – called discursive representation. A discourse can be understood as “a set of categories and concepts embodying specific assumptions, judgements, contentions, dispositions, and capabilities” (Dryzek and Niemeyer, 2008: 481). It is more than merely an opinion or perspective; arguably discourses have more solidity and can be measured and described (Dryzek and Niemeyer, 2008). It is a way to mitigate the problem of how to factor in political differences. Some argue that the aim in these cases is to ensure that the spectrum of understandings, interests, and values expressed in different discourses among participants in the deliberative process broadly reflects that of the wider population (Davies, Blackstock, and Rauschmayer, 2005; Parkinson, 2003). Others suggest that discursive representation can be helpful in situations where it is difficult to define the population (Dryzek and Niemeyer, 2008).
Recently, a debate has been ongoing amongst practitioners and academics about the need for discursive representation in deliberative processes on controversial topics, such as environmental issues. In practice, when criteria beyond demographics are included in recruitment approaches, they tend more often to be opinion or attitudinal data rather than a more holistic account of a discourse. For example, the recruitment of panellists for Toronto’s Climate Action Panel (2019) included one attitudinal question in addition to demographics: “Everyone needs to reduce their emissions that contribute to climate change, including myself”, with a four-point response scale: strongly agree / somewhat agree / somewhat disagree / strongly disagree (MASS LBP, 2019). In the case of the 2020 Climate Assembly UK, participants were also stratified based on their response to the following question: “How concerned, if at all, are you about climate change, sometimes referred to as ‘global warming’?” (Climate Assembly UK, 2020). On the other hand, in the case of the French Citizens’ Convention on Climate (2019-20), recruitment was based only on demographic representation (Gouvernement français, 2019). When attitudinal criteria are included, there is also an open question as to whether different discourses should be represented equally or in proportion to their presence within the population.
There is also a compelling argument, however, against discursive representation. One of the goals of public decision makers when convening deliberative processes is to reach recommendations that achieve public trust. When extra steps are taken to ‘correct the balance’, then public decision makers may be opening themselves to perceptions of manipulation to achieve a pre-ordained result. From a pragmatic perspective, when faced with a design choice, there is a case for erring on the side of the light touch. While motives may be well-intentioned, political realities and optics matter for the wider public to have confidence in the process, and thus its outcome. Overall, there is no one-size-fits-all approach and the decision to include information about opinion, attitudes, or discourses will vary depending on the purpose of the deliberative process and the context in which it is being convened. While demographic stratification is enough to ensure diversity and representativeness, in some circumstances it may not be enough to ensure credibility, requiring discursive or attitudinal representation as well.
Ensuring that all citizens have equal opportunities to participate is key to achieving inclusiveness and representativeness. The difficulty of this varies depending on the time commitment required and the salience of the policy issue and the interest it generates. People have other commitments, different levels of financial stability, and low trust in government institutions (as discussed in Chapter 1). Nevertheless, there are several ways to lower barriers to participation and achieve higher response rates.
Remunerating participants is one way to lower these barriers. Compensating participants for the time they spend in a deliberative process, especially for longer, more time-consuming processes such as Citizens’ Assemblies and Citizens’ Juries/Panels, makes it possible for citizens to afford to take time off from work or other duties and to cover the costs of childcare or elderly care. Often participants are remunerated at the average national wage or at the rate at which people are reimbursed for jury duty. However, the potential impact of remuneration on some participants’ social security benefits should be a consideration.
In the 172 cases for which there is data, participants are compensated in one way or another 57% of the time (Figure 4.5). In 44% of deliberative processes there is remuneration in the form of payment. In a small number of cases, transport costs are compensated (7%) or expenses are covered (6%). There is no remuneration in 43% of deliberative processes. The majority of these latter instances are at the local level, where arguably costs to participate are lower.
The rationale for non-remuneration is that participation in a deliberative process activates a civic responsibility to volunteer in a democracy. In many cases, it is equally driven by budgetary constraints. As the data collected in this study does not contain details regarding the response rates of different demographics, it is not possible to draw concrete conclusions regarding the impact of remuneration on the decision to participate. Other studies suggest that payment does encourage demographics that generally do not participate otherwise, notably young people and those with lower incomes (newDemocracy Foundation and UN Democracy Fund, 2019: 150).
There are other ways to reimburse participants beyond direct payments. Offering accommodation and covering transportation costs for participants coming from areas far from the location where deliberation takes place – for example, when participants come from all regions of a country to take part in a national-level process – is a prerequisite. It may also entail making structural support systems available, such as providing or paying for childcare, or reimbursing the costs incurred for elderly care. For example, participants in the 2019-2020 French Citizens’ Convention on Climate are reimbursed, at their request, by the commissioning public authority for the following costs incurred:
a daily allowance;
additional compensation for persons who can prove that they have lost part of the income from their professional activity;
travel and accommodation costs for citizens staying outside their municipality of residence, up to a ceiling of €110 per night;
reimbursement of meal costs;
reimbursement of childcare expenses up to a ceiling of €18 per hour (including the amount of employer contributions) (see Service-Public.fr, 2019).
Clear and targeted communication about the deliberative process is essential for supporting the recruitment process and beyond. Having the full picture of the purpose, how the process will unfold, the level of commitment required, and how public decision makers will respond is key. Effective communication during the selection stage (as well as throughout the deliberative process) can help to ensure a higher response rate, active participation, and lower dropout rates. More information about communicating representative deliberative processes can be found later in this chapter.
Time is one of the factors that distinguishes representative deliberative processes from most other types of stakeholder and citizen participation. Deliberative processes tend to require much longer amounts of time to conduct a proper recruitment and to prepare the educational materials and agendas. Half (48%) of the cases for which there is data required 12 weeks or more of preparation before the first participant meeting took place (Figure 4.6). Almost all (98%) of these cases involved a minimum of five weeks of preparation.
The preparation time is in addition to the time required to conduct the participant recruitment, although it is possible for both to be done simultaneously. For two-stage random selection, the time required ranged from three to eight weeks. For single-stage random selection, it ranged from four to over eight weeks. Random selection combined with a small proportion of targeted selection takes on average six to eight weeks (Table 4.1).
| | Two-stage random selection | Single-stage random selection | Random selection plus targeted recruitment | Three-stage random selection |
|---|---|---|---|---|
| Number of cases | 63 | 27 | 4 | 1 |
| Range | 3-8 weeks | 4-8 weeks | 6-8 weeks | 6 weeks |
Notes: n=110; Data for OECD countries is based on 16 OECD countries that were members in 2019 (Australia, Austria, Canada, Belgium, Denmark, Estonia, France, Germany, Italy, Korea, The Netherlands, Norway, Poland, Spain, United Kingdom, and United States) plus the European Union/Global.
Source: OECD Database of Representative Deliberative Processes and Institutions (2020).
Beyond the time required to recruit and prepare the informational materials and agendas, deliberative processes also require a significant amount of face-to-face time among participants in order to build trust, learn and grapple with the complexity of the issue, deliberate with one another, and formulate shared recommendations.
While the minimum timeframe required to be included in this report was one full day of face-to-face deliberation, the average duration was 3.7 full meeting days, spread out over the course of 6.6 weeks. As discussed in detail in Chapter 2, the average duration varies greatly depending on the model of deliberative process (Table 4.2). The most common model of the Citizens’ Jury/Panel lasts for 4.1 days over five weeks, on average.
Allowing enough time for the in-person deliberation is crucial to achieving the overarching goals of: detailed and considered recommendations; building trust between participants, and instilling public confidence in the process and its outputs. A common finding is that rushing the process leads to a rushed decision, which undermines these goals (newDemocracy Foundation and UN Democracy Fund, 2019: 110).
| Model | Average duration of face-to-face meetings (in days) | Average duration between the first and last meeting (in weeks) |
|---|---|---|
| Informed citizen recommendations on policy questions | | |
| Citizens' Assembly | 18.8 | 47 |
| Citizens' Jury/Panel | 4.1 | 5 |
| a) consecutive days | 3.4 | 0 |
| b) non-consecutive days | 4.1 | 7 |
| c) ongoing | 11.0 | 104 |
| Consensus Conference | 4.0 | 2 |
| Planning Cell | 3.2 | 2 |
| Citizen opinion on policy questions | | |
| G1000 | 1.7 | 4 |
| Citizens' Council | 1.7 | 1 |
| Citizens' Dialogues | 2.1 | 4 |
| Deliberative Poll/Survey | 1.6 | 0 |
| World Wide Views | 1.0 | 0 |
| Informed citizen evaluation of ballot measures | | |
| Citizens' Initiative Review | 4.4 | 0 |
| Permanent deliberative bodies | | |
| The Ostbelgien Model | no data | 78 |
| City Observatory | 8.0 | 52 |
Note: These calculations have been made by the authors on the basis of the data from the 289 cases, which together feature 763 separate deliberative panels, collected for this study, from OECD Member and non-Member countries. The average length from first to last meeting of the Planning Cell is an exception due to lack of data. In this instance, Nexus Institute, the principal organisation implementing Planning Cells in Germany, was consulted. The overall average length of meetings of Citizens' Jury/Panel is calculated not including the ongoing processes.
Source: OECD Database of Representative Deliberative Processes and Institutions (2020).
Taking into account the time required to recruit participants, prepare the process, and run the meetings, most deliberative processes tend to take around six to seven months from beginning to end. Chapter 2 offers more guidance about choosing a deliberative model depending on the time, complexity of the issue, and other factors.
To show citizens that their input is welcome and valuable, and that it is a privilege to represent fellow citizens in a deliberative process, it is good practice to highlight the importance of the duty they have been invited to perform. Having strong political and/or institutional commitment is important for giving the process credibility and motivating people to invest their time by participating.
One way to do this is through the invitation letter, which can be signed by a person with a high level of authority, such as the mayor or minister. Its contents should appeal to citizens’ sense of solidarity, as well as make it clear that their time will be valued and how their recommendations will be taken into account. Evidence suggests that the commitment of public decision makers is one of the key factors behind the high response rates and low dropout rates amongst participants in representative deliberative processes for public decision making (Chwalisz, 2017). It is one of the factors distinguishing these processes from academic experiments and deliberative practices initiated by CSOs, which tend to face greater difficulty in recruiting a representative sample and maintaining participation over the course of the process (Chwalisz, 2017).
Additionally, to highlight the value that the commissioning public institution sees in participants’ work, a high-level public representative often opens the deliberative process and welcomes the participants, or attends one of the sessions. Depending on the level of government, this can be the head of a public enterprise or organisation, a mayor, a minister, or even the president (for example, the Irish Taoiseach opened and welcomed members to the Irish Citizens’ Assemblies, and French President Emmanuel Macron spoke at the fourth session of the 2019-20 Citizens’ Convention on Climate).
Core to deliberative processes is, of course, deliberation. This entails participants having an equal chance to speak, listen carefully to others, and weigh different options and trade-offs in light of the broadest access to diverse information. In the cases analysed in this report, the focus is on group deliberation, which also entails people finding common ground between one another and coming to some consensus. In the context of public decision making, this means that the group develops collective recommendations (often with a supermajority agreement).
Nabatchi et al. (2012) break down the criteria of sound deliberation and judgements into three components: deliberative analytic process; democratic social process, and sound judgement. The first entails high-quality discussions between participants, which are based on a solid information base, a prioritisation of key values, identification of alternative solutions, and a careful consideration of pros and cons – the trade-offs (Nabatchi et al., 2012). To capture this component, the OECD has collected data about the information and learning environment.
The second criterion refers to the fact that deliberation for public decision making is not only a rational process; it also has a social element that makes it democratic deliberation. This means that equal opportunity to contribute, mutual understanding and consideration, and respect are crucial for overcoming traditional social power inequalities (Nabatchi et al., 2012). Here the OECD has identified the important role of kind and neutral facilitators in fostering this inclusive environment.
Finally, sound judgement is about the capacity of citizens to reach a comprehensive collective decision, through egalitarian methods, based on the information available to them, their exchange of personal experiences, and their diverse perspectives (Nabatchi et al., 2012).
Learning is one of the key elements of a deliberative process. As discussed in Chapter 1, deliberation requires accurate and relevant information, which reflects a diversity of perspectives. For participants to be able to have quality discussions over a specific policy issue and reach informed decisions on recommendations, a learning stage is essential to any deliberative participation model. It is also why time is a crucial component to a successful process, as discussed in the previous section.
Learning usually takes place before the deliberation stage, though in practice the two often go hand-in-hand. It can also take place in the beginning of each smaller deliberative session, introducing a particular question or topic within a larger issue. An example of this is the World Wide Views model of deliberative process, where a complex issue is broken down into several components and each component is then discussed individually, after an introductory video provided to facilitate learning.
There have been different ways of informing participants about the policy issue at hand and facilitating learning. Figure 4.7 shows that among the deliberative processes for which data was available on learning practices (157 out of 282 cases), a large majority (79%) have had experts on the policy issue available at meetings. Experts were there to give presentations and answer participants’ questions.
Other types of learning components include introductory reading material before the first meeting (48%); learning sessions, including field trips to the locations concerned, such as hospitals or infrastructure sites (43%); the right for participants to request information and invite speakers, stakeholders, and experts (35%); and providing participants with clear and extensive reading material in between meetings, so that they can come prepared to the discussions (31%).
There are two key aspects of information sources: 1) diversity of information and 2) importance of giving citizens control (newDemocracy Foundation and UN Democracy Fund, 2019: 121). On the first point, the independent team responsible for designing and organising the deliberative process chooses the experts and informational material. They do not necessarily have expertise on the policy issue – their role is as experts of participation and deliberation. At the outset, they prepare comprehensive educational materials for the participants, sometimes with input from an advisory group of experts and stakeholders.
An extensive range of information sources is important. Having a wide breadth ensures that participants encounter and consider different points of view; the diversity of participants is complemented by a diversity of viewpoints in information sources. The type of information provided also matters in terms of public perceptions of fairness (i.e. it cannot be government brochures highlighting successes or arguing for certain solutions). Allowing citizens to ask for information is therefore a crucial aspect of winning public trust in the process. They should be able to request any information they feel is necessary to come to an informed decision, which helps to address any mistrust of experts and ensures that neither the participants themselves nor the wider public perceive that the participants are being led towards a certain conclusion (newDemocracy Foundation and UN Democracy Fund, 2019).
Information comes from three types of sources: 1) government; 2) stakeholder or active voices; and 3) sources requested by participants. The information programme usually begins with an introduction to the issue, the context, and the diagnosis of the problem, followed by more details about the issue, and an exploration of possible solutions (Gerwin, 2018: 54).
In half (48%) of the deliberative processes for which there is data, participants are provided with an introductory kit ahead of the first meeting. The kit tends to cover the following information: “the problem and what answers are needed from the participants; the context of the process; what is on the table; the current approach or thinking on the topic; a deep set of data required to make a decision, and information from other government agencies whose responsibilities interact with the decision” (newDemocracy Foundation and UN Democracy Fund, 2019: 123). Beyond independent information, it often also includes the government’s view and position of the problem so that this is transparent to the participants.
The newDemocracy Foundation and UN Democracy Fund handbook on democracy beyond elections (2019) suggests that information kits for Citizens’ Juries/Panels should aim for 50-200 page documents that explain as much of the problem as possible, as this provides a foundation for forming informed decisions. While this sounds like a lot of reading, which may be perceived as a barrier to inclusiveness since not all participants will have the time or capacity to read such a large amount of information ahead of time, the idea is not for everybody to read the entirety of the kit. Participants will naturally be more interested in certain aspects than others. Between them, they will have covered everything and added to the collective intelligence of the group.
There is also increasing interest in televisual materials being used to complement the text-based ones in recognition that people have different learning styles. To ensure inclusion, it is also important for organisers to be able to provide alternative formats, such as braille or large print and video subtitles, if needed.
Beyond this information, stakeholders are encouraged to put forth submissions to provide a complementary set of perspectives to the policy issue. This can take the form of stakeholder information sessions and public submission processes online, where the information is also available to the wider public. The independent co-ordinators, together with the commissioning public authority and the advisory group if one exists, should identify key industry, social, and community stakeholders and actively seek their contribution. They should represent a wide range of perspectives.
A process is needed to identify the final line-up of experts and stakeholders who will address the participants in person and the information that will be shared as priority reading material. This is arguably the most challenging design element. It has to include a range of different points of view, opinions, and voices of groups that have a stake in or are involved in the policy question at hand. All stakeholders should be on an equal footing and have similar conditions and opportunities to present their point of view to the participants, which limits the influence of strong lobbies and allows groups with fewer resources to have a voice. Some examples of how this stage is designed in detail can be found in Gerwin’s guide to Citizens’ Assemblies (2018) and the newDemocracy Foundation and UN Democracy Fund’s handbook on deliberative democracy (2019).
There is often a large number of stakeholder submissions. In these cases, the independent organisers make a selection to ensure that the diversity of submissions is reflected. For example, during the Irish Citizens’ Assembly on the eighth amendment of the constitution concerning abortion, 13,075 stakeholder submissions were received. Approximately 12,200 of them were published on the Assembly’s website on a rolling basis, in chronological order and categorised by the name of the individual or organisation that submitted them. So that this large number of submissions could meaningfully contribute to the Assembly’s deliberations, a random sample of 300 submissions was selected, compiled into a single document grouped by submission date, and circulated to all Assembly members (see The Citizens’ Assembly, 2018 for more details about this process).
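To illustrate the kind of sampling step described above, the sketch below draws a simple random sample of submissions and groups it by submission date. It is a minimal, hypothetical illustration only: the file name, column names, and fixed seed are assumptions for the example, not details of the Assembly’s actual procedure.

```python
import csv
import random
from collections import defaultdict

# Minimal sketch, assuming submissions are stored in a CSV file with
# (hypothetical) "date", "author" and "text" columns. The sample size of 300
# mirrors the example above; everything else is illustrative only.

def sample_submissions(path: str, sample_size: int = 300, seed: int = 1):
    with open(path, newline="", encoding="utf-8") as f:
        submissions = list(csv.DictReader(f))

    # Simple random sample: every submission has an equal chance of inclusion.
    # The seed is fixed only so that the draw can be reproduced and audited.
    random.seed(seed)
    sample = random.sample(submissions, k=min(sample_size, len(submissions)))

    # Group the sampled submissions by date so they can be compiled into a
    # single document, as in the Irish example described above.
    grouped = defaultdict(list)
    for submission in sample:
        grouped[submission["date"]].append(submission)
    return grouped


if __name__ == "__main__":
    for date, items in sorted(sample_submissions("submissions.csv").items()):
        print(date, "-", len(items), "submissions")
```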
Finally, at the very beginning of the process and at the end of each learning session before the deliberation phase, participants should be asked: “What do you need to know and who do you trust to inform you?” (newDemocracy Foundation and UN Democracy Fund, 2019: 126; Gerwin, 2018).
Data was not collected for this report about the role of facilitators in the various deliberative processes. However, it is important to acknowledge that the role of people conducting the meeting is crucial to its success. They are responsible for creating a warm atmosphere, building trust among members, and ensuring the credibility of the process (Gerwin, 2018). They play a crucial role in supporting the participants to formulate their own recommendations, while maintaining neutrality and withholding their own judgements about the proposals. For this reason, it is important that facilitators do not have a stake in the outcome of the process – they should be independent and at arm’s length from the commissioning public authority.
Moreover, facilitators are there to deal with what can be considered ‘difficult’ situations, such as when there is tension between participants or someone loses their nerve (Gerwin, 2018). They also encourage equal participation amongst participants – some will naturally be more shy while others will be more likely to dominate a conversation; facilitators ensure a balance of speaking time.
For a practical guide to facilitating deliberative processes, see Chapter 5 (p. 165-202) in the newDemocracy Foundation and UN Democracy Fund handbook (2019).
A key difference between representative deliberative processes and other forms of citizen participation is that the outcome is not many individual views, but a collective and considered view. Citizens are tasked with finding consensus on the recommendations they provide to public decision makers. This does not mean that 100% of participants must agree with 100% of the proposals; that is highly unlikely and arguably not desirable in a democracy that values pluralism. A common rule of thumb is that around 80% of participants should agree that they can live with a recommendation. Sometimes the report with citizens’ recommendations includes a minority report, where participants are able to include proposals that garnered some support, but not enough to be accepted by the majority of the group (see, for example, MASS LBP’s sample reports on their website).
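To make the decision rule concrete, the short sketch below applies an assumed 80% “can live with it” threshold to a set of proposals and records those that fall short in a minority report. The threshold value, data structure, and example vote counts are illustrative assumptions rather than a prescribed standard.

```python
# Minimal sketch of the ~80% support rule of thumb described above.
# The threshold and the example vote counts are illustrative assumptions.

THRESHOLD = 0.80


def split_recommendations(votes):
    """votes maps each proposal to (participants in favour, total participants)."""
    accepted, minority_report = [], []
    for proposal, (in_favour, total) in votes.items():
        support = in_favour / total
        # Proposals meeting the threshold go into the main set of
        # recommendations; those with some but insufficient support can be
        # recorded in a minority report rather than discarded.
        target = accepted if support >= THRESHOLD else minority_report
        target.append((proposal, round(support, 2)))
    return accepted, minority_report


# Hypothetical example: 43 participants voting on two proposals.
example_votes = {
    "Extend cycling infrastructure": (38, 43),   # ~88% support -> accepted
    "Introduce a congestion charge": (27, 43),   # ~63% support -> minority report
}
print(split_recommendations(example_votes))
```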
The process for developing recommendations varies from model to model. For informed citizen recommendations, which require the greatest rigour, a detailed chapter about the steps to follow for developing recommendations and decision making is available in Gerwin’s guide (2018: 66-82).
The third criterion in Nabatchi et al.’s evaluation framework for deliberative processes (2012) is that the outcome is a set of influential conclusions and actions. There should be evidence of impact. This means that public authorities should respond to participants’ recommendations in a timely manner, explaining the rationale for accepting or rejecting them, and providing regular public updates about their implementation. This section looks at the data regarding the outputs, the implementation of recommendations, and the evaluation and monitoring of deliberative processes.
At the end of a deliberative process, citizens’ recommendations are delivered to the commissioning public authorities. Data was not systematically collected about the authorship of recommendations, although a qualitative analysis of available final reports from different countries suggests that proposals are often written mostly by citizens in their own words and are not edited by anyone else. Participants sometimes accept or amend the proposals of the experts from whom they hear, particularly when it comes to more technical proposals. In some cases, such as the 2016-2018 Irish Citizens’ Assembly, the report is written by the Secretariat with input from citizens, sent back to a sub-group of citizens for comment, and then to the entire group for validation. In the 2019-2020 French Citizens’ Convention on Climate, the participants’ recommendations were drafted with the help of legal experts to ensure they could go directly to a legislative debate by parliamentarians. Such an approach leaves less room for ‘translation’ by public authorities. At the time of writing in early 2020, these recommendations had not yet been published and the full benefits and challenges of such an approach are not yet clear. The good practice principle is that participants should retain control of the recommendations.
An unedited final report gives the final document authenticity, which can also increase its perceived legitimacy in the public’s eyes. More information about the activities, guides, and prompts that enable citizens to write detailed and complex policy recommendations is available in Chapter 5 of the newDemocracy Foundation and UN Democracy Fund handbook (2019).
Once the final recommendations are delivered to the public authority, it is their responsibility to act. In a representative democracy, there is no expectation that the authority is obliged to accept all recommendations. There is, however, a responsibility to respond and to explain a rationale for accepting or rejecting any proposals.
Of the 104 cases for which data is available about the type of response, in two-thirds (66%) the public authority discussed the final recommendations face-to-face with participants. In four in ten cases (42%), the public authority communicated a public response through government channels (such as websites and social media) and traditional media (newspapers, radio), but this did not take place in person with the participants. In one quarter of cases (24%), the commissioning authority followed up directly with the participants to let them know about the response to their recommendations, in addition to the public response (Figure 4.8).
A good example where participants’ recommendations received a thorough response with direct feedback to them after review is the 2014 Melbourne People’s Panel about the city’s 10-year, $5 billion AUD plan. The Council met with the participants in person to hear their recommendations. After seven months of review, it reconvened the Panel with the wider public to announce the 10-year plan, and clearly indicated which aspects came from the Panel’s suggestions. The Council accepted 10 out of 11 recommendations. The final plan document includes an annex where the participants’ recommendations are written in their own words, with an explanation of the council’s decisions regarding implementation (Box 4.4).
Overall, it is good practice to communicate directly with participants before and after the official response to recommendations to manage their expectations, highlight new opportunities to continue contributing to the issue, and reinforce the value of their involvement.
In 2014, 43 people were randomly selected through a civic lottery to participate in the Melbourne People’s Panel about the city’s 10-year, $5 billion AUD plan. They were given the time and resources to meet six times over four months to deliberate and provide the Council with detailed recommendations. After reflecting on the Panel’s proposals for seven months, the Council publicly launched the final budget, which accepted 95% of the Panel’s proposals. The final plan document includes an annex in which the participants’ recommendations are written in their own words, with an explanation of the Council’s decisions. The process helped the Council close its budget gap, and the plan is now being implemented.
More information is available at: https://participate.melbourne.vic.gov.au/10yearplan
As the evidence shows, many people in countries around the world have been willing to give up a large amount of their time to participate in a deliberative process (on average, 3.7 days spread out over 6.6 weeks). This is a testament to the importance of impact in people’s decisions about whether participation is worth their time. People lead busy lives, and it is a rational response not to participate if the purpose and outcome are unclear.
All of the cases in this report were commissioned by public authorities that have the ability to act on the recommendations resulting from a deliberative process. They make a commitment to respond to citizens’ proposals and take them seriously. The more people feel they will have an impact on policies that affect their lives, the more willing they are to volunteer their time.
However, impact is also the most elusive element to measure. The previous section shows that in many cases there is a public or direct response to participants about their recommendations. The OECD tried to collect as much international data as possible about the implementation of commitments made on the basis of citizens’ recommendations. Data was available for 55 cases, which suggests some promising conclusions (Figure 4.9).
In three quarters (76%) of these cases, the public authorities implemented over half of the recommendations; in over a third (36%), they implemented all of them. In only six (11%) of the 55 cases were none of the recommendations implemented. One example of how citizen recommendations have been implemented, leading to improved road safety, is discussed in Box 4.5, but there are many others. More research and analysis is needed about which proposals are accepted and whether there is a general tendency to ‘cherry-pick’ (i.e. to accept only those proposals that fit with a public authority’s existing agenda, those that cost less, etc.).
These findings suggest that when presented with informed and considered proposals, public authorities are likely to act on them, as they include sensible recommendations that can lead to more effective public policies. More data is needed for this to be a robust conclusion, but it sheds some preliminary light on an issue that is much discussed and of great importance.
Impact in these situations is notoriously difficult to measure as, even when a recommendation is accepted, it often takes many months, if not years, for it to be operationalised and implemented. Public authorities also seem to be missing an opportunity to publicly communicate how citizens’ recommendations are informing their decision making. As discussed later in this chapter in the section on public communication, it is a tool that should be leveraged more often to promote participation. If citizens see that the proposals of people like them are having an impact on policies, it could increase their trust in government and the likelihood that they will give up their own time when future opportunities to participate in public decision making arise.
Sharing the Roads Safely Citizens' Jury in South Australia, 2014
A four-day Citizens’ Jury of 47 randomly selected citizens in South Australia produced a set of recommendations to improve road safety in the region. Following the implementation of measures recommended by the Citizens’ Jury, bicycle-related injuries dropped sharply in South Australia. The Jury’s recommendations helped to reduce fatal and serious injuries by 28% from their high in 2012. Examples like this one provide evidence of the positive outcomes of implemented citizen recommendations.
More information can be found at: https://yoursay.sa.gov.au/decisions/sharing-our-roads-safely/about
Monitoring and evaluating deliberative processes is key for several reasons. As Provision 5 of the 2017 OECD Recommendation on Open Government states, it is important to “develop and implement monitoring, evaluation and learning mechanisms for open government strategies and initiatives”, which include representative deliberative processes. Doing so allows for learning about what worked well and what could be improved regarding the processes that took place. It also helps build credibility and citizen trust in deliberative processes, and permits commissioning authorities and the public to understand the benefits for better policies and public services.
There is little data available about the monitoring of deliberative processes and how citizens could be involved in monitoring implementation. However, good practice examples offer guidance on how such practices could be expanded to improve end outcomes. For example, following a Citizens’ Jury on Dakota County’s land use plan in the United States, the members were reconvened to review how their recommendations had been interpreted by officials. Later, the Jury members were asked to provide feedback on whether the plan was being implemented according to their proposals (Box 4.6).
The Citizens' Jury on Dakota County's Comprehensive Plan brought together 24 randomly selected citizens for five days to provide informed recommendations to the local government for the County’s Comprehensive Land Use Plan. The County faced tough choices related to its projected growth and how it could be managed.
After the process was completed and the government had a chance to consider the citizens’ recommendations, the Citizens’ Jury was reconvened to review how their recommendations were interpreted and taken into account. The Jury was also able to tell the county through a series of electronic votes whether the comprehensive plan appropriately reflected their recommendations.
More information can be found at: https://jefferson-center.org/wp-content/uploads/2012/10/land-use.pdf.
Another, more recent, example comes from Noosa Shire in Australia. Following a Citizens’ Jury about organic waste, the Council reviewed the Jury’s proposals and convened a series of workshops to discuss their costs and implementation timing (Box 4.7).
In Noosa Shire, Australia, 24 randomly selected citizens were brought together for three and a half days for a Citizens’ Jury to consider the trade-offs involved in reducing organic waste sent to landfill. Once the citizens’ recommendations had been reviewed, the Council launched a series of workshops to discuss their costs and implementation timing, extending the jurors’ engagement across multiple stages of the policy-making cycle.
More information can be found at: https://www.newdemocracy.com.au/2014/10/01/noosa-community-jury/.
So far, the most common method of evaluating deliberative processes (67%) has been an anonymous survey of participants (Figure 4.10). Such surveys usually gather participants’ opinions on different elements of the process: their overall satisfaction; whether they had enough opportunities to express their views; and whether they perceived the facilitation to be fair and balanced.
Seventeen per cent of deliberative processes have had an academic analysis. In most cases, these have been Deliberative Polls/Surveys (described in Chapter 2), which by design entail analysis of citizens’ opinion change after deliberation and are thus scientific processes by nature. However, as Pincock (2012) covers extensively with reference to a wide range of academic literature, the empirical evidence that deliberation necessarily leads to opinion change is mixed; high-quality deliberation can also reinforce an existing opinion, now backed by a better set of reasoned arguments. Some Citizens’ Initiative Reviews have also had extensive academic evaluation due to close co-operation between organisers and researchers, and the researchers’ interest in deliberative processes.
Only seven per cent of deliberative processes have had an independent evaluation, usually by a private consulting company or a non-governmental organisation with expertise in citizen participation. Such independent evaluation complements the aforementioned participant surveys, allowing for a more comprehensive assessment. However, while the idea of independent evaluation is appealing, it is not entirely clear who would be best suited to carry it out, as doing so requires a good understanding of representative deliberative processes. It may not be necessary or feasible for smaller scale processes due to practical constraints of time and budget. For larger scale processes that involve greater numbers of participants and last a significant period, an independent evaluation could be recommended to ensure public confidence.
Two per cent are known to have had an official process reflection by the organisers. However, this percentage is likely to be much higher in reality. Qualitative research for this report suggests that organisers are constantly learning and adapting their approaches with each deliberative process they deliver. Formalising this, particularly for larger and more significant deliberative processes such as national Citizens’ Assemblies, could help promote institutional learning and improve future practice.
Finally, the fourth criterion that Nabatchi et al. (2012) identify for evaluating deliberative processes relates to their long-term effects: on the participants themselves, on the wider public, and on macro-level political processes (changing public officials’ attitudes/behaviour and altering strategic political choices during elections). However, no data was collected for this report about how participation in a deliberative process affects the participants themselves in terms of agency and efficacy, nor about effects on macro-level political processes. These are important aspects of impact and have been researched by academics, though further study is needed (see Grönlund et al., 2010; Niemeyer, 2011; Knobloch et al., 2019).
This section thus focuses on the impact on the wider public. It considers the role that public communication as a mechanism for public learning plays in achieving this impact. It also looks at how deliberative processes have been combined with forms of participatory democracy to involve a larger portion of society beyond the small group of randomly selected participants.
Public communication is understood as any communication activity or initiative led by public institutions for the public good. It is distinct from political communication, which is linked to political debate, elections, or individual political figures and parties. With effective public communication, a deliberative process can be a mechanism for the broader public to learn about an issue, and can encourage people to participate more in public life in general. This is particularly the case as deliberative processes lead to citizens’ voices being heard and help bridge the gap between citizens and governments. Public communication can also help gain support and legitimacy for the use of deliberative processes for decision making, as well as for the recommendations developed by the participants (Raphael and Karpowitz, 2013), which further facilitates the implementation of the recommendations and the resulting policy.
There are several good practices of public communication in support of deliberative processes that can help achieve the goal of public learning and ensure a smooth deliberative process. Rather than solely making information about the whole process available, the most effective examples demonstrate that the public authority has made an active effort to reach a wide range of citizens to increase awareness of the process and its purpose.
For smaller scale deliberative processes, information about the process (recruitment, agenda, experts, etc.) is made available on existing government websites and platforms and/or on the website of the independent organiser that has been commissioned to deliver the process. For larger scale processes that involve larger numbers and last a significant period, notably for Citizens’ Assemblies, the common practice has been to set up a separate website where the public and the media can find all information relevant to the deliberative process. Examples include the websites set up for the 2016-2018 Irish Citizens’ Assembly, the 2019-2020 French Citizens’ Convention on Climate, and the 2020 UK Climate Assembly.
Having an individual responsible for public communication from the beginning of the process can help to ensure a coherent communications strategy both with participants of the process as well as the broader public (OECD, 2019).
An example of how good public communication expands public learning beyond the participants of the process is the 2016-2018 Irish Citizens’ Assembly. The Assembly comprised 99 randomly selected citizens, who were tasked with providing recommendations on the constitutional amendment concerning abortion. The topic was complex and had been the subject of political debate for many years before the Assembly was convened. Participants had the opportunity to learn from experts, listen to stakeholders, and deliberate amongst themselves to reach a conclusion. They recommended changing the eighth amendment of the constitution, which at the time banned abortion, to the special cross-party parliamentary committee set up specifically to consider the Assembly’s conclusions, and suggested that the government hold a referendum on the matter, as is required in Ireland for constitutional changes.
As the Irish Citizens’ Assembly was well communicated throughout the process (with online streaming of proceedings, interviews with participants in the press, all of the information related to the policy issue made publicly available online, and extensive coverage on television, particularly by the public service broadcaster), broader society was aware that it was taking place, knew about its mandate and composition, watched the livestreams, and read the submissions.
As research on the deliberative process shows, the evidence presented to the Citizens’ Assembly helped to increase the public’s understanding of the issue in question. An exit poll after the referendum found that 66% of voters were aware of the Citizens’ Assembly, including a majority in all age groups, social classes, and regions, with the exception of those under 24 years old, who were less aware (Suiter, 2018). Seven in ten voters (70%) knew that it comprised randomly selected Irish citizens, and three quarters (76%) knew that experts had informed the discussions (Suiter, 2018). These findings highlight the potential of deliberative practices to provide a wider platform for informed discussion in broader society. The high awareness levels also indicate that transparency and public communication can have a significant impact and are central to the legitimacy of the deliberative method used.
Strong public communication about the representative deliberative process can also be a tool to help counteract disinformation and polarisation regarding the issue being addressed by the process. Empirical research has shown that “communicative echo chambers that intensify cultural cognition, identity reaffirmation, and polarisation do not operate in deliberative conditions, even in groups of like-minded partisans” (Dryzek et al., 2019; see Grönlund et al., 2015). There is also evidence from places such as Belgium, Bosnia, Colombia, and Ireland to suggest that deliberation can be an effective way to overcome ethnic, religious, or ideological divisions between groups that have historically found their identity in rejecting that of the other (Ugarriza and Caluwaerts, 2014). Interviews with observers of the 2016-2018 Irish Citizens’ Assembly regarding the issue of abortion also suggest that having ordinary people discuss the topic and present the findings publicly helped to counter disinformation during the referendum campaign.
Proactive and effective public communication, by raising awareness of the deliberative process and ensuring its transparency, can also potentially increase trust in both directions: of citizens in government and of government in citizens. There is some evidence that participating in a deliberative process positively impacts citizens’ trust in government (Box 4.8). Being aware that a government-initiated deliberative process is taking place, and being able to follow it transparently as it unfolds, can increase citizens’ perceptions of the government as open, accountable, transparent, and inclusive.
Sixty-two randomly selected citizens were brought together for a day-long Deliberative Poll to discuss the transit and traffic issues facing the residents of La Plata, Argentina. Participants were surveyed before and after the process, showing a strong increase in trust in government after participation. Participants dramatically changed their view about whether public officials would listen to them: before deliberation, 60% disagreed strongly with the statement that “public officials care a lot about what people like me think”; after deliberation, this figure dropped by forty percentage points, to only 20%.
More information can be found at: https://cdd.stanford.edu/2009/deliberative-polling-on-transit-and-traffic-issues-in-la-plata/.
Typically, representative deliberative processes are not used in isolation, but are rather a central part of a wider strategy of citizen participation around a specific policy issue. The most common types of stakeholder participation used in conjunction with deliberative processes are online calls for proposals/submissions (used in 33 cases) and surveys (29 cases) (Figure 4.12). Other common methods are public consultations (19 cases) and roundtable discussions (16 cases).
Some deliberative processes have other participation processes built in by design. For example, Citizens’ Councils are typically followed by a Citizens’ Café, where recommendations are discussed with politicians and the broader public.
Stakeholder participation typically happens before the deliberative process, with the goal of gathering the public’s opinions, which the participants can then take into account when deliberating and producing recommendations. However, stakeholder participation sometimes takes place in parallel with the deliberative process and can even be facilitated by the participants themselves. A common example is for participants to host roundtable discussions open to anyone in the wider community in order to answer questions and gather perspectives and reactions from broader society. For instance, during the St. Joseph’s Health Centre Community Reference Panel in 2015, the panel members convened public hearings and discussions, which then fed into their considerations when developing recommendations for St. Joseph’s Health Centre (Box 4.9).
Participants of deliberative processes can become active conveners of the broader public. In order to involve more citizens in the process and enhance transparency and inclusion, St. Joseph's Health Centre Community Reference Panel in Canada organised a Community Roundtable Meeting to discuss the opinions of other community members. This evening session allowed members of the community to participate in the deliberative process and meet the members of the community panel.
More information is available here: https://stjoestoronto.ca/wp-content/uploads/2016/03/SJHC_Reference-Panel_Final-Report.pdf
Carson, Lyn (2018), “Framing the Remit”. newDemocracy Research and Development Note, https://newdemocracy.com.au/wp-content/uploads/2018/07/docs_researchnotes_2018_July_RampD_Note_-_Framing_the_Remit.pdf. Accessed on 21 January 2020.
Center for Deliberative Democracy (2009), “Deliberative Polling on Transit and Traffic Issues in La Plata, Argentina”, Center for Deliberative Democracy, Stanford University, https://cdd.stanford.edu/2009/deliberative-polling-on-transit-and-traffic-issues-in-la-plata/, accessed on 4 March 2020.
Chwalisz, Claudia (2017), The People’s Verdict: Adding Informed Citizen Voices to Public Decision-making, New York: Rowman & Littlefield.
Chwalisz, Claudia (2015), The Populist Signal: Why Politics and Democracy Need to Change, New York: Rowman & Littlefield.
City of Melbourne (2015), “10-Year Financial Plan”, https://participate.melbourne.vic.gov.au/10yearplan, accessed on 4 March 2020.
Climate Assembly UK (2020), “Who is taking part?”, https://www.climateassembly.uk/detail/recruitment/, accessed on 14 April 2020.
Dalton, Russell (2008), The Good Citizen: How a Younger Generation is Reshaping American Politics, Washington D.C.: CQ Press.
Davies, Ben B., Kirsty Blackstock, and Felix Rauschmayer (2005), “‘Recruitment’, ‘Composition’, and ‘Mandate’ Issues in Deliberative Processes: Should We Focus on Arguments Rather than Individuals?”, Environment and Planning: Politics and Space 23(4): 599-615.
Dryzek, John S., André Bächtiger, Simone Chambers, Joshua Cohen, James N. Druckman, Andrea Felicetti, James S. Fishkin, David M. Farrell, Archon Fung, Amy Gutmann, Hélène Landemore, Jane Mansbridge, Sofie Marien, Michael A. Neblo, Simon Niemeyer, Maija Setälä, Rune Slothuus, Jane Suiter, Dennis Thompson, and Mark E. Warren (2019), “The Crisis of Democracy and the Science of Deliberation”, Science 363(6432): 1144-1146. DOI: 10.1126/science.aaw2694.
Dryzek, John and Simon Niemeyer (2008), “Discursive Representation”, American Political Science Review 102(4): 481-493.
Gerwin, Marcin (2018), Citizens’ Assemblies: Guide to Democracy That Works, Krakow: Open Plan Foundation, http://citizensassemblies.org, accessed on 3 March 2020.
Gouvernement français (2019), “Convention citoyenne pour le climat : les 150 citoyens tirés au sort débutent leurs travaux”, https://www.gouvernement.fr/convention-citoyenne-pour-le-climat-les-150-citoyens-tires-au-sort-debutent-leurs-travaux, accessed on 4 March 2020.
Grönlund, Kimmo, Maija Setälä and Kaisa Herne (2010), “Deliberation and civic virtue: lessons from a citizen deliberation experiment”, European Political Science Review 2(1): 95-117.
Grönlund, Kimmo, Kaisa Herne and Maija Setälä (2015), “Does Enclave Deliberation Polarize Opinions?”, Political Behavior 37: 995-1020.
Hartz-Karp, Janette (2002), “Albany Administration Centre Site Citizens' Jury”, 21st Century Dialogue, http://21stcenturydeliberation.com/index.php?package=Initiatives&action=Link&file=albany_admin_centre.html, accessed on 4 March 2020.
Hartz-Karp, Janette (2001), “Reid Highway Extension”, 21st Century Dialogue, http://21stcenturydeliberation.com/index.php?package=Initiatives&action=Link&file=reid_hwy_extension.html, accessed on 4 March 2020.
Jefferson Center (2000), “Citizens Jury: Dakota County’s Comprehensive Plan”, Jefferson Center, https://jefferson-center.org/wp-content/uploads/2012/10/land-use.pdf, accessed on 4 March 2020.
Jefferson Center (1988), Final Report: Policy Jury on School-based Clinics, Minnesota: Jefferson Center.
Knobloch, Katherine R., Michael L. Barthel, and John Gastil (2019), “Emanating Effects: The Impact of the Oregon Citizens’ Initiative Review on Voters’ Political Efficacy”, Political Studies 2019: 1-20.
Landemore, Hélène (2012), Democratic Reason: Politics, Collective Intelligence, and the Rule of the Many, Princeton: Princeton University Press.
Korean Center for Social Conflict Resolution (2019), Activity report: KCSI.
MASS LBP (2019), “Final Report of Toronto’s Transform TO Reference Panel on Climate Action”, Toronto: MASS LBP, https://www.toronto.ca/wp-content/uploads/2019/11/9048-TTO-Reference-Panel-on-Climate-Action-Report_FINAL.pdf, accessed on 4 March 2020.
MASS LBP (2017), “How to Run a Civic Lottery: Designing Fair Selection Mechanisms for Deliberative Public Processes”, Toronto: MASS LBP, https://static1.squarespace.com/static/55af0533e4b04fd6bca65bc8/t/5aafb4b66d2a7312c182b69d/1521464506233/Lotto_Paper_v1.1.2.pdf, accessed on 3 March 2020.
MASS LBP (2017), “The Reference Panel Playbook: Eight moves for designing a deliberative process”, https://www.masslbp.com/the-reference-panel-playbook, accessed on 20 January 2020.
Nabatchi, Tina, John Gastil, Matt Leighninger, and G. Michael Weiksner (eds) (2012), Democracy in Motion: Evaluating the Practice and Impact of Deliberative Civic Engagement, Oxford: Oxford University Press, DOI: 10.1093/acprof:oso/9780199899265.003.0010.
newDemocracy Foundation and United Nations Democracy Fund (2019), Enabling National Initiatives to Take Democracy Beyond Elections, Sydney: newDemocracy Foundation, https://www.newdemocracy.com.au/wp-content/uploads/2018/10/New-Democracy-Handbook-FINAL-LAYOUT-reduced.pdf, accessed on 30 October 2019.
newDemocracy Foundation (2017), “Yarra Valley Water – Price Submission Process (2017)”, newDemocracy Foundation, https://www.newdemocracy.com.au/2017/02/21/yarra-valley-water-price-submission-process/, accessed on 3 March 2020.
newDemocracy Foundation (2014), “Noosa Community Jury (2014)”, newDemocracy Foundation, https://www.newdemocracy.com.au/2014/10/01/noosa-community-jury/, accessed on 4 March 2020.
Niemeyer, Simon (2011), “The Emancipatory Effect of Deliberation: Empirical Lessons from Mini-Publics”, Politics & Society 39(1): 103-140.
OECD (2019), Communicating Open Government: A How-to Guide, Paris: OECD Publishing, https://www.oecd.org/gov/Open-Government-Guide.pdf, accessed on 15 April 2020.
Olsen, V. Beth Kuser, Matthias Ruth and Gerald E. Galloway Jr. (2018), “The Demographics of Public Participation Access When Communicating Environmental Risk”, Human Ecology Review 24(1).
Page, Scott (2007), The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies, Princeton: Princeton University Press.
Parkinson, John (2003), “Legitimacy Problems in Deliberative Democracy”, Political Studies 51: 180-196.
Pincock, Heather (2012), “Does Deliberation Make Better Citizens?”, in Nabatchi, Tina, John Gastil, Matt Leighninger, and G. Michael Weiksner (eds), Democracy in Motion: Evaluating the Practice and Impact of Deliberative Civic Engagement, Oxford: Oxford University Press, DOI: 10.1093/acprof:oso/9780199899265.003.0010.
Pratt, Julie (2005), A Guidebook for Issue Framing, The Kettering Foundation, http://commons.kettering.org/system/files/documents/Issue%20Framing%20Guidebook%202015%20FINAL.pdf, accessed on 21 January 2020.
Raphael, Chad and Christopher F. Karpowitz (2013), “Good publicity: The legitimacy of public communication of deliberation”, Political Communication 30: 17-41.
RTE, Universities Exit Poll (2018), “Thirty-sixth Amendment to the Constitution Exit Poll”, https://static.rasset.ie/documents/news/2018/05/rte-exit-poll-final-11pm.pdf.
Service-Public.Fr (2019), “Convention citoyenne pour le climat au CESE : quelle indemnisation des participants ?”, https://www.service-public.fr/particuliers/actualites/A13609, accessed on 25 March 2020.
Smith, Graham and Rosemary Bechler (2019), “Citizens Assembly: Towards a Politics of ‘Considered Judgement’ Part 2”, OpenDemocracy, https://www.opendemocracy.net/en/can-europe-make-it/citizens-assembly-towards-a-politics-of-considered-judgement-part-2/, accessed on 5 March 2020.
Smith, Aaron, Kay Lehman Schlozman, Sidney Verba, and Henry Brady (2009), “The Demographics of Online and Offline Political Participation”, Pew Research Centre, https://www.pewresearch.org/internet/2009/09/01/the-demographics-of-online-and-offline-political-participation/, accessed on 3 March 2020.
St. Joseph’s Health Centre (2016), “Community Reference Panel”, Toronto: St. Joseph’s Health Centre, https://stjoestoronto.ca/wp-content/uploads/2016/01/StrategicPlan.pdf, accessed on 4 March 2020.
Suiter, Jane (2018), “Deliberation in Action – Ireland’s Abortion Referendum”, Political Insight, September 2018: 30-32.
The Citizens’ Assembly (2018), “Random Sample of Submissions Received on the Eighth Amendment”, https://2016-2018.citizensassembly.ie/en/Submissions/Eighth-Amendment-of-the-Constitution/Random-Sample-of-Submissions-Received-on-the-Eighth-Amendment/, accessed on 4 March 2020.
Ugarriza, Juan E. and Didier Caluwaerts (2014), Democratic Deliberation in Deeply Divided Societies: From Conflict to Common Ground, London: Palgrave Macmillan.